Participatory Historians in Digital Cultural Heritage Process: Monumentalization of the First Finnish Commercial Computer Game – Jaakko Suominen & Anna Sivula

Abstract: The paper deals with the question of how digital games become cultural heritage. Using examples of changing conceptualisations of the first commercial Finnish computer game, the paper illuminates the amateur and professional historicising of computer games. The general theoretical contribution of the paper lies in explaining the cultural heritage processes through which contemporary cultural phenomena are historicised, and in illustrating the role that the production of monuments plays in that historicising.

 

Introduction

Laurajane Smith argues that heritage is not only something material that merely relates to the past. Rather, it is a process of engagement of contemporaries. According to Smith, heritage is an act of communication and an act of creating meaning in, and for, the present. At the same time, it signifies cultural identity work, a cultural and social process that engages with acts of remembering and thus creates ways of understanding the present (Smith 1–2). The process of defining cultural heritage occurs within game cultures as well. Academically, in hobbyist communities, and partially within the game industry, the cultural heritage debate has raised demands that certain digital games have to be saved and preserved ”before it is too late” (e.g. Lowood et al.). In the sense of Laurajane Smith’s ideas, the reason for the preservation is the shared conceptualization that digital games are meaningful and should be passed on to new generations. Digital games are not – yet – on UNESCO’s World Heritage List, but there are already game canons, lists of significant, important, and revolutionary games, compiled by hobbyist communities and semi-officially nominated committees.[1] Even though those debates about the heritage value of game cultures circle around material issues and, in many cases, specific items – digital and non-digital – the debates are part of the process of engagement and communicative identity work described by Smith.

Recognized heritage ought to be preserved, and scholars, as well as game hobbyists, have examined various possibilities for digital game preservation. They have approached this from the perspectives of creating (museum) collections and archives, documenting, and emulating and migrating game software code, and so forth, all of which can be perceived as ‘heritage work’ or ‘heritage management’ (Smith 2006) for ensuring that valuable items can be transferred to new generations. (See e.g. Swalwell; Heinonen & Reunanen; Guttenbrunner et al.; Barwick et al. For a critical overview, see Newman.)[2] The discussion of digital game preservation is significant, but it largely lacks serious contemplation of one of the key questions that is the focus of this paper: how are game cultural elements recognized and selected as being worthy of preservation, of becoming elevated to the status of cultural heritage? Obviously, one simple answer to the question is that particular games and devices received wide recognition and impact as novelties in their contemporary contexts, and that their value is therefore somehow self-evident. We argue that there are other reasons to consider: more local and marginal meanings and, especially, the historicized value of something being the first of its kind. These canonical items of cultural heritage we here call monuments.

Our primary theoretical concepts in this paper are the above-mentioned (cultural) heritage process and monument. The cultural heritage process is observed in light of a case of early Finnish commercial computer games. Instead of being particularly interested in which digital game actually was the earliest production in Finland, we deal with the question of the public discourse of firstness and its connections with the cultural heritage process. The emergence of such discourse, representing the past of Finnish game cultures in a precise manner, is, we argue, a sign of a particular phase of a cultural heritage process in which specific actors have a motivation to discover the origins of national game cultures and industry. Thus, we ask here: who is historicizing Finnish commercial computer games? When did the question of the first game emerge? How is the debate related to the process in which digital games become cultural heritage? The case provides answers to the primary research question: how are certain items selected and transformed into the cultural heritage of digital game culture, particularly in the role of a monument? This article provides a model for comparison with other cases in different contexts.

The article consists of the following sections: we begin with an introduction to our essential theoretical concepts, based on contemporary academic discussion on cultural heritage. Then, we illustrate our case and describe the public debate about the first Finnish commercial computer game. In conclusion, we return to theoretical conceptualizations of historicizing “firsts.”

Cultural heritage, community and monuments

Raiford Guins (108–109) has described the remnants of arcade game machines, such as Pac-Man or Pole Position cabinets, as unintentional monuments. Leaning on Austrian art historian Alois Riegl’s definition (1903), Guins states that even though the machines were monumental in their own age, they were not intended “for deliberate commemoration.” According to Guins, their monument status is new — or what we would contend: newly historicized.

Monuments are the vital elements in the production of cultural heritage.[3] The topical question is how and when an average digital game object is transformed from an ordinary artefact or a commodity to a realm of memory (Nora 626), or, as we prefer, a monument of digital culture.

A monument is a historical artefact that has a specific symbolic value to a certain cultural heritage community, i.e. a group of people who share an understanding of their common history.[4] In the cultural history of games, for instance, famous and somehow special game devices and games, such as the first coin-op games or home consoles, now presented in museums and private collections, can become monuments of game culture. Such monuments can also be commodified as new products, such as retrogames.[5]

Figure 1. Commodore 64 has unintentionally become a monument of 1980s home computer culture in Finland, as well as elsewhere. Here is a C-64 advertisement, “Liberator”, from the first issue of the MikroBitti home computer magazine (1/1984), referring to an internationally recognized deliberate monument, the Statue of Liberty. Later on, due to the C-64’s popularity and market dominance, it was also advertised as “the Computer of the Republic”, with references to e.g. the Finnish national flag and famous national romantic paintings.


A monument is an active element in a dynamic network of cultural heritage processes. A monument derives its cultural value and meaning from historical interpretations. A monument, a particular gaming artefact for example, is a link between the different elements of the production of social memories. A monument comprises things, places, events, and stories. (Aronsson 197.)

The monument is historical by nature. The making of a monument requires a historical antecedent. When elements of cultural heritage are selected, and cultural heritage is thus produced, the argumentation is grounded on histories. History, in this context, is a representation of the past, based on research and traceable source materials. The value of particular game devices, games, and game-related practices is built on historical representations of them, but research conducted by professional, trained historians is not the sole source of these representations. It is therefore important to ask: who writes the history? Who conveys it and conducts the process of cultural heritage?

As mentioned above, cultural heritage as a concept does not refer only to material or immaterial objects, but to a dynamic process (Smith 44–45; Bortolotto 21–22). In this circular process, cultural heritage is produced, used, and reproduced. Instead of consisting only of objects, cultural heritage is rather an experience of historical continuity and social participation (Smith 45 and 49–50). Cultural heritage is also an instrument of various sorts of group identity work, which has several transnational, national, and local levels. (Sivula 2015; Sivula & Siro 2015.)

There are several groups, as well as individuals, who are developing their historical identity with digital games: game developers, players, journalists, and collectors, to name a few. On the other hand, there is no indigenous group of digital culture that possesses an exclusive right to the cultural heritage of digital games. A heritage community experiences the possession of cultural heritage and thus uses it in identity work and maintains its symbolic value. (Sivula 2010, 29.) According to Pierre Nora, the realms of memory are remnants or symbols of the past, “where [cultural] memory crystallizes and secretes itself” (Nora 1989, 7).

Cultural heritage is an instrument of identity work with the symbols and traces of the past, experience of participation, and shared historical experience. (Sivula 2015.) The identity work is performed by a cultural heritage community, as seen below.

 

Figure 2. This basic pattern illustrates the three types of identity work of a cultural heritage community. The heritage community shares and is aware of a common history, values certain traces of the past as historical symbols and/or historical evidence, and experiences participation in a mutual, historical project. (Sivula 2015, 66.)


Researching and interpreting the past keeps the cultural heritage process active. Further, the practice of researching, interpreting and representing the past can be observed as the three phases of the historiographical operation. According to Paul Ricoeur, the three phases are: 1) documentation, 2) explaining and understanding the past, and 3) the historical representation of the past. (Ricoeur 169–170, 182–184 and 234–235; Sivula 2006, 44–45.) The cultural heritage process begins with an attempt at historicizing the past, selected by a heritage community. A historian, either an amateur or a professional, steps through all three phases of the historiographical operation, until the past is documented, explained and understood, and further represented in the form of a history.

Monuments – tangible or intangible – are the traces of the past, used in the identity work of a cultural heritage community both as documentary, historical evidence and as meaningful, historical symbols. The symbolic and/or evidential value of a monument, as a realm of social memory, is based on history. Written or oral histories act as, and are used as, frame stories establishing the meaning of cultural heritage. Where digital game culture is concerned, these histories can be found, for example, in game magazines and online forums, in feature articles on (the development of) particular games, genres, developers, and devices, or in personal memoirs and personal gaming histories. A tangible or intangible monument, in its turn, serves as evidence and thus solidifies the plot and content of a heritage community’s historical self-comprehension. (Sivula 2015, 64–67.)

Histories are used during the cultural heritage process to highlight certain important moments and to attach some remnants of the past, i.e. monuments, to these highlighted moments of shared history. (Sivula 2015, 66.) Monuments are usually attached to the beginning of the historical story or to the turning points of the historiographically described process, that is, to historically important turning points or to the instance in which a progressive series of events starts to unfold. In Finland, for instance, the Commodore 64, the most popular home computer of the 1980s, is that sort of monument: it signifies the turn towards home computer gaming and the microcomputing age, and it functions as a media technological symbol for a certain generation of people. In Japan, the Nintendo Famicom console has a similar role, and a plethora of examples can be found in other countries.

Figure 3. The Pelaa! (Play!) exhibition at Salo Art Museum in Finland in 2009 is an example of how to give new meanings to game cultural objects. Pictured above, for example, is the Nokia mobile phone Snake game; both the Nokia cell phone and its Snake game are key objects in Finnish national frame stories of technology history. Photo: Petri Saarikoski.


In the monumentalisation process, the meaning of the object obviously transmutes from its original significance. J. C. Herz (61–62), for instance, richly describes this change in the videogaming context in her famous popular book on videogame history, Joystick Nation. In it she portrays an early coin-op videogame exhibition at the American Museum of the Moving Image, where the game cabinets’ new placement illuminated and underlined their novel contextualization. The machines were not situated as close to each other as they would have been in arcades, and this placement catalysed an aesthetic elevation in the author’s mind: “They are privileged with space, like statues or really expensive clothing, and thus become Design Objects. And this is when you realize, for the first time, that these cabinets, apart from containing your favourite videogames, are really just goddam beautiful.”

There is a plethora of games that are not actively played anymore. Some of them have already been forgotten, but some, nonetheless, have the potential to become monuments of digital culture. The cultural heritage potential of a game appears, most often, to rest on the argument of being “the first” or a “historical turning point.”

Figure 4. “Now it’s time to put the Finlandia hymn [composed by “the greatest composer of Finland” (Wikipedia) Jean Sibelius] on a record player, because the first, Finnish game has conquered the world”. Niko Nirvi's review of Sanxion (programmed by Stavros Fasoulas, published by Thalamus in 1986) in MikroBitti 12/1986, 72, illustrates how contemporaries are able to historicize games in a way that affects later historical writing.


We have noticed that the frame stories of the cultural heritage process of computer games are not global (though in many cases globalized), but are rather national histories. In Finland, there are already some popular histories available, and there is a vivid, ongoing discussion on the beginnings and turning points of digital gaming in Finland. The symbolic monuments have not yet been widely selected, but they are under historical construction (see e.g. figures 1, 3, 4). The usability of these selected items of cultural heritage depends on their historical value. Selected items can be used, for example, as unique celebrated artefacts in museums, and/or as commodified, copied, varied, and reproduced elements in retro- and heritage-industrial contexts. On the other hand, monuments can also be based on shared experiences: they need not be curiosities, unique items, or rarities with a particular cult status, but can rather be popular and international items such as the above-mentioned Commodore 64 computer or specific popular game products. In this case, however, we focus on a rarity as a potential monument. The next section of the paper deals with the case of the first commercial computer game in Finland.

Debate on the first Finnish commercial computer game

There are never-ending debates in different fields regarding what was the first of a particular type of invention, technology, media form, or something else. This debate has been recognized earlier, for example, by computer historians. The history of computer games and videogames is not an exception. The debate on what was the first video game or computer game has mainly been international – or essentially, US-oriented. Variations of this discussion can be found in almost every videogame history book or game studies textbook, which repeat stories and report new findings related to the American Tennis for Two, Spacewar!, Pong and so on. As national and local digital game historical representations of the past have begun to emerge, a similar debate has acquired domestic dimensions and bloomed into national versions. This has happened in Finland as well, mainly within computer and game hobbyist communities and in online discussion forums and publications.

Computer scientist and historian John A. N. Lee (57) provides several reasons for the “common desire to be associated with firsts” within the history of computing. On the one hand, it is certainly desirable to become recognized in history as an inventor, a founder, or a discoverer of the historical origins of an important phenomenon. On the other hand, the reasons can be economic: “Unique firsts do have a place in the identification of the owners of intellectual property rights with respect claims on patents, copyrights, and such.” Lee notes critically that in many cases it is difficult to define something as being the first, and continues: “Everyone likes firsts but the attraction is for fame and fortune rather than downstream usefulness—firsts are better left to the Guinness Book of Records than being the subject of endless, meaningless arguments in scholarly journals.” (See also Haigh.)

Overall, the discussion about the first digital game in Finland has primarily dealt with the issue of the first Finnish commercial computer game publication and not the very first Finnish (digital) game ever produced, perhaps because the publication question is less difficult to master: before commercial publications there was a rather uncertain phase of non-commercial amateur game projects, a period of producing and playing games on mainframe and minicomputers (Saarikoski 264). Some studies dealing at least partially with these earlier developments have appeared (see e.g. Saarikoski; Paju; Saarikoski & Suominen), as well as studies pondering the questions of the earliest computers and microcomputers in Finland (e.g. Suominen 2003; Saarikoski 2004; Paju).

Even though the question of the earliest Finnish commercial computer game release seems rather straightforward at first sight, it is much more complicated than that. Basically, we can challenge all of the elements of the question: what does “Finnish” mean? And what do we mean by a “computer game” or by “commercial”?

Let us now trace the tracks and marks of the online debate about the first commercial computer game in Finland, using Google search as a helper. It appears that there are only a few hits for the keywords “first Finnish computer game” or “first Finnish video game” (during the writing of the first manuscript of this paper in spring 2014). However, the Dome.fi site, for example, which has focused on forms of popular culture such as television, cinema, and games, contains various articles and discussions about the issue. Jukka O. Kauppinen, a pioneering game journalist and one of the key persons in the historicisation of digital gaming in Finland (Suominen 2011; Suominen et al. 2015), published, along with Miikka Lehtonen and Teemu Viemerö, an article about the early years of the Finnish game industry and the “first Finnish games” on the 1st of December 2013. The authors opened their article with a summary introduction and referred to an earlier text on the 30-year anniversary exhibition of the Finnish game industry. The exhibition had been organized for the DigiExpo2013 fair by FIGMA, the association of Finnish game importers. Game distributor firms trace their history to the establishment of Petri Lehmuskoski’s company, Toptronics, in 1983. In the article, Kauppinen and his colleagues stated that not only importing, but also the production of the first commercial games, began in Finland 30 years prior (Viemerö et al. 1.12.2013).

The above-mentioned writers noted that the company Amersoft was probably the first game publisher in Finland. They went through the company’s different phases by introducing its early releases, as well as those of some other publishers. They discussed the following games: Joe the Whizz Kid (1985), RahaRuhtinas (1984, Amersoft), Sanxion (1986), Uuno Turhapuro muuttaa maalle (1986, Amersoft), Painterboy (1986), Delta (1987), Quedex (1987), Octapolis (1987), and Netherworld (1988). About Amersoft, they wrote:

The book publisher Amersoft was probably the first Finnish game publisher, and its contribution to the domestic game field was very significant. The best knowledge available suggests that the first domestic commercially published game was RahaRuhtinas [“Money Prince”], a pseudo-3D adventure game that came out in 1984. Little information about the game’s aims or storyline remains for future generations; however, according to some recollections, the Finnish adventure game was quite functional and entertaining. (Viemerö et al. 1.11.2013.)

Figure 5. Raharuhtinas represented in the Dome online magazine article 1st of December 2013.


Obvious sources for tracing popular knowledge of game cultural histories are the main social media platforms, particularly Wikipedia, as well as game historical vlogs on YouTube. In the winter of 2013–2014, Finnish Wikipedia’s chronological list of Finnish games stated that RahaRuhtinas was the first game (Wikipedia: Suomen videopelialan historia 30.11.2013). Wikipedia referred to another of Jukka O. Kauppinen’s articles, published on June 27, 2011. The article was titled “Is this the first Finnish game ever?” There, Kauppinen noted that “who knows how long the search for the first Finnish commercial computer game has lasted, and there has not been a definitive answer to the question so far. Although there are several good candidates.” Kauppinen first mentioned the Yleisurheilu (Track and field sports) game for the Commodore 64, released in 1985 by Amersoft, and stated that RahaRuhtinas had an even earlier release date. He continued: “According to some claims, there are some older Vic-20 games as well, but it seems that no very exact evidence about them can be found at the moment” (Kauppinen 27.6.2011). In his article, Kauppinen also referred to a discussion that took place in April 2011 in the MuroBBS online discussion forum. However, Raharuhtinas was not actually mentioned there, only more recent commercial games and older non-commercial games (MuroBBS 14.4.2011). Obviously, it is worthwhile to follow article links and references and trace their mutual connections and cross-references in an ongoing loop bouncing between Wikipedia entries, online articles, and message boards.

Information on Amersoft and Raharuhtinas became more specific in 2013 and in spring 2014. In autumn 2013, the game historians, hobbyists and collectors Markku Reunanen, Mikko Heinonen and Manu Pärssinen published an article about the history of Finnish games in the Finnish Yearbook of Game Studies (Pelitutkimuksen vuosikirja 2013). Their article was based on their database of Finnish games, published at the Videogames.fi site. They claimed: “So far the oldest finding is an adventure game Raharuhtinas, programmed by Simo Ojaniemi and published by Amersoft in the year 1984.” On the 14th of December 2013, however, Videogames.fi was updated and a new game appeared. The site stated that the first game, also programmed by Simo Ojaniemi, was called Mehulinja (Juice line), not Raharuhtinas: “[Mehulinja] requires a VIC-1211 Super Expander extension. According to our current information, Mehulinja is the first commercially published computer game. The game won the ‘I came, I made, I won’ programming contest in 1984.” The example shows how researchers, at least, were careful when claiming something to be the first.

Videogames.fi refers to another website called Sinivalkoinen pelikirja (http://sinivalkoinenpelikirja.com/) (Blue-White Game Book [the colours referring to the Finnish national flag]), which published a review of the Mehulinja game on 22 March 2013. The Sinivalkoinen pelikirja site was connected to an ongoing book project, a chronicle of Finnish game history. The book was published in spring 2014. On the one hand, the book, written by journalist Juho Kuorikoski and based on the website, claimed that RahaRuhtinas is “as far as we know, the first commercial Finnish game for Commodore 64.” Kuorikoski mentioned three “small games” programmed by Simo Ojaniemi for the VIC-20 and published in the same year: Mehulinja, Herkkusuu (Sweet Tooth) and Myyräjahti (Vole Hunt) (Kuorikoski 12). On the other hand, he declares that Raharuhtinas was the first Finnish game released (20) and that Yleisurheilu was only “one candidate for being the first Finnish game ever” (25). That variation demonstrates the uncertainty surrounding the first.[6]

Figure 6. Mehulinja entry on sinivalkoinenpelikirja.com website.


A similar updating of information has happened on the YouTube channel of the user AlarikRetro. He published a video review – another type of history – of Raharuhtinas on December 8th, 2013 and remarked that the game was the first Finnish release. Only a few days later, on the 14th of December, he added an edit in which he referred to the Videogames.fi site and stated that Mehulinja was actually the first (AlarikRetro 8.12.2013 and AlarikRetro 27.12.2013). There are similar debates on other hobbyist sites.

In sum, the question of the first game has not been settled, although it has received growing interest. Then, in July 2014, a novel turn took place, when Manu Pärssinen and Markku Reunanen discovered a new, older candidate, which might have been the first commercial computer game in Finland: Chesmac, a game programmed by Raimo Suonio in 1979 for the Telmac 1800 home computer. According to Suonio, the game, released by the computer retailer Topdata, sold 104 copies. Pärssinen and Reunanen published several documents related to the game, such as scanned photos of the game’s manual and an interview with the programmer (Pärssinen & Reunanen 28.7.2014). The news of this new first circulated in online magazines as well as in newspapers (Kauppinen 28.7.2014; Berschewsky 28.7.2014). Finally, the leading Finnish newspaper, Helsingin Sanomat, published an interview with the programmer Raimo Suonio (Jokinen 10.8.2014). Thus, the history of Finnish commercial game releases turned out to be at least five years longer than previously thought, and it garnered major public coverage in Finland for the first time. It therefore appears that the discussion amongst hobbyists and researcher-hobbyists has emerged and strengthened during the last few years.[7]

We would argue that such interest in discussing and representing the past is related not only to game collecting, or to a sort of hobbyist retrogaming boom, but also to an emerging international interest in digital game preservation and exhibitions, and to a turn towards research on national and local aspects of games and game cultures (see also an English blog post on the history of Finnish digital games: Skäpädi Pöy 28.8.2013). This shift was also connected to the organization and recognition of the Finnish game industry. It is a sign of the legitimization and institutionalization of digital games in society.

Figure 7. Helsingin Sanomat titled their interview as "Raimo Suonio, a pioneer of Finnish game developers. [...] developed the first commercial computer game in Finland." In the photo, Suonio holds his old Telmac 1800 computer.


However, there has not yet been significant discussion about the first Finnish commercial game outside the hobbyist and academic communities, even though such discussion seemed to be emerging at the time of writing this article in autumn 2014.[8] One cannot, for example, find many earlier mentions of first games in the database of the largest Finnish newspaper, Helsingin Sanomat, nor in many other newspapers published by the same corporation. The references are from the 2000s and they are not connected to the first ever Finnish commercial game, but rather to the first Finnish publication for a certain new platform, such as the first game for the PS3 (Digitoday 27.4.2007), the PS4, the Nintendo Wii (Kauppalehti 23.6.2009, 14–15), the Steam downloading platform (Digitoday 13.9.2006), etc. These mentions thus belong to a contemporary discussion in which the importance of the game industry has been acknowledged and in which turning points are aimed at explaining contemporary use, applicable only to future history writing. The issues are distinctively connected to the economy, the ICT sector, and the new cultural industry.

When Chesmac, Mehulinja, Raharuhtinas and other games were published in the late 1970s and the early 1980s, the game industry was an undeveloped field internationally. Historical understanding or awareness had not been established, not even among game developers and players. The establishment of Finnish computer hobbyist and game-oriented publications from the mid-1980s onwards created the public space needed for the creation and construction of historical understanding amongst hobbyists and players (see Saarikoski 2004; Suominen 2011). The press created hero stories about the earliest individual Finnish game designers, occasionally introduced the first releases in a certain genre (the first adventure game, etc.) (Saarikoski 2004, 264), or underlined the historical importance of some new releases (such as the Sanxion game, published in 1986). Amersoft, however, had a somewhat marginal role in the early magazines, even though some of its publications were reviewed and it advertised its products, primarily books but also some games. These early computer hobbyist magazines, and game magazines later on, including the above-mentioned hero stories – which usually related the histories of individual programmers or game designers as computer users, gamers and developers – have acted as sources as interest in the early phases of the Finnish game industry has emerged since the early 2000s.

At the beginning of the 2000s, new interest in the development of the Finnish game industry and game education emerged. This was due to several interconnected reasons. International success stories, such as the Max Payne (2001) PC game developed by the Finnish company Remedy, raised interest in the game industry. Importantly, this was situated in the international trend of new cultural and creative industries. Likewise, the triumph of the cell phone corporation Nokia created an information and communication technology boom which, to a minor extent, focused on mobile game software development.

Several game industry and education reports were published. Even though they mostly referred to the national history of the game industry (typically excluding non-digital games, for example) only cursorily, they articulated a more general trend related to the production of a game historical narrative: the significant branch of industry had its roots. However, the origin story of the game developers themselves did not mention Chesmac, Mehulinja, Raharuhtinas or other early games, but was fastened to the so-called demoscene phenomenon, because some key persons of the focal firms, like Remedy and Housemarque, had their background in the late 1980s and early 1990s demoscene (on the demoscene’s role in the Nordic game industry, see Jørgensen et al. 2015). The demoscene origin story was introduced in interviews of firm personnel in computer and game magazines and newspapers in the late 1990s. Later on, it has grown into a myth which has been repeated in publications as well as in interviews of early game developers (e.g. Niipola 51–62; Kuorikoski 36–38).

But as we have argued, the primary “boom of the first” has started to emerge in the last few years. It has mixed ingredients from new success stories of the Finnish game industry, the post-Nokia context, the establishment of retrogaming, anniversaries, as well as the “awakenings” of memory organizations and researchers to questions of game history and preservation. What has happened? Who uses history, for what, and why?

The First Game is both a piece of historical evidence and a symbol

We argue that the cultural heritage process of digital games has reached a new phase, and the Finnish heritage community of digital games is actively involved in a new kind of identity work. The institutionalization of this new type of heritage has begun. The cultural heritage process of digital gaming can be observed in the context of the different levels of the cultures of history.

Oral and written histories are produced in three different fields. First, there is the academic field of history-cultural activities, consisting of academic rules, refereed publications and academically trained researchers with doctoral degrees. Histories are based on source criticism and the supplementary rules of academic research. Secondly, there is the field of the public, which consists of politically controlled and publicly funded processes of cultural heritage with less strict academic control, but with much more discussion and monetary involvement. The institutionalization of cultural heritage often takes place on this second level of the cultures of history. It is conducted through political decisions, and there is no specific means of controlling the credibility of a frame story. The third field is the field of amateurs: individuals and groups of hobbyists, even families, selecting meaningful things from the more or less authentic remains of their pasts. The amateur is permitted to choose whatever (elements of) heritage and use any kind of frame story as an argument, without an obligation to put the arguments to any kind of test. The three fields of cultural heritage are interrelated. Amateurs are often extremely active in the second field of the cultures of history. An amateur may find academic research useful as a frame story that gives meaning to her/his own cultural heritage. An academic researcher or a politician may also be an enthusiastic amateur, and an academic researcher often uses academic competences to promote the cultural heritage process and consolidate the cultural heritage value of the historical remnants of her own hobby. (Sivula 2013, 163; Aronsson 43.)

The case we described above shows that the first game historians were usually not “proper” professional historians, but rather historically oriented amateurs. The active heritage community, in our case, consisted of hobbyists.

It seems to be quite common that the historicisation of a new culture begins among communities or groups of amateur historians who are themselves involved in the historical process. (Cf. the history of computing and Lee 1996.) Because of this involvement, we refer to them as participatory historians. Amateur popular historians often use specific period-related concepts as metaphors or rhetorical elements.[9] Accurate or not, the amateur historian has already marked the turning points of the story by the time an academic professional historian begins the research work. The preliminary plot of the historical narrative, suggesting the argument for valuable cultural heritage, is often constructed by amateurs.

The plot of a history has, at least, a beginning and an end, and a change in between them. The emplotment of a history consists of defining the origins of the historicized phenomenon’s life cycle, marking some turning points of the process, and constructing the end of the presentation. In presentations of the history of digital gaming, there have been international discussions on what actually was the first game. Battles over what came first are common in discussions of phenomena that are not yet historicized, although they can continue afterwards as well. A historian of an incomplete process is strongly interested in the beginnings of the process and the origins of the phenomenon.

Neither the beginning nor the end of a historical narrative is usually self-evident. The first and the last fact of a historical series are often chosen from among several options. The defining of an origin, the beginning of the story, is an act of interpretation. It is, however, not an arbitrary one. The professional historian’s choice must be based on evidence. The interpretations are built in negotiations (Foucault 34; Ricoeur 143–144). The plot of a written or orally solidified history determines the experienced value of the cultural heritage. The original game is experienced as historically more valuable than a successor or a copy.

According to Michel Foucault, the past is an irregular chaos of events, and an oral or written history organizes these events (Foucault 34–35). History gives comprehensibility to the past and solidifies the connections between separate events, building series of events and a sense of time and temporality. The oral or written, amateur or professional history, as a frame story of the cultural heritage process, solidifies the symbolic function of a monument.

There are some regular phases in every cultural heritage process. In our case, the digital game is originally used, functioning and experienced as a game. In the new context, though, when it is defined within a historical frame story, it begins to be used and experienced as cultural heritage, either as a tool for building the temporal identity of a heritage community, e.g. a group of players, or as a tool for building the public image or other communicative activity of an enterprise or other corporation. Likewise, it can be used by the state or by international organizations, which for these goals also use all the other institutionalized cultural resources, such as education or cultural production. In the cultural heritage process, the use, function and experience of the game all change. The public or private heritage community has either an active or a more or less subconscious goal of increasing the symbolic value of the game. The game with increased symbolic value, cultural heritage value, can still be played, although it might represent outdated technology and design.

When public resources and the academic field of history culture are involved in the cultural heritage process, the histories used as frame stories are most often based on academic, professional research. The interpretations pass normal academic quality control. In the field of amateurs and in the private field the rules are different, but in many cases academic sub-contractors are hired to produce the frame story.

When an object, e.g. a digital game, is identified as a symbol or evidence of the history shared by a group of the digital cultural heritage community, it receives a new social function. It is no longer only a game, but a monument or a place of memory. It is used in commemorative rituals, either with a playful sense of retro or in a more serious spirit of remembering the past. It becomes a tool of identity work. (See also Heineman.) Sooner or later, it may be rejected, changed, found to be useless, or replaced with another, more accurate tool, as we have learned from the changing definition of the first commercial computer game in Finland. Or the community, whose identity tool the cultural heritage was, may disband and move on (Bohman 17–23; Sivula 2013, 161–164).

Conclusion

Digital game culture is a unique field of contemporary culture, and a very interesting one at that. Our case study opens a view onto the historiographical operations of participatory historians. It aids us in understanding the strengths and weaknesses, risks and opportunities of the historiographical practice related to monuments, and it helps to develop the methodology for analysing the historiographical operations that historicize contemporary culture. To be critical, we ought to know how the monument of the first digital game was erected.

In most cases of the production of new monuments, the role of the amateur field has been essential. The production of monuments is a part of historiographical operations, and it is clearly located in the documentary and representative phases of the model of the historiographical operation presented by Paul Ricoeur.

The right to choose a monument of digital game culture cannot be monopolized by either academics or amateurs. In our case, both academics and computer game hobbyists were active in selecting objects that they considered worth preserving and in creating monuments of Finnish game culture. In the case of the cultural heritage process of Finnish computer games, the academic field of history culture closely and continuously interacts with the history-cultural field of amateurs. Many actors in the academic field also have a position in the field of amateurs; in other words, there are many computer game hobbyists among the academic researchers of the history of digital culture. The historiographical operation of digital games produces a plethora of monuments.

The question of what was the first game becomes important in the representation phase of the historiographical operation. That is the phase where the plot of history is created. The question of what came first is often already answered even before a professional historian gets an opportunity to draw any conclusions.

We can conclude that there are some preconditions for a reliable definition of firstness where digital games are concerned. All concurrent definitions must be observed critically, paying attention to the goals and needs of the inventors of the monuments.

First, there is the contemporary definition. A chronicling actor has a motive to spot and articulate a new field, a turning point, or a milestone. The actor wishes to claim that something important, even revolutionary, has happened. We must notice who is acting and why.

Second, there is the retrospective definition. Usually, it is connected to a situation and phase in which a certain field of action is the subject of reformation and redefinition. A need for birth stories and origin stories arises when legitimizing a cultural industry and the several organizations related to it. In this case as well, economy and politics have a certain role in the process. There is a supply of and demand for money.

Third, there is the specified retrospective definition. This happens, for example, when celebrating anniversaries. In Finland and within digital game cultures, this sort of definition did not occur until recent years and the celebrations of the 30th anniversary of commercial game development and of digital game importing businesses.

The knowledge related to what came first might become more exact, although this is not necessarily the case. A contemporary definition of what has been the first does not occur if the phenomenon does not feel significant to contemporaries – if they do not comprehend that they are living through “historical moments.” In the Finnish case, it was not until the publication of “the first Finnish adventure game,” a release in a specific popular genre, that the rhetoric of firsts was actually launched. Another option is that contemporaries do not comprehend something as being first: this applies to the questions of what is Finnish, what is a game, and what is commercial. Because the definitions of all three aspects are controversial, it is difficult to define something as the first Finnish commercial game publication.

The question of the first functions on at least two levels: on the one hand, it can deal with a particular first (the first game ever), but this is inherently difficult and, in many cases, not necessary to define. On the other hand, questions regarding firstness are connected to larger turning points, which are less difficult to outline: there is, for example, no doubt that the Commodore 64 was the first popular home computer in Finland and the first popular computer gaming device available.

 

Acknowledgements: We are grateful to the Kone Foundation for funding the Kotitietokoneiden aika ja teknologisen harrastuskulttuurin perintö [Home Computer Era and the Heritage of Technological Hobby Culture] project, and the Academy of Finland for funding Ludification and the Emergence of Playful Culture (decision #275421). In addition, we thank the two anonymous referees for their useful comments.

 

Works Cited

Interviews

Reunanen, Markku 5.3.2014, Facebook chat with Jaakko Suominen.

Magazines and newspapers

Digitoday 2007

Kauppalehti 2009

MikroBitti 1984–1986

Poke&Peek 1983–1984

Online articles, videos and discussion forums (retrieved 13 June 2014)

Alarik: “RahaRuhtinas (C64)” Alarik – muistoista näytölle 27.12.2013.

AlarikRetro: “RahaRuhtinas (C64): Videoarvostelu” YouTube-video, published 8.12.2013.

Berschewsky, Tapio: “30 vuotta ennen Angry Birdsiä – Tämä on ensimmäinen kaupallinen suomalaispeli. Ilta-Sanomat Online” Ilta-Sanomat Online 28.7.2014.

Heinonen, Mikko: “Suomipelien kronikka” V2.fi 6.12.2009.

Jokinen, Pauli: “Raimo Suonio on Suomen pelintekijöiden pioneeri.” Helsingin Sanomat 10.8.2014.

Kauppinen, Jukka O.: “Onko tämä ensimmäinen suomalainen peli ikinä?” Dome.fi 27.6.2011.

Kauppinen, Jukka O.: “Suomalainen peliala 30 vuotta? Ehei, uusi löytö ajoittaa ensimmäisen kaupallisen suomipelin vuoteen 1979!” Dome.fi 28.7.2014.

MuroBBS discussion forum, chain: “Ensimmäinen Suomalainen videopeli?”, started 14.4.2011 at 19:26.

Pärssinen, Manu & Reunanen, Markku: “Ensimmäinen suomalainen tietokonepeli.” V2.fi 28.7.2014.

Rautanen, Niila T.: C= inside, Finnish Commodore Archive.

Sinivalkoinenpelikirja.com

Skäpädi Pöy: “A History of Finnish Games, Part 1” FRGCB (Finnish Retro Game Comparison Blog) 28.8.2013.

Suomen Pelinkehittäjät Ry: “Suomen pelialan lyhyt historiikki” 3.10.2011.

Videogames.fi

Viemerö, Teemu – Lehtonen, Miikka – Kauppinen, Jukka O.: “Suomalaisen pelialan varhaiset vuodet ja ensimmäiset suomalaiset pelit” Dome.fi 1.11.2013.

Wikipedia: Game canon. Last modified on 27 February 2014 at 18:46.

Wikipedia: Suomen videopelialan historia. Last updated: 30.11.2013 at 19:57.

Literature (all links checked 18 February 2016)

Aronsson, Peter. 2004. Historiebruk – att använda det förflutna. Lund: Studentlitteratur.

Barwick, Joanna – Dearnley, James – Muir, Adrienne. 2011. “Playing Games with Cultural Heritage: A Comparative Case Study Analysis of the Current Status of Digital Game Preservation.” Games and Culture, July 2011; vol. 6, 4: 373–390.

Bohman, Stefan. 1997. Historia, museer och nationalism. Stockholm: Carlssons.

Bohman, Stefan. 2003. ”Vad är museivetenskap och vad är kulturarv?” Palmqvist, Lennart, Bohman, Stefan (2003) Museer och kulturarv. Stockholm: Carlssons, pp. 9–24.

Bortolotto, Chiara. 2007. “From objects to processes: UNESCO’s ‘intangible cultural heritage’.” Journal of Museum Ethnography, No. 19, ‘Feeling the Vibes: Dealing with Intangible Heritage’: Papers from the Annual Conference of the Museum Ethnographers Group Held at Birmingham Museum & Art Gallery, 18–19 May 2006 (March 2007), pp. 21–33.

Foucault, Michel. 2005. Tiedon arkeologia [L’archéologie du savoir, 1969]. Tampere: Vastapaino.

Guins, Raiford. 2004. “‘Intruder Alert! Intruder Alert!’ Video Games in Space.” Journal of Visual Culture, August 2004; vol. 3, 2: pp. 195–211.

Guins, Raiford. 2014. Game After. A Cultural Study of Video Game Afterlife. Cambridge, Massachusetts, London, England: The MIT Press.

Guttenbrunner, Mark – Becker, Christoph – Rauber, Andreas. 2010. “Keeping the Game Alive: Evaluating Strategies for the Preservation of Console Video Games.” International Journal of Digital Curation, Vol 5, No 1 (2010), 64–90.

Haigh, Thomas 2012. “Seven Lessons from Bad History. Journalists, historians, and the invention of email.” Communications of the ACM, September 2012, vol. 55, no. 9, 26–29.

Heineman, David S. 2014. “Public Memory and Gamer Identity: Retrogaming as Nostalgia.” Journal of Games Criticism, Vol 1, No 1 (2014).

Heinonen, Mikko and Reunanen, Markku. 2009. “Preserving Our Digital Heritage: Experiences from the Pelikonepeijoonit Project.” History of Nordic Computing 2. Second IFIP WG 9.7 Conference, HiNC2. Turku, Finland, August 2007. Revised Selected Papers. John Impagliazzo, Timo Järvi & Petri Paju (Eds.). Berlin, Heidelberg, New York: Springer.

Herz, J. C. 1997. Joystick Nation. How Videogames Gobbled Our Money, Won Our Hearts and Rewired Our Minds. London: Abacus.

Koselleck, Reinhart. 1985. Futures Past. On the Semantics of Historical Time. Translated by Keith Tribe. Cambridge, MA: MIT Press.

Kuorikoski, Juho. 2014. Sinivalkoinen pelikirja. Sl: Fobos Kustannus.

Le Goff, Jacques. 1978. “Documento/monumento.” Encyclopedia Einaudi 5, Torino: Einaudi, 38–48.

Lee, John A. N. 1996. “’Those Who Forget the Lessons of History Are Doomed to Repeat It’, or, Why I Study the History of Computing.” IEEE Annals of the History of Computing, Vol. 18, No. 2, 1996, pp. 54–62.

Lowood, Henry, Monnens, D.,  Armstrong, A., Ruggill, J., McAllister, K. & Vowell, Z. 2009. Before It’s Too Late: A Digital Game Preservation White Paper. Game Preservation Special Interest Group, International Game Developers Association.

McDonough, Jerome P. – Olendorf, Robert – Kirschenbaum, Matthew – Kraus, Kari – Reside, Doug – Donahue, Rachel – Phelps, Andrew – Egert, Christopher – Lowood, Henry – Rojo, Susan. 2010. Preserving Virtual Worlds Final Report. Urbana-Champaign, Illinois: IDEALS.

Newman, James. 2012. Best Before. Videogames, Supersession and Obsolescence. London & New York: Routledge.

Niipola, Jani. 2012. Pelisukupolvi. Suomalainen menestystarina Max Paynestä Angry Birdsiin [Game Generation. A Finnish Success Story from Max Payne to Angry Birds]. Helsinki: Johnny Kniga.

Nora, Pierre. 1989. “Between Memory and History: Les Lieux de Mémoire [1984].” Representations 26, Spring 1989, 7–25.

Nora, Pierre. 1998. “Era of Commemoration.” Realms of Memory: The Construction of the French Past, Vol. 3, Symbols. New York: Columbia University Press, 609–636.

Paju, Petri. 2003. “Huvia hyödyn avuksi jo 1950-luvulla – Nim-pelin rakentaminen ja käyttö Suomessa.” WiderScreen 2-3/2003.

Paju, Petri. 2008. ”Ilmarisen Suomi” ja sen tekijät. Matematiikkakonekomitea ja tietokoneen rakentaminen kansallisena kysymyksenä 1950-luvulla. Turun yliopisto, Turku.

Preserving.Exe. Toward a National Strategy for Software Preservation. 2013.

Reunanen, Markku – Heinonen, Mikko – Pärssinen, Manu. 2013. ”Suomalaisen peliteollisuuden valtavirtaa ja sivupolkuja.” Suominen, Jaakko – Koskimaa, Raine, Mäyrä, Frans – Saarikoski, Petri – Sotamaa, Olli (Eds.), Pelitutkimuksen vuosikirja 2013, Tampere: Tampereen yliopisto, 13–28.

Ricoeur, Paul. 2000a. Tulkinnan teoria. Helsinki: Tutkijaliitto.

Ricoeur, Paul. 2000b. La mémoire, l’histoire, l’oubli. Paris: Seuil.

Riegl, Alois. 1903/1996. “The Modern Cult of Monuments: Its Essence and Its Development.” In Nicholas Stanley Price, M. Kirby Talley Jr. and Alessandra Melucco Vaccaro (Eds.): Historical and Philosophical Issues in the Conservation of Cultural Heritage. Los Angeles: Getty Conservation Institute, 69–83, <See also http://isites.harvard.edu/fs/docs/icb.topic822683.files/Riegl_The%20Modern%20Cult%20of%20Monuments_sm.pdf>.

Rosenthal, David S. H. 2015. Emulation & Virtualization as Preservation Strategies.

Saarikoski, Petri. 2004. Koneen lumo. Mikrotietokoneharrastus Suomessa 1970-luvulta 1990-luvun puoliväliin [The Lure of the Machine. Micro computer hobbyism in Finland from the 1970s to the mid-1990s]. Jyväskylän yliopiston nykykulttuurin tutkimuskeskuksen julkaisuja 67. Jyväskylä: Jyväskylän yliopisto.

Saarikoski, Petri & Suominen, Jaakko. 2009. “Computer Hobbyists and the Gaming Industry in Finland.” IEEE Annals of the History of Computing, July-September 2009 (vol. 31 no. 3), pp. 20–33.

Sivula, Anna. 2010. “Menetetyn järven jäljillä. Historia osana paikallista kulttuuriperintöprosessia.” Medeiasta pronssisoturiin – Kuka tekee menneestä historiaa. Toim. Pertti Grönholm & Anna Sivula. Turku: THY, 21–37.

Sivula, Anna. 2013. ”Puuvillatehtaasta muistin paikaksi. Teollisen kulttuuriperintöprosessin jäljillä.” Outi Tuomi-Nikula, Riina Haanpää & Aura Kivilaakso (toim.): Mitä on kulttuuriperintö? Tietolipas 243. Helsinki: Suomalaisen Kirjallisuuden Seura, 161–191.

Sivula, Anna. 2014. “Corporate History Culture and Useful Industrial Past. A Case Study on History Management in Finnish Cotton Factory Porin Puuvillatehdas Oy.” Folklore. Electronic Journal of Folklore 57 (2014), 29–54.

Sivula, Anna & Siro, Susanna. 2015. “The Town Scale Model as an Artefact and Representation of the Past.” Finskt Museum 2013-2015. Helsinki: Suomen muinaismuistoyhdistys ry, pp. 207–221.

Sivula, Anna. 2015. ”Tilaushistoria identiteettityönä ja kulttuuriperintöprosessina. Paikallisen historiapolitiikan tarkastelua.” Kulttuuripolitiikan tutkimuksen vuosikirja 2015. Jyväskylä: Kulttuuripolitiikan tutkimuksen seura, 56–69.

Smith, Laurajane. 2006. Uses of Heritage. London: Routledge.

Software Preservation Network Proposal. 2015.

Suominen, Jaakko. 2003. Koneen kokemus. Tietoteknistyvä kulttuuri modernisoituvassa Suomessa 1920-luvulta 1970-luvulle [Experiences with Machines. Computerised Culture in the Process of Finnish Modernisation from the 1920s to the 1970s.]. Tampere: Vastapaino.

Suominen, Jaakko. 2008. “The Past as the Future? Nostalgia and Retrogaming in Digital Culture.” Fibreculture, issue 11 (digital arts and culture conference (perth) issue), 2008.

Suominen, Jaakko. 2011. “Game Reviews as Tools in the Construction of Game Historical Awareness in Finland, 1984–2010: Case MikroBitti Magazine.” Proceedings of Think, Design, Play – Digra2011 conference. Utrecht School of the Arts, Hilversum 14-17 September 2011. DiGRA Digital Library.

Suominen, Jaakko. 2012. “Mario’s legacy and Sonic’s heritage: Replays and refunds of console gaming history.” Proceedings of DiGRA Nordic 2012. Raine Koskimaa, Frans Mäyrä, Jaakko Suominen (Eds.) Tampere, University of Tampere. DiGRA Digital Library.

Suominen, Jaakko – Reunanen, Markku – Remes, Sami. 2015. “Return in Play: The Emergence of Retrogaming in Finnish Computer Hobbyist and Game Magazines from the 1980s to the 2000s” Kinephanos – Canadian Journal of Media Studies.

Swalwell, Melanie. 2009. “Towards the Preservation of Local Computer Game Software: Challenges, Strategies, Reflections.” Convergence: The International Journal of Research into New Media Technologies, August 2009; vol. 15, 3: 263–279.

Thompson, Jason – McAllister, Ken S. – Ruggill, Judd E. 2009. “Onward Through the Fog: Computer Game Collection and the Play of Obsolescence.” M/C Journal, Vol. 12, No 3 (2009) – ‘obsolete’.

Whalen, Zack & Taylor, Laurie N. 2008. “Playing the Past. An Introduction.” Playing the Past. History and Nostalgia in Video Games. Zack Whalen and Laurie N. Taylor (Eds.) Nashville: Vanderbilt University Press, 1–15.


Notes

[1] See, for example, The Game Canon proposed for the Library of Congress, consisting of games such as Spacewar!, Tetris and Doom and selected by a committee comprising game historian Henry Lowood, game designers Warren Spector, Steve Meretzky and Matteo Bittanti, as well as blogger Christopher Grant.

[2] We thank referee number two for giving us information on some more recent software preservation projects: Preserving Virtual Worlds Final Report (2010); Preserving.Exe. Toward a National Strategy for Software Preservation (2013); Emulation & Virtualization as Preservation Strategies (2015); Software Preservation Network Proposal (2015).

[3] The constructionistically oriented researchers of heritagization, e.g. Laurajane Smith, do not use the concept of monument in the sense we do. From the point of view of the historicization of a tangible or intangible object, the concept of monument is useful.

[4] The specific group, working with its identity in the process of cultural heritage, can be named as cultural heritage community.

[5] The line between artifacts/monuments and commodities becomes less clear when old devices and game software are bought and sold at Internet auction sites. Various music videos, works of art, books and new editions and revisions of old game products – to some degree commercials as well – are also commodities of the cultures of history (Author 2 & Author 1 2004). (See Suominen 2008; 2012.)

[6] In a Facebook chat discussion with Jaakko Suominen, Markku Reunanen explains the background of rewriting the history of the first game. According to Reunanen, they received new information while browsing the online Finnish Commodore archive maintained by hobbyist Niila T. Rautanen (Rautanen: Commodore Archive). Rautanen has gathered games, screenshots and other information, and has, for example, scanned early issues of Poke&Peek, the Commodore magazine published by the Finnish Commodore importer. The magazines proved to be an important source of information. Amersoft had released several games in 1984, and according to Reunanen, the publication order of the 1984 releases given in Videogames.fi was based mainly on reasoning: the VIC-20 computer was simpler than the Commodore 64 and the popularity of the VIC was decreasing in 1984. Reunanen states that Raharuhtinas for the Commodore 64 represents “more advanced programming” and that Mehulinja had won an earlier VIC-20 programming contest. (Reunanen 5.3.2014, FB chat.)

[7] In addition to Jukka O. Kauppinen, Mikko Heinonen from the Pelikonepeijoonit collector community, started in the 1990s, has contributed specifically to the discussion. For example, he published “A Chronicle of Finnish Games,” “in honor of Finnish Independence Day,” on 6 December 2009, in which he divided the history into “prehistory,” “middle ages,” and “modern times” (Heinonen 6.12.2009), started his “prehistory” from the Amersoft publications, and claimed, incorrectly, that Yleisurheilu was published in 1986. The association of Finnish Game Developers, for their part, published “A Short history of Finnish game industry” on their website in October 2011, in which they alleged that Sanxion by Stavros Fasoulas, published for the Commodore in 1986, was the first Finnish commercial game (Suomen Pelinkehittäjät Ry 3.10.2011). In fact, that game was the first major international Finnish computer game hit, released by the British company Thalamus, but not the first Finnish commercial game.

[8] The situation has partially changed since then, however, mainly because of the introduction of the Finnish Museum of Games project. The Museum, partially based on a crowdfunding campaign, will be opened in January 2017 (http://suomenpelimuseo.fi/in-english/).

[9] That is why, for instance, in the above mentioned case, a journalist has applied terms such as “pre-history”, “middle-ages” and “modern times” to game historical representations.

 

Bios

Jaakko Suominen has a PhD in Cultural History and is Professor of Digital Culture at the University of Turku, Finland. With a focus on the cultural history of media and information technologies, Suominen has studied computers and popular media, the internet, social media, digital games, and theoretical and methodological aspects of the study of digital culture. He has led several multidisciplinary research projects and has over 100 scholarly publications.

Anna Sivula has a PhD in History and is Professor of Cultural Heritage at the University of Turku, Finland. Sivula has studied theoretical, methodological and cultural aspects of the cultural heritage process and heritage communities, historiographical operations and historical culture. She has written commissioned histories and led several research projects.

Born Digital Cultural Heritage – Angela Ndalianis & Melanie Swalwell

The collection and preservation of the ‘born digital’ has, in recent years, become a growing and significant area of debate. The honeymoon years are over and institutions are finally beginning to give serious consideration to best practice for digital preservation strategies and the establishment of digital collections. Digital technology emerges and disappears with incredible speed, as a once-new piece of hardware or software becomes old and is replaced by the next technological advancement. What happens to the videogame software and hardware of the 1980s and 90s? To the web browsers, blogs and social media sites and the content they once displayed? To the artworks that relied on pre-2000 computers? Are these – amongst many other digital creations – fated to be abandoned, becoming only memories of individual experience? Are they to be collected by institutions as defunct objects? Or are they to be preserved and revived using new digital technology? These are but a few of the serious questions facing collecting institutions. The question of who is responsible for collecting, preserving and historicising born digital cultural heritage is a crucial one, as is the issue of best practice – what are the best ways to preserve and make accessible such born digital heritage?

In June 2014, our “Play It Again”[1] project team ran an international conference on “The Born Digital and Cultural Heritage” that aimed to convene a forum where some of these issues could be discussed. “Play It Again” was a three-year project focused on the history and preservation of microcomputer games written in 1980s Australia and New Zealand, but as the first digital preservation project to be funded as research in this part of the world (at least to our knowledge), it also had a broader significance. We tried to use it to raise awareness around some of the threats facing born digital cultural production more broadly, beyond 1980s digital games. Two of the project’s aims were to “Enhance appreciation for the creations of the early digital period” and “To build capacity in both the academic and cultural sectors in the area of digital cultural heritage and the ‘born digital’”, both critical issues internationally. Held as a two-day event at the Australian Centre for the Moving Image, Melbourne, the conference thus had a remit deliberately wider than the focus of the Australian Research Council Linkage Project.

The need for cooperation between different stakeholders – legislative bodies, professionals working in different types of institutions, and the private sector – was a key recommendation of the 2012 “Vancouver Declaration,” a Memory of the World initiative (UNESCO). Born digital artefacts often require multiple sets of expertise; our call for papers therefore invited proposals from researchers and practitioners in a range of disciplines, spheres of practice and institutional contexts concerned with born digital heritage. This included libraries, archives, museums, galleries, moving image institutions, software repositories, universities, and more besides. We wanted to create a space where communication between the different types of professionals dealing with the preservation of born digital cultural heritage could take place. Archivists, librarians, conservators, and moving image archivists share many challenges, yet, we suspect, they often attend profession-based conferences, which enforces a kind of silo-ing of knowledge. Particularly in small countries such as Australia and New Zealand, there is a need for conversations to take place across professional boundaries, and so we sought to bring people who perhaps don’t normally move in the same circles into contact.

The presentations during the conference ranged in approach from theoretical, to practical, to policy-oriented. We gloried in the range of papers that were presented. There were game histories, reflections on the demoscene, on net.art and other forms of media art, on born digital manuscripts, robots, twitter accounts and website archiving. As well as papers addressing different forms of heritage materials, there were also technical reports on the problems with hacking and patching disk images to get them to emulate, on software migration, and legal papers on copyright protection and the ‘right to be forgotten’. (Audio of many of the presentations is available here.) The variety of presentations made painfully visible the enormous task at hand in addressing born digital cultural heritage.

While Refractory focuses on entertainment media, in this issue we recognise that born digital entertainment media share many of the challenges of non-entertainment objects. Here, we have collected article versions of selected papers from the conference. The topics and subjects are varied – from those looking more broadly at approaches to born digital heritage and the preservation of digital art, to the documentation of and public discourse about early game histories, and to future creative writing practice facilitated through the collection of digital manuscripts.

In his paper “It Is What It Is, Not What It Was: Making Born Digital Heritage” (which was a keynote address), Henry Lowood examines the preservation and collection of digital media in the context of cultural heritage. Lowood is concerned with “the relationship between collections of historical software and archival documentation about that software” and poses the question “Who is interested in historical software and what will they do with it?” He argues that “answers to this fundamental question must continue to drive projects in digital preservation and software history”. Using the examples of ‘The Historian’, ‘The Media Archaeologist’ and ‘The Re-enactor’, his paper raises important questions about the function, purpose and varied approaches to the digital archive. The historian, he states, is interested in the digital archival material in order to interpret, reconstruct and retell its story in history. For the media archaeologist, “media machines are transparent in their operation” and, rather than requiring interpretation, speak of their pastness by making possible the playback of “historical media on historical machines”. Finally, for ‘The Re-enactor’, ‘authenticity’ is a crucial factor for digital preservation; however, the question of authenticity is fraught with debate. At one extreme, the re-enactor insists on a “fidelity of play” that engages with the technology (hardware and software) in its original state; at the other extreme is the re-enactor who is willing to forgo the historical machine in favour of emulation and virtualisation that recreate an embodied experience of ‘playing’ with the original software, whether a game or a word processing program. In either case, as Lowood explains, “Re-enactment offers a take on born-digital heritage that proposes a commitment to lived experience.”

In their article “Defining The Experience: George Poonkhin Khut’s Distillery: Waveforming, 2012”, Amanda Pagliarino and artist George Poonkhin Khut present an account of Khut’s sensory artwork Distillery: Waveforming 2012, which uses the prototype iPad application ‘BrightHearts’ and was acquired by the Queensland Art Gallery. The Curator of Contemporary Australian Art requested that the acquisition ensure the artwork “was captured in perpetuity in its prototype state”. The authors explain that this biofeedback artwork is ‘iterative’ and that Khut continued to develop the work in other iterations, including updates to the BrightHearts app for touch screen devices. The article describes the development of the artwork, the issues that were addressed in its acquisition and archiving, and the consultations that took place between the artist and the collecting institution. As the writers argue, “to secure the commitment of the artist to engage in collaborative, long-term conservation strategies is extraordinary and this has resulted in the Gallery acquiring an unparalleled archival resource” that includes documentation and description of the interactive principles and behaviour of the artwork in its early state and as it evolved in Khut’s art practice. This archival resource will make it possible for the work to be reinterpreted “at some point in the future when the original technology no longer functions as intended”. In this respect, Distillery: Waveforming is understood as a “legacy artwork intrinsically linked to past and future iterations” of Khut’s larger Biofeedback Project.

The next article, “There and Back Again: A Case History of Writing The Hobbit” by Veronika Megler, focuses on the iconic text adventure game The Hobbit (Melbourne House, 1981), which Megler co-wrote during the final year of her Bachelor of Science degree at Melbourne University. This paper is a case history of the development of The Hobbit (based on J.R.R. Tolkien’s novel of the same name) into a game that could run on the first generation of home computers that were just beginning to hit the market. Little has been written about the development of the first generation of text-based computer games; this case history provides insight into this developmental period in computer game history. Megler describes the development process, the internal design, and the genesis of the ideas that made The Hobbit unique. She compares the development environment and the resulting game to the state of the art in text adventure games of the time, and wraps up by discussing the game’s legacy and the recent revival of interest in the game.

Jaakko Suominen and Anna Sivula’s article “Participatory Historians in Digital Cultural Heritage Process — Monumentalization of the First Finnish Commercial Computer Game” continues with games, analysing how digital games become cultural heritage. By using examples of changing conceptualisations of the first commercial Finnish computer game, the article examines the amateur and professional historicisation of computer games. The authors argue that the production of cultural heritage is a process of constructing symbolic monuments that are often related to events of change or the beginning of a progressive series of events, and the article presents an account of the formation of games as symbolic cultural monuments within a Finnish context. Whilst many researchers and journalists have claimed that Raharuhtinas (Money Prince 1984) for Commodore 64 was the first Finnish commercial digital game, its status as such is controversial. As the authors explain, “in this paper, we are more interested in public discourse of being the first” and how this relates to the cultural heritage process. The case of the ‘first’ game, it is argued, illuminates how items are selected as building material for digital game cultural heritage.

In “Retaining Traces of Composition in Digital Manuscript Collections: a Case for Institutional Proactivity”, Millicent Weber turns to digital manuscripts, their collection, preservation and digital storage by collecting institutions. Weber argues that libraries, archives and scholars have not addressed the content of future digital or part-digital collections, or their capacity to support sustained scholarly research. This paper examines the potential content of future collections of poetry manuscripts and their capacity to support research into the process of composition. To predict this capacity, the article compares a study of compositional process, using handwritten and typewritten manuscripts, with a small-scale survey of early-career poets’ compositional habits. The draft manuscripts of three poems by the poet Alan Gould and three by the poet Chris Mansell are used to describe each poet’s compositional habits, while the survey component of the project obtained information about the drafting practices of 12 students of creative writing and poetry at the University of Canberra. Weber concludes that the results indicate both the great diversity of manuscript collections currently being created, and the importance of archival institutions adopting an active advocacy role in encouraging writers to create and maintain comprehensive and well-organised collections of digital manuscripts.

The collection and preservation of born digital cultural heritage is of critical importance. In the digital era, “Heritage refers to legacy from the past, what we live with today, and what should be passed from generation to generation because of its significance and value” (UNESCO/PERSIST Content Task Force 16). If we want to ensure that records and works from this era persist, we will need to substantially ramp up our efforts. Cooperation between different stakeholders is critical and the research sector has an important role to play, in undertaking collaborative research with cultural institutions to tackle some of the thornier challenges surrounding the persistence of born digital cultural heritage.

Works cited

UNESCO. “UNESCO/UBC Vancouver Declaration, The Memory of the World in the Digital Age: Digitization and Preservation.” N.p., 2012. Web. 17 Dec. 2012.

UNESCO/PERSIST Content Task Force. “The UNESCO/PERSIST Guidelines for the Selection of Digital Heritage for Long-Term Preservation.” 2016. Web.

 

[1] The “Play It Again” project received support under the Australian Research Council’s Linkage Projects funding Scheme (project number LP120100218). See our research blog and the “Popular Memory Archive” for more information on the project.

 

Bios

Associate Professor Melanie Swalwell is a scholar of digital media arts, cultures, and histories. She is the recipient of an ARC Future Fellowship for her project “Creative Micro-computing in Australia, 1976-1992”. Between 2011 and 2015, she was Project Leader and Chief Investigator on the ARC Linkage Project “Play It Again”. In 2009, Melanie was the Nancy Keesing Fellow (State Library of New South Wales). She has authored chapters and articles in both traditional and interactive formats, in such esteemed journals as Convergence, Vectors, and the Journal of Visual Culture. Melanie’s projects include:

  • “Creative Micro-computing in Australia, 1976-1992”. Watch the film here.
  • Australasian Digital Heritage, which gathers together several local digital heritage research projects. Follow us on Facebook & Twitter @ourdigiheritage
  • “Play It Again: Creating a Playable History of Australasian Digital Games, for Industry, Community and Research Purposes”, ARC Linkage, 2012-14. Follow us on Facebook & Twitter @AgainPlay, and visit the Popular Memory Archive.

 

Angela Ndalianis is Professor in Screen Studies at Melbourne University, and the Director of the Transformative Technologies Research Unit (Faculty of Arts). Her research interests include: genre studies, with expertise in the horror and science fiction genres; entertainment media and media histories; the contemporary entertainment industry. Her publications include Neo-Baroque Aesthetics and Contemporary Entertainment (MIT Press 2004), Science Fiction Experiences (New Academia 2010), The Horror Sensorium; Media and the Senses (McFarland 2012) and The Contemporary Comic Book Superhero (editor, Routledge 2008). She is currently completing two books: Batman: Myth and Superhero; and Robots and Entertainment Culture. She is also a Fellow of the Futures of Entertainment Network (U.S.), and is the Hans Christian Andersen Academy’s Visiting Professor (2015-17), a position also affiliated with the University of Southern Denmark.

Defining the Experience: George Poonkhin Khut’s DISTILLERY: WAVEFORMING, 2012 – Amanda Pagliarino & George Poonkhin Khut

 

Abstract: George Poonkhin Khut’s sensory artwork, Distillery: Waveforming 2012, was the winner of the 2012 National New Media Art Award. This immersive installation artwork is a biofeedback-controlled interactive work that utilises the prototype iPad application ‘BrightHearts’. Khut has an interest in the continued development of the ‘BrightHearts’ app to the point of making it available as a download from the iTunes App Store, to be used in conjunction with specialised pulse-sensing hardware. The configuration of Distillery: Waveforming presented in 2012 at the Gallery of Modern Art, Brisbane, incorporated Apple iPad (3rd generation) devices running the ‘BrightHearts’ app, supported by Mac mini computers that processed data and mapped sound and visuals that were fed back to users as animations on the iPads. At the conclusion of the exhibition the artwork was acquired into the Queensland Art Gallery collection. The Curator of Contemporary Australian Art requested that the acquisition ensure that the artwork was captured in perpetuity in its prototype state. The iPad devices underwent jailbreaks to safeguard their independent operation and management, and to allow for the permanent installation of non-expiring copies of the ‘BrightHearts’ app. Source code for the ‘BrightHearts’ app was also archived into the collection. This paper describes the development of the artwork and the issues that were addressed in the acquisition and archiving of an iPad artwork.

 

Figure 1. George Poonkhin Khut, Australia b.1969, Distillery: Waveforming 2012, Custom software and custom heart rate monitor on iPad and Mac mini signal analysis software: Angelo Fraietta and Tuan M Vu; visual effects software: Jason McDermott, Greg Turner; electronics and design: Frank Maguire; video portraits: Julia Charles, Installed dimensions variable, The National New Media Art Award 2012. Purchased 2012 with funds from the Queensland Government. Image: Mark Sherwood


George Poonkhin Khut’s digital artwork Distillery: Waveforming is a body-focused, biofeedback-controlled interactive experience. The artwork was acquired by the Queensland Art Gallery / Gallery of Modern Art (QAGOMA) in 2012 and has been the subject of an ongoing dialogue between the artist and the Gallery, through the Head of Conservation and Registration, regarding its long-term preservation. At the heart of the artwork is an individual, human experience, with certain intrinsic elements combining to create this experience. In their endeavour to provide a sound future plan for Distillery: Waveforming, the artist and the collecting institution have each questioned ‘the experience’ from their individual perspectives.

Distillery: Waveforming is both an independent artwork and an affiliated outcome of Khut’s long running work with heart rate biofeedback. This unusual duality plays a significant role in the ways in which the artist and the institution perceive the artwork, its preservation and future installations. Since the artwork’s acquisition into the QAGOMA collection the artist has remained involved and interested in the Gallery’s management of Distillery: Waveforming. Khut’s progress in his work on the biofeedback project has seen him make significant advances in software development, allowing him to release the iTunes application BrightHearts that was in-development at the time that Distillery: Waveforming was created. These advances in the biofeedback project provide current context to the dialogue and continue to shape the opinions of both artist and institution. Through this collaborative process QAGOMA has been able to build an extensive resource for the long-term preservation of Distillery: Waveforming.

HISTORY AND BACKGROUND: BIOFEEDBACK IN ART AND MEDICINE

George Poonkhin Khut’s biofeedback artwork Distillery: Waveforming was the winning entry in the 2012 National New Media Award (NNMA) held at the Gallery of Modern Art (QAGOMA 2012). The artwork entered the Queensland Art Gallery / Gallery of Modern Art collection at the conclusion of the exhibition. The Curator of Contemporary Australian Art requested that the artwork be acquired to accurately reflect its display in the NNMA exhibition – that is, as a prototype.

In 2011, when Khut was invited to enter the NNMA, he was working as the Artist in Residence at the Children’s Hospital Westmead. In this residency Khut and his research colleagues commenced the BrightHearts Project, which aimed ‘to assess the potential of small, portable biofeedback-based interactive artworks to mediate the perception and performance of the body in paediatric care: as experienced by children undergoing painful recurrent procedures’ (Khut et al. 2011).

Apple iPads loaded with games were already in use for diversion and distraction purposes during painful procedures at the Children’s Hospital Westmead. Khut chose to adapt his work for iPad technology for the BrightHearts Project based on this ‘diversional’ precedent and the excellent optical qualities of the iPad display (Khut 2014). In realising Distillery: Waveforming Khut channelled years of artistic practice in biofeedback and body-focused interactivity in the development of a cross-disciplinary artwork at the core of which was the prototype BrightHearts application (app) for Apple iPad.

When Distillery: Waveforming was displayed in the NNMA exhibition, from August to November 2012, the BrightHearts app was still in development under a short-term Apple Developer licence. At this point in the provisioning, the prototype app generated the visuals on the iPad in response to a multilayered array of messages transmitted from a laptop or desktop computer over a network connection. This approach enabled Khut to quickly prototype a variety of visualisation ideas by adjusting parameters on the desktop computer, without needing to compile and install the app on to the iPad each time. More importantly, at the time of its development this networked approach also enabled him to incorporate live heart rate sensor data in a way that was not then supported by the Apple operating system (iOS, before the introduction of the Bluetooth 4.0 wireless standard), and to continue his work with the complex signal analysis, mapping and sonification algorithms that have been central to his work with body-focussed interactions since 2003. Essentially, Distillery: Waveforming and the trial therapeutic devices at the Children’s Hospital Westmead were operating as ensembles that included iPads loaded with the prototype BrightHearts app, data collection devices, desktop/laptop computers and network routing systems.
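As a rough illustration of this networked control approach, the sketch below (Python) shows a desktop process sending per-layer drawing parameters to the iPad over the local network. The message format, address and port are invented for this example and are not the artwork’s actual protocol.

```python
# Rough sketch of networked layer control (illustrative only: the message
# format, address and port are invented; the artwork's real protocol differs).
import json
import socket

IPAD_ADDRESS = ("192.168.1.50", 9000)   # hypothetical iPad address and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_layer_update(layer, diameter, hue, alpha):
    """Send one layer-control message from the desktop patch to the iPad app."""
    message = {"layer": layer, "diameter": diameter, "hue": hue, "alpha": alpha}
    sock.sendto(json.dumps(message).encode("utf-8"), IPAD_ADDRESS)

# Adjusting parameters on the desktop and resending them lets visual ideas be
# prototyped without recompiling and reinstalling the app on the iPad.
send_layer_update(layer=15, diameter=240, hue=0.6, alpha=0.8)
```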

DISTILLERY: WAVEFORMING

Acquiring Distillery: Waveforming to reflect its status as a prototype was a curatorial imperative. Khut describes his approach to the long-running biofeedback project as ‘iterative’ and in this regard the artwork is an incremental representation of Khut’s artistic practice and a model demonstration of the developmental BrightHearts app for touch screen devices (Khut and Muller 2005). In the future Distillery: Waveforming will become a legacy artwork intrinsically linked to past and future iterations in the biofeedback project.

Distillery: Waveforming derives from Khut’s earlier work on BrightHearts, which commenced in 2011, and his Cardiomorphologies series from 2004-2007. The mandala-like visuals were initially developed for Cardiomorphologies v.1 by John Tonkin using Java, which Khut controlled via Cycling ’74’s Max (version 4.5) application, a popular visual programming language for Apple and Windows computers. In 2005 the original visualisation software was expanded upon by Greg Turner for Cardiomorphologies v.2, using visuals generated from within the Max application. Turner used the C++ programming language to develop ‘Fireball’, a specialised graphic module (known in the Max programming environment as an ‘object’) that enabled Khut to control the visuals with messages to each layer – for example, drawing a red ring the width of the screen with a thickness of 20 pixels, or a green circle with a gradient and a diameter of 120 pixels (Khut 2014; Pagliarino 2015, pp. 68-69).

Then in 2011 Jason McDermott, a multi-disciplinary designer working in the area of information visualisation and architecture, was engaged to re-write Greg Turner’s ‘Fireball’ visualisation software to enable it to run on hand-held technologies with touch-sensitive controls. Using the open source C++ library openFrameworks with Apple’s Xcode (version 4), McDermott redesigned and expanded the potential of the software, developing BrightHearts as an iOS 5 mobile operating system application (McDermott 2013).

Development of the app continued when Khut received the NNMA prize worth AUS$75,000 and in April 2014 the BrightHearts app was released into the iTunes store. Heart rate data acquisition and processing is now integrated into the application software and the only external device that is required in conjunction with the app is a Bluetooth 4.0 heart rate monitor that captures the real-time heart rate data. The app is categorised as a Health and Fitness product that can be used to assist with relaxation and body awareness (iTunes 2014).

This history of development, change, modification and repurposing creates a landscape in which Distillery: Waveforming is an important new media artwork. As a legacy artwork, the Gallery aims to maintain the component parts and software in their original form and function for as long as possible. Technologies change at such a rapid rate that the artwork will date in the years to come, coming quite evidently to reflect an artwork of 2012. Perhaps future users will consider what are at present beautifully rich and transcendent animations to be rudimentary, and the touch screen navigation amusing and unsophisticated. Perhaps future users will recapture the sense of appeal that early touch screen devices inspired in consumers. However, it is not the intention of the Gallery to create a sense of nostalgia but to offer insights into the balance between art, technology and science at this fixed point in time. As a legacy the artwork will be an authentic installation and will offer an unambiguous window into Khut’s interdisciplinary artistic practice.

In its presentation in the NNMA exhibition, Distillery: Waveforming comprised five iPad devices running the prototype BrightHearts app, built into a long, shallow, tilted table at which participants sat on low stools to interact with the artwork. Specifications set by the artist allow the Gallery to modify the configuration for smaller displays of no fewer than three stations in future installations. However, it is necessary that the ambience of the installation space evoke a sense of calm and contemplation through low light levels, soft dark colours and discreet use of technology. In the original installation the spatial arrangement situated participants in front of three video portraits of the artwork in use (Fig 1). Distillery: Waveforming is a composite artwork incorporating the iPad devices loaded with the prototype BrightHearts app, external data collection and processing equipment, and video portraits displayed on monitors. The combined hardware and software systems include:

  • Five Apple iPads (3rd generation) operating on the iOS 5.1.1 operating system, with high-resolution Retina display (2,048 x 1,536 pixels at 264 ppi) and dual-core Apple A5X chip

Loaded with:

  • BrightHearts app (in-development)
  • Cydia – a software application that enables the user to search for and install applications on jailbroken iOS devices
  • Activator app – a jailbreak application launcher for mobile devices
  • IncarcerApp – an application that disables the home button and effectively locks on the BrightHearts app when in use, preventing the user from inadvertently exiting the app
  • Five heart rate sensors incorporating Nonin PureSat pulse oximeter (ear clip type) sensors, Nonin OEM III pulse oximetry circuits and Arduino Pro Mini 328 microcontrollers, running specially written code (OemPulseFrank.pde) to receive the pulse data from the pulse oximeter sensors and relay it to the Mac minis via a USB-serial connection.
  • Five Mac minis 5.2, 2.5 GHz dual-core Intel Core i5 processor, 4GB RAM, 10.7.5 (OSX Lion) operating system

Running:

  • Max6 application (Cycling74, 2012)
  • Custom-written scripts running from the OSX ‘Terminal’ utility that receive pulse data from the sensors via the USB port and pass it along to Max6
  • One 5.0 GHz network router that transmits control data from the Max6 software on the Mac minis to the corresponding iPads.
  • Three digital portraits displayed on 40” LCD / LED monitors hung in portrait orientation
  • Video portrait files include MPEG-4, QuickTime ProRes and AVC file formats
  • Five headband-style stereo headsets

The prototype BrightHearts app for Distillery: Waveforming was written for Apple iPad (3rd gen) models running iOS 5.1.1. It was written under a short-term Apple Developer licence that allowed for provisioning and testing of the app on multiple devices. The licensing arrangement for the app expired in July 2013, nine months after the artwork was acquired into the collection. A key aspect of archiving this artwork was the need to gain control of the app, and in advance of the expiration the Gallery implemented an archiving strategy that was developed through consultation between the conservator, the curator and the artist, who was in contact with the software developer.

The most challenging aspect of the acquisition for the collecting institution was the long-term management of the proprietary technology and software. At the time of acquisition the prototype BrightHearts app was capable of performing a function with external support but did not have status as an independent Apple-approved application; in fact, its completion and approval were still one and a half years away. It was also important for the iPad operating system to be locked down to iOS 5.1.1, as the prototype BrightHearts app for Distillery: Waveforming will only launch in this version. Through a consultative process it was agreed that, to administer the artwork as an authentic prototype, it was necessary to increase end-user control of the technology and software. This was achieved by jailbreaking the iPad devices and loading a non-expiring copy of the prototype BrightHearts app onto the iPads (Pagliarino 2015).

MAINTAINING AN AUTHENTIC EXPERIENCE

Distillery: Waveforming has been acquired with the intention of maintaining authenticity and as such the Gallery has archived a full complement of digital files for the artwork. Included in this is source code for compiling the BrightHearts application with Xcode and source code for compiling the pulse-sensing Arduino microcontrollers for which Khut owns both copyrights (Khut 2012).

In conventional object-oriented programming, source code (a programming sequence in readable text) outlines the steps that are necessary to compile software and make it function as intended, for example as an app for an iPad. The source code has to be interpreted or compiled in order to create the necessary machine code, for example with Xcode if the work is developed for Apple OSX. Acquiring source code is thought to be a means of future-proofing digital artworks (Collin and Perrin 2013, p. 52). This is undeniable, as without the source code there is very little that can be used as a structural guide. However, Laforet et al. (2010, p. 27) question whether source code can really act as a safety net for software artworks in an age where there is a strong commercial imperative driving the development of digital technologies at the expense of the conservation of data. The success of source code in future-proofing artworks relies on accurate interpretation and, in the context of an authentic experience, a complete lack of bias towards alternative or more efficient ways of programming the software to run the artwork as it was intended.

In cases where an artwork was developed using a suite of applications and programming languages, documenting source code becomes a complicated task in comparison to artworks where the source code is contained entirely within a single object-oriented programming environment. The programming for Distillery: Waveforming is distributed across three operating systems and four programming languages: Arduino for the sensor hardware; Objective-C and iOS 5 (via Xcode 4) for the BrightHearts app; OSX for the desktop computer that operates as a terminal emulator, running sensor data routing and analysis processes; and, most significantly, Max, the visual programming application that is used to perform the core analysis, mapping and sonification processes between the incoming heart rate data and the outgoing messages controlling the appearance of the various layers of the iPad visuals and sounds. Laforet et al. confirm that the difficulties faced with software artworks created by individual programmers are that:

These projects are relatively small efforts, putting the work created with it in a very fragile position. Unlike more popular software and languages, they are not backed up by an industry or a community demanding stability. The software only works under very specific conditions at a very specific time. Migrating such a work is a tremendous task, likely to involve the porting of a jungle of obscure libraries and frameworks. (Laforet et al. 2010, p. 29)

The complexity of combining multiple source codes from various programming platforms to work within one artwork significantly increases the risk of error in interpretation. In the case of Distillery: Waveforming it seems highly unlikely that source code alone would be sufficient to recreate the artwork in the future. Khut has recognised this and has considered alternative bespoke and existing documentation systems for both Distillery: Waveforming and BrightHearts for the purpose of preservation and representation.

Visually, the prototype BrightHearts app consists of 22 individually controlled graphic layers. Each layer comprises a single polygon that can be drawn as a solid shape or a ring; the edges of the shape can be blurred and the colour can be varied according to hue, saturation, alpha and value (brightness). The layers are then blended using an ‘additive’ compositing process, so that the layers interact with one another; for example, a combination of overlapping red, green and blue shapes would produce white. This additive blending is a crucial aspect of the work’s visual aesthetic.
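To make the additive compositing concrete, here is a minimal sketch (Python); the colour values are arbitrary illustrations, not taken from the artwork’s code. Each layer’s colour channels are summed and clipped, so overlapping red, green and blue shapes blend towards white.

```python
# Minimal sketch of 'additive' layer compositing (illustrative only).
def composite_additive(layers):
    """layers: list of (r, g, b, alpha) tuples, each channel in the range 0.0-1.0."""
    out = [0.0, 0.0, 0.0]
    for r, g, b, alpha in layers:
        out[0] += r * alpha          # each layer adds light to the result,
        out[1] += g * alpha          # rather than covering what lies beneath it
        out[2] += b * alpha
    return tuple(min(1.0, c) for c in out)   # clip to the displayable range

# Overlapping red, green and blue shapes at full opacity produce white.
print(composite_additive([(1, 0, 0, 1.0), (0, 1, 0, 1.0), (0, 0, 1, 1.0)]))
# -> (1.0, 1.0, 1.0)
```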

While the visuals are rendered on the iPad by the app developed by Jason McDermott, using Xcode and the openFrameworks libraries, the actual moment-by-moment instructions regarding what shapes are to be drawn (colour, size, brightness, etc.) are all sent from the Max document.

The Max document, the top level ‘patch’ as it is referred to in the Max programming environment, is the heart of the work: the primary mediating layer between the sensor and display hardware that determines how changes in heart rate will control the appearance and sound of the work. It consists of an input section that receives sensor data, an analysis section that generates statistics from the heart rate measurements, and mapping layers that map these statistics to the various audio and graphic variables of colour, shape, volume etc.

The modular design methods used in the Max programming environment allow for the creation of modular units of code, referred to as ‘abstractions’ and ‘bpatchers’, that can be re-used in multiple instances to process many variables using a very simple set of instructions. The programming for Distillery: Waveforming makes extensive use of these modules, which are stored as discrete ‘.maxpat’ files within the Max folder on the Mac mini computer. These modules are used for many of the repetitive statistical processes used to analyse the participant’s heart rate, as well as the mappings used to create the highly layered visuals and sounds that are central to the aesthetic of Distillery: Waveforming.

In the analysis section of the programming, changes in average heart rate are calculated over different time frames: the average rate of the last four heart beats, the average rate of the last sixteen heart beats, then thirty-two heart beats and so on, as well as information about the direction of these changes, enabling the work to track when the participant’s heart rate is starting to increase or decrease.
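A minimal sketch of this kind of multi-window analysis is given below (Python; the class and variable names are ours, not the patch’s): inter-beat intervals are converted to an instantaneous rate, and averages over the last 4, 16 and 32 beats are reported together with the direction in which each average is moving.

```python
# Minimal sketch of multi-window heart rate analysis (illustrative only).
from collections import deque

class HeartRateAnalyser:
    def __init__(self, windows=(4, 16, 32)):
        self.windows = windows
        self.rates = deque(maxlen=max(windows))   # per-beat rates, beats per minute
        self.previous = {}

    def on_beat(self, interbeat_interval_s):
        """Call once per detected heartbeat with the inter-beat interval in seconds."""
        self.rates.append(60.0 / interbeat_interval_s)
        stats = {}
        for n in self.windows:
            recent = list(self.rates)[-n:]
            average = sum(recent) / len(recent)
            trend = average - self.previous.get(n, average)   # >0 rising, <0 falling
            self.previous[n] = average
            stats[n] = (average, trend)
        return stats   # e.g. {4: (71.8, -0.2), 16: (...), 32: (...)}
```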

Within Max, the twenty-two graphic layers of visuals used in the prototype BrightHearts app are each controlled by a corresponding ‘bpatcher’ layer-control module. Each of these ‘bpatcher’ modules contains 107 variables that determine how the parameters of all the layers are controlled: that is, which aspect of the participant’s heart rate patterning the layer responds to, and how these changes are mapped to variables such as the diameter and colour of the layer in question.

Each layer-control module comprises sixteen sub-modules responsible for specific aspects of each layer’s appearance, such as diameter, hue, saturation, and shape-type. In the programming of the layer-control modules, the boxes of numbers visible in each module describe how incoming data relating to heart rate is mapped to the behaviour of the layer, in this case its diameter, and what statistical information it will respond to, such as a running average of the last thirty-two heart beats, a normalised and interpolated waveform representing breath-related variations in heart rate, or the pulse of each heartbeat (Fig 2).

Figure 2: Four of the twenty-two layer-control mapping modules in Max – used to control the shapes drawn on the iPad by the BrightHearts (prototype) app.

All of these variables controlling the appearance of each layer are stored and recalled using a table of preset values describing which statistics each layer and variable responds to and how it interprets this input. These numbers are adjusted by the artist to produce the desired mapping and dynamic range, and are then stored in the .json file and recalled as presets. The information contained in this table is stored as a ‘preset file’ in .json format that is read when the Max document is launched. These preset files document the precise mapping and scaling settings that determine the appearance and behaviour of each layer of the artwork. Together these layered behaviours and the preset values that describe them produce the final interactive visual aesthetic of the artwork.

Figure 3: Example of one section of the .json ‘preset’ file containing preset data that is read by each of the graphics mapping modules – in this example showing all the parameters used to control the behaviour of the diameter  for Layer 15.
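As a rough illustration of how preset values of this kind might be recalled and applied, the sketch below (Python) loads a simplified preset and scales an incoming heart rate statistic to a layer’s diameter. The field names and value ranges are assumptions for this example, not the artwork’s actual schema.

```python
# Illustrative sketch of recalling mapping presets from a .json file
# (field names and ranges are assumed for this example, not the real schema).
import json

def apply_mapping(value, m):
    """Scale an incoming statistic (e.g. a 32-beat average) to a layer variable."""
    span = m["in_max"] - m["in_min"]
    t = (value - m["in_min"]) / span if span else 0.0
    t = max(0.0, min(1.0, t))                      # clamp to the mapped range
    return m["out_min"] + t * (m["out_max"] - m["out_min"])

preset_json = """
{ "layer_15": { "diameter": { "source": "/IBI/how-slow/32",
                              "in_min": 50, "in_max": 90,
                              "out_min": 40, "out_max": 640 } } }
"""
presets = json.loads(preset_json)                  # read once, when the patch loads
mapping = presets["layer_15"]["diameter"]
print(apply_mapping(72, mapping))                  # diameter driven by average heart rate
```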

 

For the artist, these preset tables are of central importance for documenting the appearance and interactive behaviour of the artwork for future interpretations, since it is these values that determine how the work responds to changes in the participant’s heart rate.

Strategies for hardware independent migration and reinterpretation

Khut has begun the process of documenting and describing the interactive principles and behaviour of the artwork independent from current technologies to enable the work to be recreated in the future, based on the Variable Media Network approaches set out by Ippolito (2003a).

For creators working in ephemeral formats who want posterity to experience their work more directly than through second-hand documentation or anecdote, the variable media paradigm encourages creators to define their work independently from medium so that the work can be translated once its current medium is obsolete.
This requires creators to envision acceptable forms their work might take in new mediums, and to pass on guidelines for recasting work in a new form once the original has expired.

Variable Media Network, Definition – Ippolito, 2003b

For Khut, the essence of the artwork that would need to be preserved and recreated, independent of the specific technologies currently used, is the experience of having one’s breathing, nervous system, pulse and heart rate patterning represented in real time in an interactive audio visual experience, and the various optical and kinaesthetic sensations and correlations that are experienced during this interaction.

Taking an experience-centred approach, it is not the source code so much as the experience of the visuals and sounds changing in response to the live heart rate data that is most essential to recreating the artwork. The aesthetic experience of interacting with the artwork, and the manner in which it responds to changes in heart rate initiated through slow breathing and relaxation, is crucial to its authenticity.

The schematic approach: an open ‘score’ for reinterpretation

The simplest approach to documentation for future reinterpretation is the use of a very flexible set of instructions outlining the core interactive form and behaviour of the artwork. This approach leaves many aspects of the artwork’s appearance open to interpretation. Essentially what is preserved is the basic nature of the transformation – from breath, pulse and nervous system to colour, diameter, shape and sound. Such an approach would comprise the following instructions:

The visuals and sounds have been designed to respond to two forms of interaction:
1) gradual decreases in heart rate caused by a general increase in the participant’s ‘parasympathetic’ nervous system activity, which can be initiated through conscious relaxation of muscles in the face, neck, shoulders and arms,

2) breath-related variations in heart rate known as ‘respiratory sinus arrhythmia’ whereby slow inhalation causes an increase in heart rate, and slow exhalation causes a decrease in heart rate.

The result is a wave-like (sine) oscillation in heart rate, to which the work owes its name (wave forming).

 

Each row below pairs a feature extracted from the participant’s pulse and heart rate with the name of its modulation source (controlling the sounds/visuals), its visual representation on the tablet surface, and its sonic representation as heard through the headphones.

  • Pulsing heart beat
    Modulation source: /beat/bang
    Visual representation: Gently throbbing circular shapes that either contract subtly with each pulse, or darken slightly with each pulse, creating a visual effect of subtle pulsing.
    Sonic representation: A deep and soft throbbing noise that gets louder and brighter as heart rate increases, and softer as heart rate decreases.

  • Breath-related variations in heart rate, normalised and rescaled to emphasise slow, wave-like oscillations in heart rate that can be induced through recurrent slow breathing at around 6 breaths per minute
    Modulation source: /IBI/dev-mean/4/normalised
    Visual representation: Ring-shaped layers that expand when heart rate is increasing, and contract when heart rate is decreasing.
    Sonic representation: Synthesized drone sound, modulated with a ‘phasor’ effect controlled by breath-related changes in heart rate.

  • Gradual changes in average heart rate (average of the last 32 beats), mediated by changes in the autonomic nervous system (stress/relaxation), relaxation of the neck, shoulder and arm muscles, etc.
    Modulation source: /IBI/how-slow/32
    Visual representation: Colour of the background gradient: red for the fastest heart rates recorded since the start of the session, green for medium, and blue for the slowest average heart rate recorded since the start of the session.
    Sonic representation: Pitch of the synthesized drone sound, crossfading through overlapping notes in the C melodic minor scale from B6 to A2.

  • Threshold points triggered by decreases in heart rate (/IBI/how-slow/32), producing musical notes and bursts of colour
    Visual representation: Circular, expanding bursts of colour from the centre, fading out when they reach the edge of the frame.
    Sonic representation: Highly reverberant electric piano sounds triggered when a threshold is crossed, synchronised with the burst of colour. Pitch descends in the C melodic minor scale according to the decrease in heart rate.

  • When participants sustain a slow, relaxed breath pattern at around 6 breaths per minute, frequency-domain analysis of heart rate variability will report the appearance of a ‘resonant peak’ around 0.1 Hz (6 breaths per minute). There are six thresholds: 25, 30, 35, 40, 45, 50. Each time one of these thresholds is crossed, a message is generated that is used to control an audio and visual event.
    Modulation source: /spectrum/resonant-peak-resonance
    Visual representation: A large, soft-edged blue ring expands slowly out beyond the edges of the frame and then slowly fades away. Colours associated with the thresholds: threshold 25 = yellow; 30 = yellow-green; 35 = green-yellow; 40 = green; 45 = cyan; 50 = indigo.
    Sonic representation: A very soft, muted and heavily reverberated piano note, with slow decay. Notes associated with the thresholds: threshold 25 = D#3; 30 = A#3; 35 = D#4; 40 = F4; 45 = G4; 50 = A#4.

Table 1: Relationship of the key mappings in the Distillery: Waveforming heart rate controlled artwork. Table 1 lists the key heart rate variables and their mapping to the main visual and sonic representations. The most basic recreation of the work according to the scheme laid out in this table would still require instructions for obtaining and generating the modulation sources from the heart rate data: the algorithms that scale and interpolate the heart rate data and translate these beat-by-beat messages into smooth, continuous control signals.
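That last step can be illustrated with a small sketch (Python; the smoothing factor and names are ours, chosen only for illustration): discrete per-beat values are eased towards a continuously updated control signal that the visuals and sounds can follow without jumping.

```python
# Illustrative sketch: easing discrete beat-by-beat values into a smooth,
# continuous control signal (the smoothing factor is chosen arbitrarily).
class SmoothedControl:
    def __init__(self, smoothing=0.1):
        self.smoothing = smoothing    # 0 = frozen, 1 = follow the input instantly
        self.value = None

    def update(self, target):
        """Call at the display frame rate with the latest per-beat statistic."""
        if self.value is None:
            self.value = target
        else:
            self.value += self.smoothing * (target - self.value)
        return self.value

control = SmoothedControl(smoothing=0.1)
for beat_rate in [72, 72, 71, 69, 68, 68]:        # per-beat heart rate readings
    print(round(control.update(beat_rate), 2))    # eased value sent to the visuals
```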

The translation approach: calibration tools and resources

A second, more precise approach for reinterpretation provides a set of documents to help future developers interpret and translate the original code and .json preset data, to provide an aesthetic experience more closely aligned with the artwork at the time it was acquired by QAGOMA (Fig 3). This information is contained in a set of calibration images and accompanying tables that provide a crucial link for reinterpretations of the artwork, allowing future programmers to determine how values stored in the original preset files relate to the appearance of each of the work’s 22 graphic layers. Many aspects of the prototype BrightHearts app’s interpretation of these messages are not linear in their response, and it can be seen that the gradients for each shape blend differently according to hue (Fig 4). It is hoped that these calibration images will help future programmers to compare how their own code interprets the messages stored in the preset files against the behaviour and appearance of the original prototype BrightHearts iPad app.

Figure 4: Example of one of the calibration images and accompanying tables describing how the messages from the Max software are interpreted by the visualisation software of the BrightHearts (prototype) App on the iPad.


 

Summary of documentation strategy for future translation

Documentation element: Broad schematic mapping of real-time heart rate statistics to sounds and visuals
Description: Describes the basic interaction concept and interaction experience: images and sounds controlled by slow changes in heart rate that can be influenced through slow breathing and relaxation/excitement.

Documentation element: Experiential aims and conditions for interaction
Description: Describes the environmental conditions prescribed by the artist to ensure optimum conditions for interaction, i.e. minimising audio-visual distractions.

Documentation element: Documentation of the Max patch
Description: Annotations in each section of the Max code: the subsections (‘subpatches’, ‘abstractions’ and ‘bpatchers’) of the main file describe the flow of information through each section. Each section is documented as a numbered image file, accompanied by notes describing how information is being modified/transformed. The sections cover heart rate analysis; sounds; and visuals – miscellaneous top-level controls, i.e. managing the storage and retrieval of preset data, the transition to ‘live’ visuals, and overall size, hue, position, etc.

Documentation element: Annotated table of preset values describing the mapping of heart rate information to the behaviour of the visuals, extracted from the .json presets
Description: Describes how each layer responds to the various heart rate statistics, and the quality of the response over time (i.e. ‘easing’, non-linear scaling, etc.).

Documentation element: iPad visuals – annotated calibration images and tables
Description: Indicates how the visuals should look given specific layer-control messages (i.e. diameter, hue, alpha, etc.), describing the idiosyncrasies of the visualisation code.

Table 2: George Khut’s documentation strategy for Distillery: Waveforming.

 

Conclusions

For Khut, Distillery: Waveforming is foremost an experiential artwork and therefore his ideas about documentation focus on capturing its functionality and the aesthetics of the interaction. Khut sees the fundamental element of Distillery: Waveforming to be something other than the source code and the technical hardware: namely the mappings between breath and relaxation-mediated changes in heart rate and the appearance of the sounds and visuals, and how these mappings give form to the subject’s experience of interactions between their breath, heart rate and autonomic nervous system.

The modular Max patch programming and the presets in the .json file form, for the artist, the compositional heart of Distillery: Waveforming. This programming draws the visuals in response to the real-time heart rate data: the key to the artwork. By further documenting the interactive principles independently from the current technology, drawing on approaches proposed in the Variable Media Questionnaire, Khut has developed reference documents that allow for the translation of the original preset data and calibration for future interpretations of the visualisation software. In this way Khut can describe the artwork with greater clarity in a non-vernacular form, opening up opportunities for the artwork to be recreated in alternative modes.

As an artwork in the QAGOMA collection, Distillery: Waveforming sets a precedent as the first prototype-artwork to be acquired. Technology-based digital artworks are prone to being superseded at a rapid pace and attempting to manage even the medium-term future for such artworks is perplexing. To gain the assistance of the artist at the time of acquisition is constructive and very beneficial, but to secure the commitment of the artist to engage in collaborative, long-term conservation strategies is extraordinary and this has resulted in the Gallery acquiring an unparalleled archival resource (Pagliarino 2015, p. 74). Although the Gallery maintains an interest and intention to preserve Distillery: Waveforming in its original developmental state, providing clear evidence of Khut’s ‘iterative’, evolving art practice, the archival resource provides scope to reinterpret the artwork at some point in the future when the original technology no longer functions as intended.

Through this process of defining the experience, the artist and the institution have collaboratively addressed their common and divergent interests in the future care of Distillery: Waveforming. These differing views have created an opportunity to better understand the artwork and its position as an asset within a state collection and a physical, historical link to an ongoing, evolving artistic practice. Khut’s continued interest in the preservation of Distillery: Waveforming and his participation in dialogues about this artwork and other iterations of the biofeedback project have provided the Gallery with an extraordinary reference and flexibility to manage and display the artwork long into the future.

 

 

Works Cited

Collin, JD and Perrin, V 2013, ‘Collecting Digital Art: Reflections and Paradoxes – Ten years’ experience at the Espace multimedia gantner’, in Serexhe, Bernhard (Ed.), Digital Art Conservation – Preservation of digital art theory and practice, Germany, ZKM Centre for Art and Media Karlsruhe.

Cycling74, 2012, Max (version 6.0.8) computer software, Walnut, California, Accessed 22 September 2014.

Ippolito, Jon 2003a, ‘Accommodating the Unpredictable’ in The Variable Media Approach: Permanence through change, Guggenheim Museum Publications, New York, and The Daniel Langlois Foundation for Art, Science & Technology, Montreal, pp. 46-53, accessed 22 September 2014.

Ippolito, Jon 2003b, ‘Variable Media Network, Definition’, accessed 22 September 2014.

iTunes 2014, ‘BrightHearts by Sensorium Health’ viewed 29 August 2014

Khut, George Poonkhin 2014, ‘On Distillery: Waveforming (2012)’ Born Digital and Cultural Heritage conference, Melbourne, Australia, 19-20 June 2014, viewed 29 August 2014

Khut, George Poonkhin 2014, personal communication, interview 5th September 2014.

Khut, George Poonkhin 2012, Distillery: Waveforming 2012 user’s manual (draft), in the possession of the Queensland Art Gallery, Brisbane.

Khut, George Poonkhin, Morrow, A & Watanabe, MY 2011, ‘The BrightHearts Project: A new approach to the management of procedure-related paediatric anxiety’, Proceedings of OzCHI 2011: The Body In Design. Design, Culture & Interaction, The Australasian Computer Human Interaction conference, Canberra, Australia, 28-29 November 2011, pp. 17-21.

Khut, George Poonkhin & Muller, L 2005, ‘Evolving creative practice: a reflection on working with audience experience in Cardiomorphologies’, in Lyndal Jones, Pauline Anastasious, Rhonda Smithies & Karen Trist (eds.), Vital Signs: creative practice and new media now, Australian Centre for the Moving Image, Melbourne, Australia, RMIT Publishing.

Laforet, Anne, Mansoux, Aymeric & de Valk, Marloes 2010, ‘Rock, paper, scissors and floppy disks’, in Annet Dekker (ed.), Archive 2020: Sustainable archiving of born digital cultural content, Virtueel Platform, pp. 25-36, viewed 4 February 2014.

McDermott, J 2013, ‘Bright Hearts (2011)’, jmcd, viewed 4 September 2013, <http://www.jasonmcdermott.net/portfolio/bright-hearts>

Pagliarino, Amanda 2015, ‘Life beyond legacy: George Poonkhin Khut’s Distillery: Waveforming’, AICCM Bulletin, vol. 36, no. 1, pp. 67-75.

Queensland Art Gallery / Gallery of Modern Art, 2012, National New Media Award 2012 – George Poonkhin Khut 2012 NMA winner, QAGOMA, viewed 15 October 2014.

 

Bios

Amanda PAGLIARINO is Head of Conservation at the Queensland Art Gallery / Gallery of Modern Art, Brisbane.  Since 2003 she has worked on the conservation of audiovisual and electronic artworks in the Gallery’s collection. Amanda received a Bachelor of Visual Arts from the Queensland University of Technology in 1991 and a Bachelor of Applied Science, Conservation of Cultural Material from the University of Canberra in 1995.

George Poonkhin Khut is an artist and interaction designer working across the fields of electronic art, interaction design and arts-in-health. He lectures in art and interaction design at UNSW Art & Design (University of New South Wales, Faculty of Art & Design) in Sydney, Australia. Khut’s body-focussed interactive and participatory artworks use bio-sensing technologies to re-frame experiences of embodiment, health and subjectivity. In addition to presenting his works in galleries and museums, George has been developing interactive and participatory art through exhibitions and research projects in hospitals, starting with “The Heart Library Project” at St. Vincent’s Public Hospital in 2009, and more recently with the “BrightHearts” research project – a collaboration with Dr Angie Morrow, Staff Specialist in Brain Injury at The Children’s Hospital at Westmead, Kids Rehab – that is evaluating the efficacy of his interactive artworks as tools for helping to reduce the pain and anxiety experienced by children during painful and anxiety-provoking procedures.

 

Volume 27, 2016

Themed Issue: Born Digital Cultural Heritage

Edited by Angela Ndalianis & Melanie Swalwell

Introduction: Born Digital Heritage – Angela Ndalianis & Melanie Swalwell

  1. It Is What It Is, Not What It Was: Making Born Digital Heritage – Henry Lowood
  2. Defining The Experience: George Poonkhin Khut’s Distillery: Waveforming, 2012 – Amanda Pagliarino & George Poonkhin Khut
  3. There and Back Again: A Case History of Writing The Hobbit – Veronika Megler
  4. Participatory Historians in Digital Cultural Heritage Process: Monumentalization of the First Finnish Commercial Computer Game – Jaakko Suominen and Anna Sivula
  5. Retaining Traces of Composition in Digital Manuscript Collections: a Case for Institutional Proactivity – Millicent Weber

There and Back Again: A Case History of Writing The Hobbit – Veronika M. Megler

Abstract: In 1981, two Melbourne University students were hired part-time to write a text adventure game. The result was The Hobbit (Melbourne House, 1981), a game based on Tolkien’s book (Tolkien) that became one of the most successful text adventure games ever. The Hobbit was innovative in its use of non-deterministic gameplay, a full-sentence parser, the addition of graphics to a text adventure game and, finally, “emergent characters” – characters exhibiting apparent intelligence arising out of simple behaviours and actions – with whom the player had to interact in order to “solve” some of the game’s puzzles. This paper is a case history of developing The Hobbit, and covers the development process, the internal design, and the genesis of the ideas that made The Hobbit unique.

 


Figure 1.  C64/128 The Hobbit (disk version). Melbourne House.

Introduction

This paper is a case history of the development of the text adventure game The Hobbit (Melbourne House, 1981). The game translated Tolkien’s novel of the same name (Tolkien) into a form that could run on the first generation of home computers just beginning to hit the market.

As co-developer of The Hobbit, I offer my recollections of the development process, the internal design, and the genesis of the ideas that made the game unique. Those ideas included the use of non-deterministic gameplay – the game played differently every time and sometimes could not be completed due to key characters being killed early in the game – very different to other games, which had only a single path through the game and responded the same way each time they were played. The Hobbit contained a full-sentence parser that understood a subset of natural language, dubbed Inglish, as compared to the simple “verb noun” constructions accepted by other adventure games of the time. There were graphic renditions of some of the game locations, another groundbreaking addition to a text adventure game. And finally, “emergent characters” – non-player characters exhibiting apparent personalities and intelligence – with whom the player had to interact in order to solve some of the game’s puzzles. In combination, these features led to a game experience that transformed the industry.

Little has been written about the development of the first generation of text-based computer games; this case history provides insight into this developmental period in computer game history. I compare the development environment and the resulting game to the state-of-the-art in text adventure games of the time. Lastly, I discuss the legacy and recent revival of interest in the game.

“Let us not follow where the path may lead.
Let us go instead where there is no path,
And leave a trail.”

– Japanese Proverb

The Tenor of the Times 

It was early 1981. I was a Bachelor of Science student at Melbourne University, majoring in Computer Science (CS) and just starting my last year. These were the early days of Computer Science education, and the curricula required today for undergraduate Computer Science students had not yet been developed. In our classes we were studying topics like sort algorithms, data structures, and operating systems such as BSD Unix. Another class focused on calculating the rounding and truncation errors that occur as a result of a series of digital calculations. We were taught software development using a systems analysis method called HIPO[1] – Hierarchical Input-Process-Output, the best practice in structured programming – and that documenting our code was good practice. Object-oriented programming was still in the future.

During our first couple of years in the CS program, programming projects were written using “mark sense cards”, which we marked up with pencils and fed into card readers after waiting in a long queue of students – sometimes for an hour or two to get a single run. You had to get the program running within a certain number of runs or the card reader would redistribute the lead across the cards, making them illegible.

By the time we reached the last year of the Bachelor’s degree, in our CS classes we were actually allowed to log onto a Unix machine in the lab and work there, if we could get access to a terminal (which often meant waiting for hours, or booking a timeslot, or waiting till late in the evening). We programmed in Pascal, Fortran, Assembler, C (our favorite), and Lisp. Our favorite editor was, universally, Vi. I remember programming a PDP8 in Assembler to run a toy train around a set of tracks, switching the tracks as instructed; we hand-assembled the program, typed it in and debugged it using a hexadecimal keypad.

By this time I’d built my own PC, from a project in an electronics hobbyist magazine. I’d purchased the motherboard, which came as a peg-board with a printed circuit on it, minus any components or cross-wiring. I would go to the electronics parts store with my list of chips, resistors, capacitors and diodes, and solder for my soldering iron. In the store they’d say, “tell your boyfriend we don’t have these” – it was not even considered possible that I might be the person purchasing them. The system had a small number of bytes – around 128 bytes, I believe (that is not a misprint) – of free memory, and used a black and white TV as a monitor. For this system we wrote programs out on paper in a simple Assembler, hand-assembled them and typed them in using a hexadecimal keypad. There was no save function, so whenever the system restarted we had to re-type the program. It was quite impressive to see the programs we could develop in that amount of space.

I was used to being one of around 2-4 women in my university classes, whether it was a smaller class of 30 students or one of the massive Physics classes holding perhaps two or three hundred. Sexism was alive and kicking. The norm for women – for most of the fellow students at my all-girl high school, MacRobertson – was to become secretaries or nurses (although my closest friend for many of those years became a lawyer, traveling to the ‘Stans to negotiate for oil companies, and is now chairman of the board). One fellow student (luckily, I don’t remember who) gave me the ultimate compliment: “you’re bright, for a girl!” In self-defense, I partnered with another woman – Kerryn – for any pair projects. Whenever we had 4-person group projects we joined with another frequent pair, Phil Mitchell and Ray, who were amongst the few men willing to partner with us; these group experiences later led to me recruiting the other three to work at Melbourne House.

My game-playing experience was very limited. There was a Space Invaders arcade game in the lobby of the student union at the university that I sometimes played. For a while there was a game of Pong there, too. The Unix system featured an adventure game we called Adventure – Colossal Cave, also often referred to as Classic Adventure (CRL, 1976). In our last year I played it obsessively for some time, mapping out the “maze of twisty little passages”, until I had made it through the game once. At that point it instantly lost interest for me, and I don’t believe I ever played it again. I was not aware of any other computer games.

State-of-the-art PC games were a very new thing – PCs were a very new thing – and at the time were written in interpreted Basic by hobbyists. Sometimes the games were printed in magazines, taking maybe a page or two at most, and you could type them into any computer that had a Basic interpreter and play them. The code was generally written as a long list of if-then-else statements, and every action and the words to invoke that action were hard-coded. The game-play was pre-determined and static. Even if you purchased the game and loaded it (from the radio-cassette that it was shipped on), you could generally solve the puzzles by reading the code. The rare games that were shipped as compiled Basic could still be solved by dumping memory and reading the messages from the dump.

Getting the Job

I was working early Sunday mornings as a part-time computer operator, but wanted a job with more flexibility. On a notice board I found a small advertisement looking for students to do some programming, and called. I met Alfred (Fred) Milgrom, who had recently started a company he called “Melbourne House”, and he hired me on the spot to write a game for him. Fred was a bit of a visionary in thinking that students with a Computer Science background could perhaps do a better job than the self-taught hobbyists who represented the general state of the art.

Fred’s specifications to me were: “Write the best adventure game ever.” Period.

I told Phil Mitchell about the job, as I thought he had the right skills. I brought him along to talk to Fred, who hired him to work on the game with me. Kerryn and Ray joined us later that year to write short games in Basic for publication in the books that Melbourne House was publishing. These books featured a series of games, most of them about a page or two in length. The books were often sold along with a radio-cassette from which you could load the game rather than having to type it in yourself. Ray only stayed briefly, but Kerryn, I think, stayed for most of the year, and wrote many games. She’d sit at the keyboard and chuckle as she developed a new idea or played a game she’d just written.

Software Design, Cro-Magnon Style

So, what would “the best adventure game ever” look like? I started with the only adventure game I’d ever played: Classic Adventure. What did I not like about it? Well, once I’d figured out the map and solved the puzzles, I was instantly bored. It played the same way every time. Each Non-Player Character (NPC) was tied to a single location, and always did the same thing. Lastly, you had to figure out exactly the incantation the game expected; if the game expected “kill troll”, then any other command – “attack the troll”, for example – would get an error message. You could spend a long time trying to figure out what command the game developer intended you to issue; as a result, most adventure games tended to have the same actions, paired with the same vocabulary.

Phil and I split the game cleanly down the middle, with clearly defined interfaces between the two halves. I took what today we would call the game engine, physics engine and data structures (although those terms did not exist then). Phil took the interface and language portion. I don’t remember who had the original idea of a much more developed language than the standard “kill troll” style of language used by other text adventures of the time; my thinking stopped at the level of having synonyms available for the commands. I had almost no involvement in the parser; I remember overhearing conversations between Fred and Phil as the complexity of what they were aiming at increased. For a time, Stuart Richie was brought in to provide language expertise. However, his thinking was not well suited to what was possible to develop in Assembler in the space and time available, so, according to what Phil told me at the time, none of his design was used – although I suspect that being exposed to his thinking helped Phil crystallize what eventually became Inglish. No matter what the user entered – “take the sharp sword and excitedly hack at the evil troll”, say – he’d convert it to a simple (action, target) pair to hand off to me: “kill troll”, or perhaps, “kill troll with sword”. Compound sentences would become a sequence of actions, so “take the hammer and hit Gandalf with it” would come to me as two actions: “pick up hammer”, followed by a next turn of “hit Gandalf with hammer”.
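
As a rough illustration of this hand-off (a minimal sketch in Python rather than the original Assembler, with entirely hypothetical names), a compound sentence that the parser has already split into clauses is reduced to a sequence of simple commands for the game engine:

SYNONYMS = {"hack": "kill", "attack": "kill", "take": "pick up"}

def to_simple_commands(clauses):
    # Reduce already-parsed clauses to simple (action, target, instrument)
    # triples, discarding adjectives and adverbs along the way.
    commands = []
    for verb, target, instrument in clauses:
        commands.append((SYNONYMS.get(verb, verb), target, instrument))
    return commands

# "take the hammer and hit Gandalf with it" arrives as two clauses:
print(to_simple_commands([("take", "hammer", None),
                          ("hit", "Gandalf", "hammer")]))
# -> [('pick up', 'hammer', None), ('hit', 'Gandalf', 'hammer')]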

I put together the overall design for a game that would remove the non-language-related limitations within a couple of hours on my first day on the job. I knew I wanted to use generalized, abstracted data structures, with general routines that processed those structures and with exits for “special cases”, rather than the usual practice of the time of hard-coding the game-play. My intent was that you could develop a new game by replacing the content of the data structures and the custom routines – a “game engine” concept I did not hear described until decades later. We even talked about developing a “game editor” that would allow gamers to develop their own adventure games by entering items into the data structures via an interface, but I believe it was never developed. I very early on decided that I wanted randomness to be a key feature of the game – recognizing that that meant the game could not always be solved, and accepting that constraint.

I envisaged three data structures to be used to support the game: a location database, a database of objects and a database of “characters”. The location “database” (actually, just a collection of records with a given structure) was pretty straightforward, containing a description of the location and, for each direction, a pointer to the location reached. There could also be an override routine to be called when going in a direction. The override allowed features or game problems to be added to the game map: for example, a door of limited size (so you could not pass through it while carrying too many items) or a trap to be navigated once specific constraints had been met. There’s a location (the Goblin’s Dungeon) that uses this mechanism to create a dynamic map, rather than having fixed connections to other locations: for each direction, an override routine is called that randomly picks a “next location” for the character to arrive in from a given list of possible locations. Another innovation in the location database occurred when Phil added pictures to specific locations, and drew them when the player entered one of those locations. Rather than representing the entire map of Middle Earth in the game (as I might do today), I simplified it into a set of individual locations where noteworthy events occurred in the story, and represented those as a linked set of locations, with the links oriented in the directions as laid out on the map. So, for example, “go North” from one location would immediately take you to the next location North in the game where a significant event occurred. I did not then have a notion of variable travel time based on distance between the two locations.
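
A minimal sketch of such a location record, in Python rather than the original Assembler and with hypothetical names and places, might look like this, including an override routine of the Goblin’s Dungeon kind that picks the next location at random:

import random

def goblins_dungeon_exit(direction):
    # Override routine: whatever direction is chosen, pick the next location at random.
    return random.choice(["dark passage", "goblin hall", "narrow crack"])

locations = {
    "rivendell": {"description": "A pleasant valley.",
                  "exits": {"north": "misty mountains"}},
    "goblins dungeon": {"description": "A dark cell.",
                        "exits": {d: goblins_dungeon_exit
                                  for d in ("north", "south", "east", "west")}},
}

def go(current, direction):
    exit_ = locations[current]["exits"].get(direction)
    if exit_ is None:
        return current                                  # no exit that way
    return exit_(direction) if callable(exit_) else exit_

print(go("goblins dungeon", "north"))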

Similarly, I conceived of an object database with a set of abstract characteristics and possible overrides, rather than hard-coding a list of possible player interactions with specific objects as was done in other games. Each object had characteristics and constraints that allowed me to treat them generically: weight, size, and so on – in effect, a simple (by today’s standards) physics engine. An object could have the capability to act as a container, and a container could be transparent or opaque; a transparent container’s contents could be seen without having to open it first. There were generic routines that could be applied to all objects: for example, any object could be picked up by something bigger and stronger than it, or put into a bigger container (if there was enough room left in it). Some routines could be applied to any object that matched some set of characteristics; an object could also have a list of “special” routines associated with it that overrode the general routines. There was a general “turn on” routine that applied to lamps, for example, that could also be overridden for a magic lamp by a different, more complex “turn on” routine. I went through the book noting where objects were used to further the plot (swords, lamps, and most obviously, the ring), then added those objects to the game, with appropriate generic characteristics and actions (weight, the ability for lamps to be turned on) and special routines as needed (for example, the ring’s ability to make the wearer invisible).
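
The object records and generic routines described above might be sketched as follows (Python, with hypothetical names and values; the original was hand-built Assembler tables):

class GameObject:
    # Generic characteristics plus an optional table of special override routines.
    def __init__(self, name, weight, size, is_container=False,
                 transparent=False, capacity=0, overrides=None):
        self.name, self.weight, self.size = name, weight, size
        self.is_container, self.transparent = is_container, transparent
        self.capacity, self.contents = capacity, []
        self.overrides = overrides or {}

def turn_on(obj):
    # Generic "turn on" routine; a magic lamp could supply its own override.
    special = obj.overrides.get("turn on")
    return special(obj) if special else "The " + obj.name + " is now on."

def put_into(item, container):
    # Generic containment check using size and remaining capacity.
    used = sum(o.size for o in container.contents)
    if container.is_container and item.size <= container.capacity - used:
        container.contents.append(item)
        return True
    return False

lamp = GameObject("lamp", weight=1, size=1)
chest = GameObject("chest", weight=5, size=4, is_container=True, capacity=3)
print(turn_on(lamp), put_into(lamp, chest))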

Each non-player character (NPC) was also an object that began in an “alive” state, but could, due to events in the game, stop being alive – which allowed a player to, for example, use a dead dwarf as a weapon, in the absence of any other weapon. However, the physics engine caused “kill troll with sword” to inflict more damage than “kill troll with (dead) dwarf”.

In addition to regular object characteristics, each NPC had a “character”, stored in the third database. I conceived of an NPC’s character as being a set of actions that the NPC might perform, a sequence in which they generally performed them and a frequency of repetition. The individual actions were simple and were generally the same actions that a player could do (run in a given direction, attack another character, and so on); but again, these routines could be overridden for a specific character. The sequence could be fixed or flexible: an action could branch to a different part of the sequence and continue from there, or even jump to a random location in the sequence. The apparent complexity of the character comes from the length and flexibility of its action sequence; the character “emerges” as a result. For example, Gandalf’s short attention span and kleptomania were represented by a sequence like: “[go] <random direction>. [Pick up] <random object> [Say, “what’s this?”]. [Go] <random direction>. [Put down] <random object>.”
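
A character profile of this kind can be sketched as an action list stepped through one entry per turn (Python, hypothetical names; the random starting point mentioned later in this paper is included here):

import random

GANDALF_PROFILE = [("go", "<random direction>"),
                   ("pick up", "<random object>"),
                   ("say", "What's this?"),
                   ("go", "<random direction>"),
                   ("put down", "<random object>")]

class Character:
    def __init__(self, name, profile):
        self.name, self.profile = name, profile
        self.step = random.randrange(len(profile))   # random starting point in the list

    def take_turn(self):
        # Perform the current action, then advance (and wrap) through the sequence.
        action, target = self.profile[self.step]
        self.step = (self.step + 1) % len(self.profile)
        return self.name + ": " + action + " " + target

gandalf = Character("Gandalf", GANDALF_PROFILE)
for _ in range(3):
    print(gandalf.take_turn())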

The division between inanimate object and NPC was left intentionally a little blurry, giving extra flexibility. For example, the object overrides could also be used to modify character behaviour. I actually coded an override where, if the player typed “turn on the angry dwarf”, he turned into a “randy dwarf” and followed the player around propositioning him.  If he was later turned off, he’d return to being the angry dwarf and start trying to kill any live character. Fred and Phil made me take that routine out.

In order to develop each character, I went through the book and, for each character, tried to identify common sequences of behavior that I could represent through a sequence of actions that would capture the “texture” of that character. Some characters were easy; for a troll, “{If no alive object in current location} [go] <random direction> {else} [kill] <random object with status ‘alive’>” was pretty much the whole list. Others were harder, such as characterizing Thorin; and yes, I did write the now-classic phrase, “Thorin sits down and starts singing about gold.” (I hereby apologize for how frequently he said that; short character-action list, you see.) An action could invoke a general routine which was the same for all NPCs – like, choose a random direction and run, or choose a live object in the location and kill it; or, it could be an action specific only to this NPC, as with Thorin’s persistent singing (as seen in Figure 2). For Gandalf, the generic “pick up” routine was used under the covers, but overridden so that he would utter “what’s this?”.


Figure 2. Gandalf and Thorin exhibit classic behavior. Courtesy Winterdrake.

Sometimes an alternate behaviour list could be chosen based on events, as can be seen in Figure 2. For example, the friendly dwarf would become violent once he’d been attacked (or picked up). For a while, we had terrible trouble with all the NPCs showing up in one location and then killing each other before the player had the chance to work his way through the game, until I got the character profiles better adjusted. One character would attack another, and once a battle was in progress any (otherwise friendly) character entering that location would be attacked and end up joining in. The same mechanism was used to allow the player to request longer-running actions from NPCs, such as asking a character to follow you when you needed them to help solve a puzzle in a (sometimes far) different location from where they were when you found them. In general the NPCs were programmed to interact with “another”, and did not differentiate whether the “other” was the player or not unless there was a game-related reason for doing so. The NPCs exhibited “emergent behaviour”; they just “played” the game themselves according to their character profile, including interacting with each other. In essence, the NPCs would do to each other almost anything that they could do to or with the player.

Phil programmed the interface to accept input from the player, and after each turn he would hand control to the NPC system, which would allow each (remaining) alive character to take a turn, as can be seen in Figures 2 and 3. For the time, this design was revolutionary; the model then was to have a single, non-mobile NPC in a single location, with only a couple of specific actions that were invoked once the player entered that location, and behaving the same way each time you played the game. Even in the arcade games of the time, we were able to identify that each object the player interacted with behaved the same way each time, and they did not interact with each other at all.
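
The turn structure itself might be sketched like this (Python, hypothetical names): the player’s parsed command is applied first, and then every surviving character takes a turn of its own, so the NPCs keep “playing” whether or not the player is anywhere near them:

def play_turn(player_command, npcs, apply_action):
    # Apply the player's action, then give each still-alive NPC its own turn.
    apply_action("player", player_command)
    for npc in npcs:
        if npc["alive"]:
            action = npc["profile"][npc["step"] % len(npc["profile"])]
            npc["step"] += 1
            apply_action(npc["name"], action)

npcs = [{"name": "Thorin", "alive": True, "step": 0,
         "profile": ["sits down and starts singing about gold"]}]
play_turn("wait", npcs, lambda who, what: print(who, "-", what))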


Figure 3. The player modifies Thorin’s default behavior – to the player’s cost.

At the beginning of the game, we would generate, for each NPC, a random starting point in that NPC’s action list, giving the game much of its random nature. This combination of factors led to the “emergent characters”; or, seen another way, “a bunch of other characters just smart enough to be profoundly, infuriatingly stupid” (Maher).

I quickly transitioned to the concept of the player merely being another character, with a self-generated action list. At some point I experienced the emergent nature of the characters while trying to debug and was joking about the fact that the characters could play the game without the player being there; that discussion led naturally to the famous “time passes” function, where, if the player took too long in taking his next action (or chose to “wait”, as in Figure 1), the characters would each take another turn. This feature, which Melbourne House trademarked as “Animaction” (Addison-Wesley Publishing Company, Inc.), was another innovation not seen in prior text adventures, where game-play depended wholly on the player’s actions. (It is also noteworthy how many of the game’s innovations began as jokes. I now believe this to be true of much innovation; certainly it has been, for the innovations I’ve been involved in.)

The next, seemingly obvious step to me was to allow – or even require – the player to ask the NPCs to perform certain tasks for him (as seen in Figure 4), and to set up puzzles that required this kind of interaction in order to solve them. This added another layer of complexity to the game. As commented by one fan, “As most veteran Hobbit players know, a good way to avoid starvation in the game is to issue the command “CARRY ELROND” whilst in Rivendell. In the game Elrond is a caterer whose primary function is to give you lunch and if you carry him then he will continue to supply you with food throughout the game.”[2] Another had a less tolerant view: “Sometimes they do what you ask, but sometimes they’re feeling petulant. Perhaps the seminal Hobbit moment comes when you scream at Brand to kill the dragon that’s about to engulf you both in flames, and he answers, “No.” After spending some time with this collection of half-wits, even the most patient player is guaranteed to start poking at them with her sword at some point.”[3]


Figure 4. The Hobbit starting location, and a player action that I never thought of.

The non-determinism of the overall game meant that it was not, in general, possible to write down a solution to the game. There were specific puzzles in the game, however, and solutions to these puzzles could be written down and shared. However, people also found other ways to solve them than I’d anticipated. For example: “A friend of mine has discovered that you can get and carry both Elrond and Bard. Carrying Elrond with you can be quite useful as he continuously distributes free lunches. And, to be honest, carrying Bard is the only way I’ve found of getting him to the Lonely Mountain. There must be a better way.” (“Letters: Gollum’s Riddle”) As commented by a retrospective, “And actually, therein sort of lies the secret to enjoying the game, and the root of its appeal in its time. It can be kind of fascinating to run around these stage sets with all of these other crazy characters just to see what can happen — and what you can make happen.” (Maher)

Inglish

While I worked on the game, Phil designed, developed and wrote the language interpreter, later dubbed Inglish. I had little interest in linguistics, so I generally tuned out the long discussions that Fred and Phil had about it – and was supported in doing so by the encapsulation and simple interface between the two “halves” of the game, which meant I did not need to know any more.


Figure 5. Opening scene from one of many foreign language versions.

Every word was stored in the dictionary, and since the 26 letters of the English alphabet can be represented in only 5 bits, the other 3 bits of each byte were used by Phil to encode other information: parts of speech (verb, adjective, adverb, noun), valid word usages, what pattern to use when pluralizing, and so on. I’ve seen screen images from versions of the game in other languages (e.g., Figure 5), but I do not know how the translations were done or how the design worked with these other languages.
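
As an illustration of this packing (a sketch only; the flag values below are hypothetical, not the actual bit assignments in Phil’s dictionary), five bits are enough to index 26 letters, leaving three bits per byte free for flags:

# Hypothetical flag bits occupying the top three bits of each byte.
VERB, NOUN, ADJECTIVE = 0x20, 0x40, 0x80

def encode(letter, flags=0):
    return ((ord(letter) - ord('a') + 1) & 0x1F) | flags    # 'a'..'z' -> 1..26

def decode(byte):
    return chr((byte & 0x1F) - 1 + ord('a')), byte & 0xE0   # (letter, flag bits)

packed = encode('k', VERB)
print(hex(packed), decode(packed))   # 0x2b ('k', 32)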

 

Phil translated player commands into simple “verb object” commands to hand to me, with some permitted variations to allow for different action results. For example, I seem to remember that “viciously kill” would launch a more fierce attack, and use up more strength as a result, than just “kill”. Rather than a set of hard-coded messages (as was the norm), we generated the messages “on the fly” from the dictionary and a set of sentence templates. At the end of some action routine, I would have a pointer to a message template for that action. The template would contain indicators for where the variable parts of the message should be placed. I would then pass the message template, the subject and the object to the language engine. The engine would then generate the message, using, once again, spare bits for further customization. To take a simple example, “Gandalf gives the curious map to you” used the same template as, say, “Thorin gives the axe to the angry dwarf”.
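
A minimal sketch of this template-based message generation (Python, with hypothetical template text) might look like this:

# One shared template with slots for the variable parts, rather than a separate
# hard-coded string for every possible event in the game.
GIVE_TEMPLATE = "{subject} gives the {object} to {recipient}"

def render(template, **parts):
    return template.format(**parts)

print(render(GIVE_TEMPLATE, subject="Gandalf", object="curious map", recipient="you"))
print(render(GIVE_TEMPLATE, subject="Thorin", object="axe", recipient="the angry dwarf"))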

We were so limited by memory that we would adjust the size of the dictionary to fit the game into the desired memory size; so the number of synonyms available would sometimes decrease if a bug fix required more lines of code. It was a constant trade-off between game functionality and language richness. As a result of all the encoding, dumping memory – a common method of solving puzzles in other text adventures – provided no information for The Hobbit.

Software Development, Cro-Magnon-Style

Our initial development environment was a Dick Smith TRS80 look-alike, with 5-inch floppy drives. Initially I believe we used a 16k machine, then a 32k, and towards the end a 48k or perhaps 64k machine. Our target machine for the game was initially a 32k TRS80. During development, the Spectrum 64 was announced, and that became our new target. Game storage was on a cassette tape, played on a regular radio-cassette player. As the other systems became available we continued using the TRS80 platform as the development environment, and Phil took on the question of how to port the game to other platforms.

We had a choice of two languages to use for development: Basic, or Assembler. We chose Assembler as we felt the added power offset the added difficulty in using the language.

During initial development, the only development tool available was a simple Notepad-like text editor, and the majority of code was written that way. Later I believe a Vi-like editor became available; even later, I have faint memories of a very early IDE that allowed us to edit, assemble the code and step through it (but that also inserted its own bugs from time to time).

We initially worked with the system’s existing random number generator, but realized that, because it started from the same seed every time, it made the game play the same way each time – against what I hoped to achieve. Phil then spent some time writing a “true” random number generator, experimenting with many sources of seed values before he was successful. He tried using the contents of various registers, but discovered that these were often the same values each time. He tried using the time, but the TRS80 did not have a built-in battery or clock, and most people did not set the time each time they started the system – so again, if someone turned the machine on and loaded the game, we would get the same results each time. After some experimentation he finally succeeded, and the game – for better or worse, and sometimes for both – became truly random.

Debugging was a nightmare. Firstly, we were debugging machine code, initially without the advantage of an IDE; we ran the program, and when it crashed we tried to read the memory dumps. In Assembler, especially when pushing the memory limit of the system, the Basic programmer’s technique of inserting “print” statements to find out what is happening is not available. We had characters interacting with each other in distant parts of the game, and only actions in the current location were printed on the game player’s console. In one of several cases where a game feature was originally developed for other reasons, we initially wrote the “save” mechanism to help us debug parts of the game without having to start from the beginning each time. It then became part of the delivered version, allowing players to take advantage of the same function.

At some point, the idea of adding graphics came up, I think from Phil. Fred commissioned Kent Rees to draw the pictures, and Phil figured out how to draw them on the various systems; I do know that he adapted the pictures from the originals Kent provided in order to make them easier to draw. The first version of his code always drew the picture when you entered a location that had one; however, it was so slow and annoyed us (me) so much that Phil quickly added a switch to turn them off.

Sidelines

In between coding The Hobbit, we occasionally took time to work on other games. Fred would give us $20 to go and play arcade games, sometimes as often as each week, to see what other folk were doing and what the state of the art was in that industry. Someone in our group of four wrote a version of Pac-Man. We spent hours with one person playing Pac-Man, trying to get up to higher levels in the game, while the others leant over the arcade machine trying to figure out the algorithms that caused prizes to appear and how the behaviour changed across the game levels. We didn’t see it as piracy, as arcade games and home computers were at that time seen as being completely unrelated industries – it was more in the spirit of gaining ideas from another industry for application into ours.

Another game that we wrote was Penetrator (Melbourne House, 1981). Phil was the clear lead on that game while I worked on some pieces of it, and I think Kerryn may have worked on it a bit too.  It was a copy of the arcade game Scramble (Konami, 1981). Because of the speed (or lack thereof) of the processors at the time, we had to ensure that each separate path through the game took the same amount of time; even a difference of one “tstate” (processor state) between one path of an “if-then-else” to another would interfere with smooth motion, so we spent significant time calculating (by hand) the time taken by each path and choosing different Assembler instructions that would compensate for the differences (and given that “NO-op” took 2 tstates, it was not always easy). Another difficulty was getting the radars to turn smoothly, while handling the variable number of other activities taking place in the game. It took forever to get it “right”.


Figure 6. Screen shot from the game Penetrator

At the beginning we drew the screen bitmaps for all the landscapes on graph paper and then hand-calculated the hexadecimal representations of each byte for the screen buffer, but that became so tedious so quickly that Phil wrote an editor that we could use to create the landscapes. In the end the landscape editor was packaged with the game, as a feature.

Another “pressing” issue for shooter games of the time was that of keyboard debounce. At the time a computer keyboard consisted of an electrical grid, and when a key was pressed the corresponding horizontal and vertical lines would register a “high”. You checked the grid at regular intervals, and if any lines were registering high you used a map of the keyboard layout to identify the key that had been pressed. However, you had to stall for just the right amount of time before re-reading the keyboard; if you waited too long, the game seemed unresponsive, but if you read too quickly, you would read several key presses for each key press that the player intended. While it was possible to use the drivers that came with the keyboard, they did not respond quickly enough to use for interactive games. “Getting it right” was a tedious matter of spending hours fiddling with timings and testing.
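
A sketch of the debounce logic described above (Python, with a hypothetical read_grid routine and delay value) might look like this:

import time

DEBOUNCE_SECONDS = 0.05     # hypothetical value; the real delay was tuned by hand

def poll_keys(read_grid):
    # Scan the key grid repeatedly, but ignore further reads for a short interval
    # after each accepted key, so one physical press is not reported many times.
    last_accepted = 0.0
    while True:
        key = read_grid()                       # returns a key code, or None
        now = time.monotonic()
        if key is not None and now - last_accepted >= DEBOUNCE_SECONDS:
            last_accepted = now
            yield key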

Perhaps A Little Too Random

In addition to all the other randomness it exhibited, The Hobbit was also known to crash seemingly randomly. There were a number of reasons for this. Firstly, The Hobbit was a tough game to test. It was a much bigger game than others of the time. Unlike the other games, it was approximately 40k of hand-coded Assembler[4], as opposed to the commonly used interpreted Basic (a few more advanced games were shipped in compiled Basic). It was written without the benefit of formalized testing practices or automated test suites. The assembly and linking programs we used were also relatively new, and during development, we would find bugs in them. I remember spending hours debugging one time only to discover that the assembler had optimized away a necessary register increment, causing an infinite loop; I had a lot of trouble trying to invent a different coding sequence that prevented the assembler from removing the required increment. Altogether, I took away lessons about not letting your application get too far ahead of the ability of your infrastructure to support it.

Secondly, the game was non-deterministic; it was different every time it was played. It exhibited its own manifestation of chaos theory: small changes in starting conditions (initial game settings, all generated by the random number generator) would lead to large differences in how the game proceeded. Due to the “emergent characters”, we constantly had NPCs interacting in ways that had never been explicitly programmed and tested, or even envisioned. The game could crash because of something that happened in another location that was not visible to the player or to the person testing the game, and we might never be able to identify or recreate the sequence of actions that led to it.

It was possible to have an instance of the game that was insoluble, if a key character required to solve a specific puzzle did not survive until needed (often due to having run into a dwarf on the rampage); this was a constraint I was happy to accept, though it frustrated some players. The ability to tell the NPCs what to do also meant that people told them things to do that we hadn’t accounted for. The very generality of the game engine – the physics, the language engine, and the ability for the player to tell characters what to do – led players to interact with the game in ways I’d never thought of, and that were certainly never tested. In some cases, they were things I didn’t realize the game was capable of.

Epilogue

The Hobbit was released in 1982 in Australia and the U.K. Figure 7 shows typical packaging. It was an instant hit; amongst other awards, it won the Golden Joystick Award for Strategy Game of the Year in 1983, and came second for Best Game of the Year, after Jet-Pac. Penetrator came second in the Golden Joystick Best Arcade Game category, and Melbourne House came second for Best Software House of the Year, after Jet-Pac’s publishers (“Golden Joystick Awards”). A couple of revisions were published with some improvements, including better graphics. Due to licensing issues it was some time before a U.S. release followed. The book was still covered by copyright and so the right to release had to be negotiated with the copyright holders, which were different in each country. The U.S. copyright holder had other plans for a future game. As a result, future book-based game ideas specifically chose books (such as the Sherlock Holmes stories) that were no longer covered by copyright.


Figure 7. The Hobbit. Game release package.

At the end of 1981, I finished my Bachelor’s degree. We were beginning to discuss using the Sherlock Holmes mysteries as our next game project; I was not sure that the adventure game engine I’d developed was a good fit for the Sherlock style of puzzle solving, although there were definitely aspects that would translate across. However, I was also ready to start something new after a year of coding and debugging in Assembler. I’d proved that my ideas could work, and believed that the result Phil and I had produced was the desired one – an adventure game that solved all my frustrations with Classic Adventure, and in my mind (if not yet in other people’s) met Fred’s target of “the best adventure game ever”.

I interviewed with several major IT vendors, and took a job at IBM, as did Ray. Kerryn took a job in a mining company in Western Australia. Phil stayed on at Melbourne House (later Beam Software), the only member of our university programming team to continue on in the games industry. We eventually all lost touch.

During this time, I was unaware that the game had become a worldwide hit. Immersed in my new career, I lost touch with the nascent games industry. At IBM, I started at the same level as other graduates who had no experience with computers or programming; developing a game in Assembler was not considered professional or relevant experience. Initially I became an expert in the VM operating system (the inspiration and progenitor for VMware, I’ve heard), which I still admire for the vision, simplicity and coherence of its design, before moving into other technical and consulting positions. In late 1991 I left Australia to travel the world. I eventually stopped in Portland, Oregon, with a plan to return to Australia after 2 years – a plan that has been much delayed.

A 3-year stint in a global Digital Media business growth role for IBM U.S. in the early 2000s brought me back in contact with games developers just as the movie and games industries were moving from proprietary to open-standards based hardware and infrastructure. The differences in development environments, with large teams and sophisticated supporting graphics and physics packages, brought home to me how far the games industry had come. But while I appreciate the physics engines and the quality of graphics that today can fool the eye into believing they are real, the basis of a good game has not changed: simple, compelling ideas still captivate and enchant people, as can be seen in the success of, for example, Angry Birds. I also believe that the constraints of limited resources – such as small memories and slow processors – can lead to a level of innovation that more generous resources do not.

And Back Again

As the Internet era developed, I started receiving letters from fans of The Hobbit. The first person I recall tracking me down emailed me with an interview request for his Italian adventure fan-site in 2001, after what he said was a long, long search. The subsequent years made it easier to locate people on the Internet, and the emails became more frequent. At times I get an email a week from people telling me the impact the game had on the course of their lives.

In 2006, the Australian Centre for the Moving Image (ACMI) held an exhibition entitled “Hits of the 80s: Aussie games that rocked the world” (Australian Centre for the Moving Image), featuring The Hobbit. It felt a little like having a museum retrospective while still alive: a reminder of how much things have changed, and at the same time how little. The games lab curator, Helen Stuckey, has since written a research paper about the challenge of collecting and exhibiting videogames for a museum audience, using The Hobbit as an example (Stuckey).

In late 2009 I took an education leave of absence from IBM US to study for a Masters/PhD in Computer Science at Portland State University. (IBM and I have since parted company.) When I arrived one of the PhD students, who had played The Hobbit in Mexico as a boy, recognized my name and asked me to present on it. While searching the Internet for graphics for the presentation, I discovered screen shots in many different languages and only then began to realize the worldwide distribution and impact the game had had. Being in a degree program while describing work I’d done during my previous university degree decades before caused many conflicting emotions. I was also amazed at the attendance and interest from the faculty and other students.

In 2012, the 30-year anniversary of the release, several Internet sites and magazines published retrospectives; a couple contacted me for interviews, while others worked solely from published sources. The same year I was contacted by a fan who had been inspired by a bug (“this room is too full for you to enter”) to spend time over the intervening decades in reverse-engineering the machine code into a “game debugger” of the kind I wish we’d had when we originally developed it: Wilderland (“Wilderland: A Hobbit Environment”). It runs the original game code in a Spectrum emulator, while displaying the position and state of objects and NPCs throughout the game. His eventual conclusion was that the location is left over from testing (and I even have a very vague memory of that testing). That a game I spent a year writing part-time could cause such extended devotion is humbling.

In retrospect, I think we came far closer to Fred’s goal of “the best adventure game ever” than we ever imagined we would. The game sold in many countries over many years, and by the late 1980s had sold over a million copies (DeMaria) – vastly outselling most other games of the time. During one interview, the interviewer told me that in his opinion, The Hobbit transformed the genre of text adventure games, and that it was the last major development of the genre: later games merely refined the advances made. Certainly Beam Software’s games after The Hobbit did not repeat its success.

While many of the publications, particularly at the time of release, focused on the Inglish parser, it is the characters and the richness of the gameplay that most people who contact me focus on. I believe that just as the game would have been less rich without Inglish, the Inglish parser grafted onto any other adventure game of the time would in no way have reproduced the experience of playing The Hobbit, nor would it have had the same impact on the industry or on individuals.

In 2013, the Internet Archive added The Hobbit to its Historical Software Collection[5] – which, in keeping with many other Hobbit-related events, I discovered via a colleague’s email. Late that year, ACMI contacted me to invite me to join the upcoming Play It Again project[6], a game history and preservation project focused on ANZ-written digital games in the 1980s. That contact led to this paper.

As I completed this retrospective – and my PhD – I was struck again by the power a few simple ideas can have, especially when combined with each other. It’s my favorite form of innovation. In the words of one fan, written 30 years after the game’s release, “I can see what Megler was striving toward: a truly living, dynamic story where anything can happen and where you have to deal with circumstances as they come, on the fly. It’s a staggeringly ambitious, visionary thing to be attempting.” (Maher) A game that’s a fitting metaphor for life.

Disclaimer

This paper is written about events that took place 35 years ago, as accurately as I can remember them. With that gap in time, some errors will necessarily have crept in; I take full responsibility for them.

 

 

References

Addison-Wesley Publishing Company, Inc. The Hobbit: Guide to Middle-Earth. 1985.

Australian Centre for the Moving Image. “Hits of the 80s: Aussie Games That Rocked the World.” N.p., May 2007. Web. 24 Feb. 2014.

Crowther, Will. Colossal Cave. CRL, 1976. Print.

DeMaria, Rusel, and Johnny L. Wilson. High Score!: The Illustrated History of Electronic Games. Berkeley, Cal.: McGraw-Hill/Osborne, 2002. Print.

“Golden Joystick Awards.” Computer and Video Games Mar. 1984: 15. Print.

“Letters: Gollum’s Riddle.” Micro Adventurer Mar. 1984: 5. Print.

Maher, Jimmy. “The Hobbit.” The Digital Antiquarian. N.p., Nov. 2012. Web. 24 Feb. 2014.

Mitchell, Phil, and Veronika Megler. Penetrator. Melbourne, Australia: Beam Software / Melbourne House, 1981. Web. <Described in: http://www.worldofspectrum.org/infoseekid.cgi?id=0003649>.

—. The Hobbit. Melbourne, Australia: Beam Software / Melbourne House, 1981. Web. <Described in: http://en.wikipedia.org/wiki/The_Hobbit_%28video_game%29>.

Stuckey, Helen. “Exhibiting The Hobbit: A Tale of Memories and Microcomputers.” History of Games International Conference Proceedings. Ed. Carl Therrien, Henry Lowood, and Martin Picard. Montreal: Kinephanos, 2014. Print.

Tolkien, J. R. R. The Hobbit, Or, There and Back Again. Boston: Houghton Mifflin, 1966. Print.

“Wilderland: A Hobbit Environment.” N.p., 2012. Web. 24 Feb. 2014.

 

 

Notes:

[1] https://en.wikipedia.org/wiki/HIPO

[2] http://solearther.tumblr.com/post/38456362341/thorin-sits-down-and-starts-singing-about-gold

[3] http://www.filfre.net/2012/11/the-hobbit/

[4] An analysis by the Wilderland project (“Wilderland: A Hobbit Environment”) shows the following code breakdown: game engine and game, 36%; text-engine for input and output, the dictionary, the graphics-engine, and the parser, 22%; graphics data, 25%; character set, 3%; buffers, 8%; and as yet unidentified, 6%.

[5] https://archive.org/details/The_Hobbit_v1.0_1982_Melbourne_House

[6] https://www.acmi.net.au/collections-research/research-projects/play-it-again/

 

Bio

Veronika M. Megler now works for Amazon Web Services in the U.S. as a Senior Consultant in Big Data and Analytics. She recently completed her PhD in Computer Science at Portland State University, working with Dr. David Maier in the emerging field of “Smarter Planet” and big data. Her dissertation research enables Information-Retrieval-style search over scientific data archives. Prior to her PhD, she helped clients of IBM U.S. and Australia adopt a wide variety of emerging technologies. She has published more than 20 industry technical papers and 10 research papers on applications of emerging technologies to industry problems, and holds two patents, including one on her dissertation research. Her interests include applications of emerging technologies, big data and analytics, scientific information management and spatio-temporal data. Ms. Megler was in the last year of her B.Sc. studies at Melbourne University when she co-wrote The Hobbit. She currently lives in Portland, Oregon, and can be reached at vmegler@gmail.com.

It Is What It Is, Not What It Was – Henry Lowood

Abstract: The preservation of digital media in the context of heritage work is both seductive and daunting. The potential replication of human experiences afforded by computation and realised in virtual environments is the seductive part. The work involved in realising this potential is the daunting side of digital collection, curation, and preservation. In this lecture, I will consider two questions. First, is the lure of perfect capture of data or the reconstruction of “authentic” experiences of historical software an attainable goal? And if not, how might reconsidering the project as moments of enacting rather than re-enacting provide a different impetus for making born digital heritage?

Keynote address originally delivered at the Born Digital and Cultural Heritage Conference, Melbourne, 19 June 2014

Let’s begin with a question. When did libraries, archives, and museums begin to think about software history collections? The answer: In the late 1970s. The Charles Babbage Institute (CBI) and the History of Computing Committee of the American Federation of Information Processing Societies (AFIPS), soon to be a sponsor of CBI, were both founded in 1978. The AFIPS committee produced a brochure called “Preserving Computer-Related Source Materials.” Distributed at the National Computer Conference in 1979, it is the earliest statement I have found about preserving software history. It says,

If we are to fully understand the process of computer and computing developments as well as the end results, it is imperative that the following material be preserved: correspondence; working papers; unpublished reports; obsolete manuals; key program listings used to debug and improve important software; hardware and componentry engineering drawings; financial records; and associated documents and artifacts. (“Preserving …” 4)

Mostly paper records. The recommendations say nothing about data files or executable software, only nodding to the museum value of hardware artefacts for “esthetic and sentimental value.” The brochure says that artefacts provide “a true picture of the mind of the past, in the same way as the furnishings of a preserved or restored house provides a picture of past society.” One year later, CBI received its first significant donation of books and archival documents from George Glaser, a former president of AFIPS. Into the 1980s, history of computing collections meant documentation: archival records, publications, ephemera and oral histories.

Software preservation trailed documentation and historical projects by a good two decades. The exception was David Bearman, who left the Smithsonian in 1986 to create a company called Archives & Museum Informatics (AHI). He began publishing the Archival Informatics Newsletter in 1987 (later called Archives & Museum Informatics). As one of its earliest projects, AHI drafted policies and procedures for a “Software Archives” at the Computer History Museum (CHM) then located in Boston. By the end of 1987, Bearman published the first important study of software archives under the title Collecting Software: A New Challenge for Archives & Museums. (Bearman, Collecting Software; see also Bearman, “What Are/Is Informatics?”)

In his report, Bearman alternated between frustration and inspiration. Based on a telephone survey of companies and institutions, he wrote that “the concept of collecting software for historical research purposes had not occurred to the archivists surveyed; perhaps, in part, because no one ever asks for such documentation!” (Bearman, Collecting Software 25-26.) He learned that nobody he surveyed was planning software archives. Undaunted, he produced a report that carefully considered software collecting as a multi-institutional endeavor, drafting collection policies and selection criteria, use cases, a rough “software thesaurus” to provide terms for organizing a software collection, and a variety of practices and staffing models. Should some institution accept the challenge, here were tools for the job.

Well, here we are, nearly thirty years later. We can say that software archives and digital repositories finally exist. We have made great progress in the last decade with respect to repository technology and collection development. Looking back to the efforts of the 1980s, one persistent issue raised as early as the AFIPS brochure in 1978 is the relationship between collections of historical software and archival documentation about that software. This is an important issue. Indeed, it is today, nearly forty years later, still one of the key decision points for any effort to build research collections aiming to preserve digital heritage or serve historians of software. Another topic that goes back to Bearman’s report is a statement of use cases for software history. Who is interested in historical software and what will they do with it? Answers to this fundamental question must continue to drive projects in digital preservation and software history.

As we consider the potential roles to be played by software collections in libraries and museums, we immediately encounter vexing questions about how researchers of the future will use ancient software. Consider that using historical software now in order to experience it in 2014 and running that software in 2014 to learn what it was like when people operated it thirty years ago are two completely different use cases. This will still be true in 2050. This may seem like an obvious point, but it is important to understand its implications. An analogy might help. I am not just talking about the difference between watching “Gone with the Wind” at home on DVD versus watching it in a vintage movie house in a 35mm print – with or without a live orchestra. Rather, I mean the difference between my experience in a vintage movie house today – when I can find one – and the historical experience of, say, my grandfather during the 1930s. My experience is what it is, not what his was. Much of this essay will deal with the complicated problem of enacting a contemporary experience to re-enact a historical experience and what this has to do with software preservation. I will consider three takes on this problem: the historian’s, the media archaeologist’s, and the re-enactor’s.

Take 1. The Historian

Take one. The historian. Historians enact the past by writing about it. In other words, historians tell stories. This is hardly a revelation. Without meaning to trivialize the point, I cannot resist pointing out that “story” is right there in “hi-story” or that the words for story and history are identical in several languages, including French and German. The connections between story-telling and historical narrative have long been a major theme in writing about the methods of history, that is, historiography. In recent decades, this topic has been mightily influenced by the work of Hayden White, author of the much-discussed Metahistory: The Historical Imagination in Nineteenth-Century Europe, published in 1973.

White’s main point about historians is that History is less about subject matter and source material and more about how historians write.

He tells us that historians do not simply arrange events culled from sources in correct chronological order. Such arrangements White calls Annals or Chronicles. The authors of these texts merely compile lists of events. The work of the historian begins with the ordering of these events in a different way. White writes in The Content of the Form that in historical writing, “the events must be not only registered within the chronological framework of their original occurrence but narrated as well, that is to say, revealed as possessing a structure, an order of meaning, that they do not possess as mere sequence.” (White, Content of the Form 5) How do historians do this? They create narrative discourses out of sequential chronicles by making choices. These choices involve the form, effect and message of their stories. White puts choices about form, for example, into categories such as argument, ideology and emplotment. There is no need in this essay to review all of the details of every such choice. The important takeaway is that the result of these choices by historians is sense-making through the structure of story elements, use of literary tropes and emphasis placed on particular ideas. In a word, plots. White thus gives us the enactment of history as a form of narrative or emplotment that applies established literary forms such as comedy, satire, and epic.

In his book Figural Realism: Studies in the Mimesis Effect, White writes about the “events, persons, structures and processes of the past” that “it is not their pastness that makes them historical. They become historical only in the extent to which they are represented as subjects of a specifically historical kind of writing.” (White, Figural Realism 2.) It is easy to take away from these ideas that history is a kind of literature. Indeed, this is the most controversial interpretation of White’s historiography.

My purpose in bringing Hayden White to your attention is to insist that there is a place in game and software studies for this “historical kind of writing.” I mean writing that offers a narrative interpretation of something that happened in the past. Game history and software history need more historical writing that has a point beyond adding events to the chronicles of game development or putting down milestones of the history of the game industry. We are only just beginning to see good work that pushes game history forward into historical writing and produces ideas about how these historical narratives will contribute to allied works in fields such as the history of computing or the history of technology more generally.

Allow me one last point about Hayden White as a take on enactment. Clearly, history produces narratives that are human-made and human-readable. They involve assembling story elements and choosing forms. How then do such stories relate to actual historical events, people, and artifacts? Despite White’s fondness for literary tropes and plots, he insists that historical narrative is not about imaginary events. If historical methods are applied properly, the resulting narrative according to White is a “simulacrum.” He writes in his essay on “The Question of Narrative in Contemporary Historical Theory,” that history is a “mimesis of the story lived in some region of historical reality, and insofar as it is an accurate imitation, it is to be considered a truthful account thereof.” (White, “The Question of Narrative …” 3.) Let’s keep this idea of historical mimesis in mind as we move on to takes two and three.

Take 2. The Media Archaeologist

My second take is inspired by the German media archaeologist Wolfgang Ernst. As with Hayden White, my remarks will fall far short of a critical perspective on Ernst’s work. I am looking for what he says to me about historical software collections and the enactment of media history.

Hayden White focused our attention on narrative; enacting the past is storytelling. Ernst explicitly opposes Media Archaeology to historical narrative. He agrees in Digital Memory and the Archive that “Narrative is the medium of history.” By contrast, “the technological reproduction of the past … works without any human presence because evidence and authenticity are suddenly provided by the technological apparatus, no longer requiring a human witness and thus eliminating the irony (the insight into the relativity) of the subjective perspective.” (Ernst, Loc. 1053-1055.) Irony, it should be noted, is one of White’s favourite tropes for historical narrative.

White tells us that historical enactment is given to us as narrative mimesis, with its success given as the correspondence of history to some lived reality. Ernst counters by giving us enactment in the form of playback.

In an essay called “Telling versus Counting: A Media-Archaeological Point of View,” Ernst plays with the notion that, “To tell as a transitive verb means ‘to count things’.” The contrast with White here relates to the difference in the German words erzählen (narrate) and zählen (count), but you also find it in English: recount and count. Ernst describes historians as recounters: “Modern historians … are obliged not just to order data as in antiquaries but also to propose models of relations between them, to interpret plausible connections between events.” (Ernst, Loc. 2652-2653) In another essay, aptly subtitled “Method and Machine versus the History and Narrative of Media,” Ernst adds that mainstream histories of technology and mass media as well as their counter-histories are textual performances that follow “a chronological and narrative ordering of events.” He observes succinctly that, “It takes machines to temporarily liberate us from such limitations.” (Ernst, Loc. 1080-1084)

Where do we go with Ernst’s declaration in “Telling versus Counting” that “There can be order without stories”? We go, of course, directly to the machines. For Ernst, media machines are transparent in their operation, an advantage denied to historians. We play back historical media on historical machines, and “all of a sudden, the historian’s desire to preserve the original sources of the past comes true at the sacrifice of the discursive.” We are in that moment directly in contact with the past.

In “Method and Machine”, Ernst offers the concept of “media irony” as a response to White’s trope of historical irony. He says,

Media irony (the awareness of the media as coproducers of cultural content, with the medium evidently part of the message) is a technological modification of Hayden White’s notion that “every discourse is always as much about discourse itself as it is about the objects that make up its subject matter.” (Ernst, Loc. 1029-1032)

As opposed to recounting, counting in Ernst’s view has to do with the encoding and decoding of signals by media machines. Naturally, humans created these machines. This might be considered another irony, because humans have thereby “created a discontinuity with their own cultural regime.” We are in a realm that replaces narrative with playback as a form of direct access to a past defined by machine sequences rather than historical time. (Ernst, Loc. 1342-1343)

Ernst draws implications from media archaeology for his closely connected notion of the multimedia archive. In “Method and Machine,” he says, “With digital archives, there is, in principle, no more delay between memory and the present but rather the technical option of immediate feedback, turning all present data into archival entries and vice versa.” In “Telling versus Counting,” he portrays “a truly multimedia archive that stores images using an image-based method and sound in its own medium … And finally, for the first time in media history, one can archive a technological dispositive in its own medium.” (Ernst, Loc. 1745-1746; 2527-2529.) Not only is the enactment of history based on playback inherently non-discursive, but the very structure of historical knowledge is written by machines.

With this as background, we can turn to the concrete manifestation of Ernst’s ideas about the Multimedia Archive. This is the lab he has created in Berlin. The website for Ernst’s lab describes The Media Archaeological Fundus (MAF) as “a collection of various electromechanical and mechanical artefacts as they developed throughout time. Its aim is to provide a perspective that may inspire modern thinking about technology and media within its epistemological implications beyond bare historiography.” (Media Archaeological Fundus) Ernst explained the intention behind the MAF in an interview with Lori Emerson as deriving from the need to experience media “in performative ways.” So he created an assemblage of media and media technologies that could be operated, touched, manipulated and studied directly. He said in this interview, “such items need to be displayed in action to reveal their media essentiality (otherwise a medium like a TV set is nothing but a piece of furniture).” (Owens) Here is media archaeology’s indirect response to the 1979 AFIPS brochure’s suggestion that historical artifacts serve a purpose similar to furnishings in a preserved house.

The media-archaeological take on enacting history depends on access to artifacts and, in its strongest form, on their operation. Even when its engagement with media history is reduced to texts, these must be “tested against the material evidence.” This is the use case for Playback as an enactment of software history.

Take 3. The Re-enactor

Take three. The Re-enactor. Authenticity is an important concept for digital preservation. As in any archive, a key feature of a digital repository over the preservation life-cycle of its documents and software objects is the auditing and verification of authenticity. Access also involves authenticity, as any discussion of emulation or virtualization will bring up the question of fidelity to an historical experience of using software.

John Walker (of Autodesk and Virtual Reality fame) created a workshop called Fourmilab to work on personal projects such as an on-line museum “celebrating” Charles Babbage’s Analytical Engine. This computer programming heritage work includes historical documents and a Java-based emulator of the Engine. Walker says, “Since we’re fortunate enough to live in a world where Babbage’s dream has been belatedly realised, albeit in silicon rather than brass, we can not only read about The Analytical Engine but experience it for ourselves.” The authenticity of this experience – whatever that means for a machine that never existed – is important to Walker. In a 4500-word essay titled “Is the Emulator Authentic?” he tells us that, “In order to be useful, an emulator program must be authentic—it must faithfully replicate the behaviour of the machine it is emulating.” By extension, the authenticity of a preserved version of the computer game DOOM in a digital repository could be audited by verifying that it can properly run a DOOM demo file. The same is true for Microsoft Word and a historical document in the Word format. This is a machine-centered notion of authenticity; we used it in the second Preserving Virtual Worlds project as a solution to the significant properties problem for software. (Walker, “Introduction;” Walker, “Analytical Engine.”)
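
This machine-centered audit can be made concrete. The following is a minimal sketch, in Python, of the kind of check described above: launch a preserved engine against a documented demo file and treat completed playback as evidence of behavioural fidelity. The engine path, playback switch and completion marker are placeholders assumed for illustration, not details recorded in any particular repository.

    import subprocess

    # All of these values are hypothetical placeholders; a repository would
    # substitute its own validated engine, demo file, command-line options
    # and documented completion marker.
    ENGINE = "./preserved/doom"
    DEMO = "demo1"
    PLAYBACK_ARGS = ["-timedemo", DEMO]   # assumption: the engine's demo-playback switch
    COMPLETION_MARKER = "gametics"        # assumption: text the engine prints on completion

    def audit_playback() -> bool:
        """Run the preserved engine against a documented demo file and report
        whether playback ran to completion (a machine-centered audit)."""
        try:
            result = subprocess.run(
                [ENGINE, *PLAYBACK_ARGS],
                capture_output=True, text=True, timeout=600,
            )
        except (OSError, subprocess.TimeoutExpired):
            return False
        return COMPLETION_MARKER in (result.stdout + result.stderr)

    if __name__ == "__main__":
        print("playback audit passed" if audit_playback() else "playback audit failed")

A repository could re-run such a check whenever the software or its execution environment is migrated.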

All well and good. However, I want to address a different authenticity. Rather than judging authenticity in terms of playback, I would like to ask what authenticity means for the experience of using software. Another way of putting this question is to ask what we are looking for in the re-enactment of historical software use. So we need to think about historical re-enactment.

I am not a historical re-enactor, at least not the kind you are thinking of. I have never participated in the live recreation or performance of a historical event. Since I have been playing historical simulations – a category of boardgames – for most of my life, perhaps you could say that I re-enact being a historical military officer by staring at maps and moving units around on them. It’s not the same thing as wearing period uniforms and living the life, however.

Anyway, I need a re-enactor. In his 1998 book Confederates in the Attic, Tony Horwitz described historical re-enactment in its relationship to lived heritage. (Horwitz) His participant-journalist reportage begins with a chance encounter with a group of “hard-core” Confederate re-enactors. Their conversation leads Horwitz on a year-long voyage through the American South. A featured character in Confederates in the Attic is the re-enactor Robert Lee Hodge, a waiter turned Confederate officer. He took Horwitz under his wing and provided basic training in re-enactment. Hodge even became a minor celebrity due to his role in the book.

Hodge teaches Horwitz the difference between hard-core and farby (i.e., more casual) re-enactment. He tells Horwitz about dieting to look sufficiently gaunt and malnourished, the basics of “bloating” to resemble a corpse on the battlefield, what to wear, what not to wear, what to eat, what not to eat, and so on. It’s remarkable how little time he spends on martial basics. One moment sticks out for me. During the night after a hard day of campaigning, Horwitz finds himself in the authentic situation of being wet, cold and hungry. He lacks a blanket, so he is given basic instruction in the sleeping technique of the Confederate infantryman: “spooning.” According to the re-enactor Scott Cross, “Spooning is an old term for bundling up together in bed like spoons placed together in the silver chest.” (Horwitz) Lacking adequate bedding and exposed to the elements, soldiers bunched up to keep warm. So that’s what Horwitz does, not as an act of mimesis or performance per se, but in order to re-experience the reality of Civil War infantrymen.

It interested me that of all the re-enactment activities Horwitz put himself through, spooning reveals a deeper commitment to authenticity than any of the combat performances he describes. It’s uncomfortable and awkward, so it requires dedication and persistence. Sleep becomes self-conscious, not just in order to stick with the activity, but because the point of it is to recapture a past experience of sleeping on the battlefield. Since re-enacting a battle requires far more participants than sleeping does, more farbs (the less dedicated re-enactors) show up, and thus the general level of engagement declines. During staged battles, spectators, scripting, confusion and accidents all interfere with the experience. Immersion breaks whenever dead soldiers pop up on the command, “resurrect.” In other words, performance takes primacy over the effort to re-experience. It is likely that many farbs dressed up for battle are content to find a hotel to sleep in.

Specific attention to the details of daily life might be a reflection of recent historical work that emphasizes social and cultural histories of the Civil War period, rather than combat histories. But that’s not my takeaway from the spooning re-enactors. Rather, it’s the standard of authenticity that goes beyond performance of a specific event (such as a battle) to include life experience as a whole. Horwitz recalled that,

Between gulps of coffee—which the men insisted on drinking from their own tin cups rather than our ceramic mugs—Cool and his comrades explained the distinction. Hardcores didn’t just dress up and shoot blanks. They sought absolute fidelity to the 1860s: its homespun clothing, antique speech patterns, sparse diet and simple utensils. Adhered to properly, this fundamentalism produced a time travel high, or what hardcores called a ‘period rush.’ (Horwitz, Loc. 153-157)

Stephen Gapps, an Australian curator, historian, and re-enactor, has spoken of the “extraordinary lengths” re-enactors go to “acquire and animate the look and feel of history.” Hard-core is not just about marching, shooting and swordplay. I wonder what a “period rush” might be for the experience of playing Pitfall! in the mid-21st century. Shag rugs? Ambient New Wave radio? Caffeine-free cola? Will future re-enactors of historical software seek this level of experiential fidelity? Gapps, again: “Although reenactors invoke the standard of authenticity, they also understand that it is elusive – worth striving for, but never really attainable.” (Gapps 397)

Re-enactment offers a take on born-digital heritage that proposes a commitment to lived experience. I see some similarity here with the correspondence to lived historical experience in White’s striving for a discursive mimesis. Yet, like media archaeology, re-enactment puts performance above discourse, though it is the performance of humans rather than machines.

Playing Pitfalls

We now have three different ways to think about potential uses of historical software and born digital documentation. I will shift my historian’s hat to one side of my head now and slide up my curator’s cap. If we consider these takes as use cases, do they help us decide how to allocate resources to acquire, preserve, describe and provide access to digital collections?

In May 2013, the National Digital Information Infrastructure and Preservation Program (NDIIPP) of the U.S. Library of Congress (henceforth: LC) held a conference called Preserving.exe. The agenda was to articulate the “problems and opportunities of software preservation.” In my contribution to the LC conference report issued a few months later, I described three “lures of software preservation.” (Lowood) These are potential pitfalls as we move from software collections to digital repositories and from there to programs of access to software collections. The second half of this paper will be an attempt to introduce the three lures of software preservation to the three takes on historical enactment.

  1. The Lure of the Screen

Let’s begin with the Lure of the Screen. This is the idea that what counts in digital media is what is delivered to the screen. This lure pops up in software preservation when we evaluate significant properties of software as surface properties (graphics, audio, haptics, etc).

This lure of the screen is related to what media studies scholars such as Nick Montfort, Mark Sample and Matt Kirschenbaum have dubbed (in various but related contexts) “screen essentialism.” If the significant properties of software are all surface properties, then our perception of interaction with software tells us all we need to know. We check graphics, audio, responses to our use of controllers, etc., and if they look and act as they should, we have succeeded in preserving an executable version of historical software. These properties are arguably the properties that designers consider as the focus of user interaction and they are the easiest to inspect and verify directly.

The second Preserving Virtual Worlds project was concerned primarily with identifying significant properties of interactive game software. On the basis of several case sets and interviews with developers and other stakeholders, we concluded that isolating surface properties such as image colourspace, while significant for other media such as static images, is not a particularly useful approach for game software. With interactive software, significance appears to be variable and contextual, as one would expect from a medium in which content is expressed through a mixture of design and play, procedurality and emergence. Especially important is the fact that software abstraction levels are not “visible” on the surface of play: it is difficult, if not impossible, to monitor procedural aspects of game design and mechanics, programming and technology by inspecting properties expressed on the screen.

The preservation lifecycle for software is likely to include data migration. Access to migrated software will probably occur through emulation. How do we know when our experience of this software is affected by these practices? One answer is that we audit significant properties, and as we now know, it will be difficult to predict which characteristics are significant. An alternative or companion approach for auditing the operation of historical software is to verify the execution of data files. The integrity of the software can be evaluated by comparison to documented disk images or file signatures such as hashes or checksums. However, when data migration or delivery environments change the software or its execution environment, this method is inadequate. We must evaluate software performance. Instead of asking whether the software “looks right,” we can check if it runs verified data-sets that meet the specifications of the original software. Examples range from word processing documents to saved game and replay files. Of course, visual inspection of the content plays a role in verifying execution by the software engine; failure will not always be clearly indicated by crashes or error messages. Eliminating screen essentialism does not erase surface properties altogether.
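
To illustrate the fixity side of this audit, here is a minimal sketch assuming the repository keeps a manifest of documented SHA-256 checksums alongside its preserved disk images; the manifest file name and format are hypothetical.

    import hashlib
    import json
    from pathlib import Path

    # Hypothetical manifest mapping preserved image names to documented digests,
    # e.g. {"game_v1.0.img": "ab12...", "wordproc_disk1.img": "cd34..."}
    MANIFEST = Path("fixity_manifest.json")

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 digest of a file, read in 1 MB chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def audit_fixity(manifest_path: Path) -> dict:
        """Compare each preserved image against its documented checksum."""
        documented = json.loads(manifest_path.read_text())
        return {
            name: Path(name).exists() and sha256_of(Path(name)) == expected
            for name, expected in documented.items()
        }

    if __name__ == "__main__":
        for name, ok in audit_fixity(MANIFEST).items():
            print(f"{name}: {'verified' if ok else 'FAILED'}")

A passing fixity audit only shows that the bits are unchanged; as argued above, once migration or a new delivery environment enters the picture, verification has to move from checksums to the execution of verified data-sets.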

The three takes compel us to think about the screen problem in different ways. First, the Historian is not troubled by screen essentialism. His construction of a narrative mimesis invokes a selection of source materials that may or may not involve close reading of personal gameplay, let alone focus on surface properties. On the other hand, the Re-enactor’s use of software might lead repositories to fret about what the user sees, hears and feels. It makes sense with this use case to think about the re-enactment as occurring at the interface. If a repository aims to deliver a re-enacted screen experience, it will need to delve deeply into questions of significant properties and their preservation.

Screen essentialism is also a potential problem for repositories that follow the path of Media Archaeology. It is unclear to me how a research site like the MAF would respond to digital preservation practices based on data migration and emulation. Can repositories meet the requirements of media archaeologists without making a commitment to preservation of working historical hardware to enable playback from original media? It’s not just that correspondence to surface characteristics is a significant property for media archaeologists. Nor is the Lure of the Screen a criticism of Media Archaeology. I propose instead that it is a research problem. Ernst’s vision of a Multimedia Archive is based on the idea that media archaeology moves beyond playback to reveal mechanisms of counting. This machine operation clearly is not a surface characteristic. Ernst would argue, I think, that this counting is missed by an account of what is seen on the screen. So let’s assign the task of accounting for counting to the Media Archaeologist, which means showing us how abstraction layers in software below the surface can be revealed, audited and studied.

  2. The Lure of the Authentic Experience

I have already said quite a bit about authenticity. Let me explain now why I am sceptical about an authentic experience of historical software, and why this is an important problem for software collections.

Everyone in game or software studies knows about emulation. Emulation projects struggle to recreate an authentic experience of operating a piece of software such as playing a game. Authenticity here means that the use experience today is like it was. The Lure of the Authentic Experience tells digital repositories at minimum not to preserve software in a manner that would interfere with the production of such experiences. At maximum, repositories deliver authentic experiences, whether on-site or on-line. A tall order. In the minimum case, the repository provides the software and collects documentation of hardware specifications, drivers and support programs. Researchers use this documentation to reconstruct the historical look-and-feel of software to which they have access. In the maximum case, the repository designs and builds access environments. Using the software authentically would then probably mean a trip to the library or museum with historical or bespoke hardware. The reading room becomes the site of the experience.

I am not happy to debunk the Authentic Experience. Authenticity is a concept fraught not just with intellectual issues, but with registers ranging from nostalgia and fandom to immersion and fun. It is a minefield. The first problem is perhaps an academic point, but nonetheless important: Authenticity is always constructed. Whose lived experience counts as “authentic” and how has it been documented? Is the best source a developer’s design notes? The memory of someone who used the software when it was released? A marketing video? The researcher’s self-reflexive use in a library or museum? If a game was designed for kids in 1985, do you have to find a kid to play it in 2050? In the case of software with a long history, such as Breakout or Microsoft Word, how do we account for the fact that the software was used on a variety of platforms – do repositories have to account for all of them? For example, does the playing of DOOM “death match” require peer-to-peer networking on a local area network, a mouse-and-keyboard control configuration and a CRT display? There are documented cases of different configurations of hardware: trackballs, hacks that enabled multiplayer via TCP/IP, monitors of various shapes and sizes, and so on. Which differences matter?

A second problem is that the Authentic Experience is not always that useful to the researcher, especially the researcher studying how historical software executes under the hood. The emulated version of a software program often compensates for its lack of authenticity by offering real-time information about system states and code execution. A trade-off for losing authenticity thus occurs when the emulator shows the underlying machine operation, the counting, if you will. What questions will historians of technology, practitioners of code studies or game scholars ask about historical software? I suspect that many researchers will be as interested in how the software works as in a personal experience deemed authentic. As for more casual appreciation, the Guggenheim’s Seeing Double exhibition and Margaret Hedstrom’s studies of emulation suggest that exhibition visitors actually prefer reworked or updated experiences of historical software. (Hedstrom, Lee, et al.; Jones)

This is not to say that original artefacts – both physical and “virtual” – will not be a necessary part of the research process. Access to original technology provides evidence regarding its constraints and affordances. I put this to you not as a “one size fits all” decision but as an area of institutional choice based on objectives and resources.

The Re-enactor, of course, is deeply committed to the Authentic Experience. If all we offer is emulation, what do we say to him, besides “sorry”? Few digital repositories will be preoccupied with delivering authentic experiences as part of their core activity. The majority are likely to consider a better use of limited resources to be ensuring that validated software artefacts and contextual information are available on a case-by-case basis to researchers who do the work of re-enactment. Re-enactors will make use of documentation. Horwitz credits Robert Lee Hodge with an enormous amount of research time spent at the National Archives and Library of Congress. Many hours of research with photographs and documents stand behind his re-enactments. In short, repositories should let re-enactors be the re-enactors.

Consider this scenario for software re-enactment. You are playing an Atari VCS game with the open-source Stella emulator. It bothers you that viewing the game on your LCD display differs from the experience with a 1980s-era television set. You are motivated by this realization to contribute code to the Stella project for emulating a historical display. It is theoretically possible that you could assemble everything needed to create an experience that satisfies you – an old television, adapters, an original VCS, the software, etc. (Let’s not worry about the shag rug and the lava lamp.) You can create this personal experience on your own, then write code that matches it. My question: Is the result less “authentic” if you relied on historical documentation such as video, screenshots, technical specifications, and other evidence available in a repository to describe the original experience? My point is that repositories can cooperatively support research by re-enactors who create their version of the experience. Digital repositories should consider the Authentic Experience as more of a research problem than a repository problem.
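
As a purely hypothetical illustration of what such display re-enactment involves at its crudest, the sketch below dims alternate rows of an RGB frame buffer to mimic visible scanlines. It is not Stella’s actual TV-effects code; the frame data and darkening factor are invented for the example.

    def scanline_filter(frame, darken=0.55):
        """Return a copy of an RGB frame with alternate rows dimmed.

        A crude approximation of the visible scanlines of a 1980s CRT;
        real display emulation would also model phosphor glow, colour
        bleed and the composite video signal itself.
        """
        out = []
        for y, row in enumerate(frame):
            if y % 2 == 1:  # dim every other line
                out.append([tuple(int(c * darken) for c in px) for px in row])
            else:
                out.append([tuple(px) for px in row])
        return out

    # Tiny 4x2 test frame of solid orange pixels (hypothetical data).
    frame = [[(255, 128, 0)] * 4 for _ in range(2)]
    filtered = scanline_filter(frame)
    print(filtered[0][0], filtered[1][0])  # row 0 unchanged, row 1 dimmed

Going further, toward phosphor decay or the composite signal, is exactly where documented evidence of the original experience becomes indispensable to the re-enactor.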

  3. The Lure of the Executable

The Lure of the Executable evaluates software preservation in terms of success at building collections of software that can be executed on-demand by researchers.

Why do we collect historical software? Of course, the reason is that computers, software, and digital data have had a profound impact on virtually every aspect of recent history. What should we collect? David Bearman’s answer in 1987 was the “software archive.” He distinguished this archive from what I will call the software library. The archive assembles documentation; the library provides historical software. The archive was a popular choice in the early days. Margaret Hedstrom reported that attendees at the 1990 Arden Conference on the Preservation of Microcomputer Software “debated whether it was necessary to preserve software itself in order to provide a sense of ‘touch and feel’ or whether the history of software development could be documented with more traditional records.” (Hedstrom and Bearman) In 2002, the Smithsonian’s David Allison wrote about collecting historical software in museums that, “supporting materials are often more valuable for historical study than code itself. They provide contextual information that is critical to evaluating the historical significance of the software products.” He concluded that operating software is not a high priority for historical museums. (Allison 263-65; cf. Shustek)

Again, institutional resources are not as limitless as the things we would like to do with software. Curators must prioritize among collections and services. The choice between software archive and library is not strictly binary, but choices still must be made.

I spend quite a bit of my professional life in software preservation projects. The end-product of these projects is at least in part the library of executable historical software. I understand the Lure of the Executable and the reasons that compel digital repositories to build collections of verified historical software that can be executed on-demand by researchers. This is the Holy Grail of digital curation with respect to software history. What could possibly be wrong with this mission, if it can be executed? As I have argued on other occasions, there are several problems to consider. Let me give you two. The first is that software does not tell the user very much about how it has previously been used. In the best case, application software in its original use environment might display a record of files created by previous users, such as a list of recently opened files found in many productivity titles like Microsoft Office. The more typical situation is that software is freshly installed from data files in the repository and thus completely lacks information about its biography, for want of a better term.

The second, related problem is fundamental. Documentation that is a prerequisite for historical studies of software is rarely located in software. It is more accurate to say that this documentation surrounds software in development archives (including source code) and records of use and reception. It is important to understand that this is not just a problem for historical research. Documentation is also a problem for repositories. If contextual information such as software dependencies or descriptions of relationships among objects is not available to the repository, and all the retired software engineers who knew the software inside and out are gone, it may be impossible to get old software to run.
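
To make “contextual information” tangible, here is a minimal sketch of the kind of dependency and relationship record a repository might keep alongside a preserved package; the field names and values are invented for illustration and do not follow any particular metadata standard.

    import json

    # Hypothetical contextual record for a preserved software package.
    record = {
        "title": "Example word processor",          # not a real catalogue entry
        "version": "2.0",
        "platform": "MS-DOS 3.3",                   # execution environment it expects
        "dependencies": ["8086-compatible CPU", "CGA or Hercules display driver"],
        "related_objects": {
            "source_code": "accession 1987-042, box 3",
            "user_manual": "accession 1987-042, box 7",
            "sample_documents": ["letter_1988.doc"],
        },
        "rights": "donor agreement on file; on-site research use only",
    }

    print(json.dumps(record, indent=2))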

Historians, of course, will usually be satisfied with the Archive. Given limited resources, is it reasonable to expect that the institutions responsible for historical collections of documentation will be able to reconcile such traditional uses with other methods of understanding historical computing systems? The Re-enactor will want to run software, and the Media Archaeologist will not just want access to a software library, but to original media and hardware in working order. These are tall orders for institutional repositories such as libraries and archives, though possibly a better fit to the museum or digital history centre.

In Best Before: Videogames, Supersession and Obsolescence, James Newman is not optimistic about software preservation and he describes how the marketing of software has in some ways made this a near impossibility. He is not as pessimistic about video game history, however. In a section of his book provocatively called “Let Videogames Die,” he argues that a documentary approach to gameplay might be a more pragmatic enterprise than the effort to preserve playable games. He sees this as a “shift away from conceiving of play as the outcome of preservation to a position that acknowledges play as an indivisible part of the object of preservation.” (Newman 160) In other words, what happens when we record contemporary use of software to create historical documentation of that use? Does this activity potentially reduce the need for services that provide for use at any given time in the future? This strikes me as a plausible historical use case, but not one for re-enactment or media archaeology.

Software archives or software libraries? That is the question. Is it nobler to collect documentation or to suffer the slings and arrows of outrageous software installations? The case for documentation is strong. The consensus among library and museum curators (including myself) is almost certainly that documents from source code to screenshots are a clear win for historical studies of software. Historians, however, will not be the only visitors to the archive. But there are other reasons to collect documentation. One of the most important reasons, which I briefly noted above, is that software preservation requires such documentation. In other words, successful software preservation activities are dependent upon technical, contextual and rights documentation. And of course, documents tell re-enactors how software was used and can help media archaeologists figure out what their machines are showing or telling them. But does documentation replace the software library? Is it sufficient to build archives of software history without libraries of historical software? As we have seen, this question was raised nearly forty years ago and remains relevant today. My wish is that this question of the relationship between documentation and software as key components of digital heritage work stir conversation among librarians, historians, archivists and museum curators. This conversation must consider that there is likely to be a broad palette of use cases such as the historian, media archaeologist and re-enactor, as well as many others not mentioned here. It is unlikely that any one institution can respond to every one of these use cases. Instead, the more likely result is a network of participating repositories, each of which will define priorities and allocate resources according to both their specific institutional contexts and an informed understanding of the capabilities of partner institutions.

 

References

Allison, David K. “Preserving Software in History Museums: A Material Culture Approach.” History of Computing: Software Issues. Ed. Ulf Hashagen, Reinhard Keil-Slawik and Arthur L. Norberg. Berlin: Springer, 2002. 263-272.

Bearman, David. Collecting Software: A New Challenge for Archives and Museums. Archival Informatics Technical Report #2 (Spring 1987).

— “What Are/Is Informatics? And Especially, What/Who is Archives & Museum Informatics?” Archival Informatics Newsletter 1:1 (Spring 1987): 8.

Cross, Scott. “The Art of Spooning.” Atlantic Guard Soldiers’ Aid Society. 13 July 2016. Web. http://www.agsas.org/howto/outdoor/art_of_spooning.shtml. Originally published in The Company Wag 2, no. 1 (April 1989).

Ernst, Wolfgang. Digital Memory and the Archive. Minneapolis: Univ. Minnesota Press, 2012. Kindle edition.

Gapps, Stephen. “Mobile monuments: A view of historical reenactment and authenticity from inside the costume cupboard of history.” Rethinking History: The Journal of Theory and Practice, 13:3 (2009): 395-409.

Hedstrom, Margaret L., Christopher A. Lee, Judith S. Olson and Clifford A. Lampe, “‘The Old Version Flickers More’: Digital Preservation from the User’s Perspective.” The American Archivist, 69: 1 (Spring – Summer 2006): 159-187.

Hedstrom, Margaret L., and David Bearman, “Preservation of Microcomputer Software: A Symposium,” Archives and Museum Informatics 4:1 (Spring 1990): 10.

Horwitz, Tony. Confederates in the Attic: Dispatches from the Unfinished Civil War. New York: Pantheon Books, 1998. Kindle Edition.

Jones, Caitlin. “Seeing Double: Emulation in Theory and Practice. The Erl King Study.” Paper presented to the Electronic Media Group, 14 June 2004. Electronic Media Group. Web. http://cool.conservation-us.org/coolaic/sg/emg/library/pdf/jones/Jones-EMG2004.pdf

Lowood, Henry. “The Lures of Software Preservation.” Preserving.exe: Toward a National Strategy for Software Preservation (October 2013): 4-11. Web. http://www.digitalpreservation.gov/multimedia/documents/PreservingEXE_report_final101813.pdf

Media Archaeological Fundus. Web. 21 Jan. 2016. http://www.medienwissenschaft.hu-berlin.de/medientheorien/fundus/media-archaeological-fundus

Newman, James. Best Before: Videogames, Supersession and Obsolescence. London: Routledge, 2012.

Owens, Trevor. “Archives, Materiality and the ‘Agency of the Machine’: An Interview with Wolfgang Ernst.” The Signal: Digital Preservation. Web. 8 February 2013. http://blogs.loc.gov/digitalpreservation/2013/02/archives-materiality-and-agency-of-the-machine-an-interview-with-wolfgang-ernst/

“Preserving Computer-Related Source Materials.” IEEE Annals of the History of Computing 1 (Jan.-March 1980): 4-6.

Shustek, Len. “What Should We Collect to Preserve the History of Software?” IEEE Annals of the History of Computing, 28 (Oct.-Dec. 2006): 110-12.

Walker, John. “Introduction.” The Analytical Engine: The First Computer. Fourmilab, 21 March 2016. Web. http://www.fourmilab.ch/babbage/

— “The Analytical Engine: Is the Emulator Authentic?” Fourmilab, 21 March 2016. Web. http://www.fourmilab.ch/babbage/authentic.html

White, Hayden. The Content of the Form: Narrative Discourse and Historical Representation. Baltimore: Johns Hopkins Univ. Press, 1987.

— Figural Realism: Studies in the Mimesis Effect. Baltimore: Johns Hopkins Univ. Press, 2000.

— “The Question of Narrative in Contemporary Historical Theory.” History and Theory 23:1 (Feb. 1984): 1-33.

 

Bio

Henry Lowood is Curator for History of Science & Technology Collections and for Film & Media Collections at Stanford University. He has led the How They Got Game project at Stanford University since 2000 and is the co-editor of The Machinima Reader and Debugging Game History, both published by MIT Press. Contact: lowood@stanford.edu