Born Digital Cultural Heritage – Angela Ndalianis & Melanie Swalwell

The collection and preservation of the ‘born digital’ has, in recent years, become a growing and significant area of debate. The honeymoon years are over, and institutions are finally beginning to give serious consideration to best practice for digital preservation strategies and the establishment of digital collections. Digital technology emerges and disappears with incredible speed, as a once-new piece of hardware or software becomes old and is replaced by the next technological advancement. What happens to the videogame software and hardware of the 1980s and 90s? To web browsers, blogs and social media sites and the content they once displayed? To the artworks that relied on pre-2000 computers? Are these – amongst many other – digital creations fated to be abandoned, becoming only memories of individual experience? Are they to be collected by institutions as defunct objects? Or are they to be preserved and revived using new digital technology? These are but a few of the serious questions facing collecting institutions. The question of who is responsible for collecting, preserving and historicising born digital cultural heritage is a crucial one, as is the issue of best practice – what are the best ways to preserve and make accessible such born digital heritage?

In June 2014, our “Play It Again”[1] project team ran an international conference on “The Born Digital and Cultural Heritage” that aimed to convene a forum where some of these issues could be discussed. “Play It Again” was a three-year project focused on the history and preservation of microcomputer games written in 1980s Australia and New Zealand, but as the first digital preservation project to be funded as research in this part of the world (at least to our knowledge), it also had a broader significance. We tried to use it to raise awareness around some of the threats facing born digital cultural production more broadly, beyond 1980s digital games. Two of the project’s aims were to “Enhance appreciation for the creations of the early digital period” and “To build capacity in both the academic and cultural sectors in the area of digital cultural heritage and the ‘born digital’”, both critical issues internationally. Held over two days at the Australian Centre for the Moving Image, Melbourne, the conference thus had a remit deliberately wider than the focus of the Australian Research Council Linkage Project.

The need for cooperation between different stakeholders – legislative bodies, professionals working in different types of institutions, and the private sector – was a key recommendation of the 2012 “Vancouver Declaration,” a Memory of the World initiative (UNESCO). Born digital artefacts often require multiple sets of expertise, so our call for papers invited proposals from researchers and practitioners in a range of disciplines, spheres of practice and institutional contexts concerned with born digital heritage. This included libraries, archives, museums, galleries, moving image institutions, software repositories, universities, and more besides. We wanted to create a space where communication between the different types of professionals dealing with the preservation of born digital cultural heritage could take place. Archivists, librarians, conservators, and moving image archivists share many challenges, yet, we suspect, they often attend profession-based conferences, which enforces a kind of silo-ing of knowledge. Particularly in small countries such as Australia and New Zealand, there is a need for conversations to take place across professional boundaries, and so we sought to bring people who perhaps don’t normally move in the same circles into contact.

The presentations during the conference ranged in approach from theoretical, to practical, to policy-oriented. We gloried in the range of papers that were presented. There were game histories, reflections on the demoscene, on net.art and other forms of media art, on born digital manuscripts, robots, twitter accounts and website archiving. As well as papers addressing different forms of heritage materials, there were technical reports on the problems with hacking and patching disk images to get them to emulate, on software migration, and legal papers on copyright protection and the ‘right to be forgotten’. (Audio of many of the presentations is available here.) The variety of presentations made painfully visible the enormous task at hand in addressing born digital cultural heritage.

While Refractory focuses on entertainment media, in this issue we recognise that born digital entertainment media share many of the challenges of non-entertainment objects. Here, we have collected article versions of selected papers from the conference. The topics and subjects are varied – from those looking more broadly at approaches to born digital heritage and the preservation of digital art, to the documentation of and public discourse about early game histories, and to future creative writing practice facilitated through the collection of digital manuscripts.

In his paper “It Is What It Is, Not What It Was: Making Born Digital Heritage” (originally a keynote address), Henry Lowood examines the preservation and collection of digital media in the context of cultural heritage. Lowood is concerned with “the relationship between collections of historical software and archival documentation about that software” and poses the question “Who is interested in historical software and what will they do with it?” He argues that “answers to this fundamental question must continue to drive projects in digital preservation and software history”. Using the examples of ‘The Historian’, ‘The Media Archaeologist’ and ‘The Re-enactor’, his paper raises important questions about the function, purpose and varied approaches to the digital archive. The historian, he states, is interested in the digital archival material in order to interpret, reconstruct and retell its story in history. For the media archaeologist, “media machines are transparent in their operation” and, rather than requiring interpretation, speak of their pastness by making possible the playback of “historical media on historical machines”. Finally, for ‘The Re-enactor’, ‘authenticity’ is a crucial factor for digital preservation; the question of authenticity, however, is fraught with debate. At one extreme, the re-enactor insists on a “fidelity of play” that engages with the technology (hardware and software) in its original state; at the other is the re-enactor who is willing to forgo the historical machine in favour of emulation and virtualisation that recreate an embodied experience of ‘playing’ with the original software, whether a game or a word-processing program. In either case, as Lowood explains, “Re-enactment offers a take on born-digital heritage that proposes a commitment to lived experience.”

In their article “Defining The Experience: George Poonkhin Khut’s Distillery: Waveforming, 2012”, Amanda Pagliarino and artist George Poonkhin Khut present an account of Khut’s sensory artwork Distillery: Waveforming 2012, which uses the prototype iPad application ‘BrightHearts’ and was acquired by the Queensland Art Gallery. The Curator of Contemporary Australian Art requested that the acquisition be “captured in perpetuity in its prototype state”. The authors explain that this biofeedback artwork is ‘iterative’: Khut continued to develop the work in later iterations, including updates to the BrightHearts app for touch screen devices. This article describes the development of the artwork, the issues that were addressed in its acquisition and archiving, and the consultations that took place between the artist and the collecting institution. As the writers argue, “to secure the commitment of the artist to engage in collaborative, long-term conservation strategies is extraordinary and this has resulted in the Gallery acquiring an unparalleled archival resource” that includes documentation and description of the interactive principles and behaviour of the artwork in its early state and as it evolved in Khut’s art practice. This archival resource will make it possible for the work to be reinterpreted “at some point in the future when the original technology no longer functions as intended”. In this respect, Distillery: Waveforming is understood as a “legacy artwork intrinsically linked to past and future iterations” of Khut’s larger Biofeedback Project.

The next article, “There and Back Again: A Case History of Writing The Hobbit” by Veronika Megler, focuses on the iconic text adventure game The Hobbit (Melbourne House, 1981), which Megler co-wrote during the final year of her Bachelor of Science degree at Melbourne University. The paper is a case history of the development of The Hobbit (based on J.R.R. Tolkien’s novel of the same name) into a game that could run on the first generation of home computers, which were just beginning to hit the market. Little has been written about the development of the first generation of text-based computer games; this case history provides insight into this period in computer game history. Megler describes the development process, the internal design, and the genesis of the ideas that made The Hobbit unique. She compares the development environment and the resulting game to the state of the art in text adventure games of the time, and wraps up by discussing the game’s legacy and the recent revival of interest in the game.

Jaakko Suominen and Anna Sivula’s article “Participatory Historians in Digital Cultural Heritage Process — Monumentalization of the First Finnish Commercial Computer Game” continues with games, analysing how digital games become cultural heritage. Using examples of changing conceptualisations of the first commercial Finnish computer game, the article examines the amateur and professional historicisation of computer games. The authors argue that the production of cultural heritage is a process of constructing symbolic monuments that are often related to events of change or the beginning of a progressive series of events, and the article presents an account of the formation of games as symbolic cultural monuments within a Finnish context. Whilst many researchers and journalists have claimed that Raharuhtinas (Money Prince, 1984) for the Commodore 64 was the first Finnish commercial digital game, its status as such is controversial. As the authors explain, “in this paper, we are more interested in public discourse of being the first” and how this relates to the cultural heritage process. The case of the ‘first’ game, it is argued, illuminates how items are selected as building material for digital game cultural heritage.

In “Retaining Traces of Composition in Digital Manuscript Collections: a Case for Institutional Proactivity”, Millicent Weber turns to digital manuscripts, their collection, preservation and digital storage by collecting institutions. Weber argues that libraries, archives and scholars have not addressed the content of future digital or part-digital collections, or their capacity to support sustained scholarly research. This paper examines the potential content of future collections of poetry manuscripts and their capacity to support research into the process of composition. To predict this capacity, the article compares a study of compositional process, using handwritten and typewritten manuscripts, with a small-scale survey of early-career poets’ compositional habits. The draft manuscripts of three poems by the poet Alan Gould and three by the poet Chris Mansell are used to describe each poet’s compositional habits, while the survey component of the project obtained information about the drafting practices of 12 students of creative writing and poetry at the University of Canberra. Weber concludes that the results indicate both the great diversity of manuscript collections currently being created, and the importance of archival institutions adopting an active advocacy role in encouraging writers to create and maintain comprehensive and well-organised collections of digital manuscripts.

The collection and preservation of born digital cultural heritage is of critical importance. In the digital era, “Heritage refers to legacy from the past, what we live with today, and what should be passed from generation to generation because of its significance and value” (UNESCO/PERSIST Content Task Force 16). If we want to ensure that records and works from this era persist, we will need to substantially ramp up our efforts. Cooperation between different stakeholders is critical and the research sector has an important role to play, in undertaking collaborative research with cultural institutions to tackle some of the thornier challenges surrounding the persistence of born digital cultural heritage.

Works cited

UNESCO. “UNESCO/UBC Vancouver Declaration, The Memory of the World in the Digital Age: Digitization and Preservation.” N.p., 2012. Web. 17 Dec. 2012.

UNESCO/PERSIST Content Task Force. “The UNESCO/PERSIST Guidelines for the Selection of Digital Heritage for Long-Term Preservation.” 2016. Web.

 

[1] The “Play It Again” project received support under the Australian Research Council’s Linkage Projects funding Scheme (project number LP120100218). See our research blog and the “Popular Memory Archive” for more information on the project.

 

Bios

Associate Professor Melanie Swalwell is a scholar of digital media arts, cultures, and histories. She is the recipient of an ARC Future Fellowship for her project “Creative Micro-computing in Australia, 1976-1992”. Between 2011 and 2015, she was Project Leader and Chief Investigator on the ARC Linkage Project “Play It Again”. In 2009, Melanie was the Nancy Keesing Fellow (State Library of New South Wales). She has authored chapters and articles in both traditional and interactive formats, in such esteemed journals as Convergence, Vectors, and the Journal of Visual Culture. Melanie’s projects include:

  • “Creative Micro-computing in Australia, 1976-1992”. Watch the film here.
  • Australasian Digital Heritage, which gathers together several local digital heritage research projects. Follow us on Facebook & Twitter @ourdigiheritage
  • “Play It Again: Creating a Playable History of Australasian Digital Games, for Industry, Community and Research Purposes”, ARC Linkage, 2012-14. Follow us on Facebook & Twitter @AgainPlay, and visit the Popular Memory Archive.

 

Angela Ndalianis is Professor in Screen Studies at Melbourne University, and the Director of the Transformative Technologies Research Unit (Faculty of Arts). Her research interests include: genre studies, with expertise in the horror and science fiction genres; entertainment media and media histories; and the contemporary entertainment industry. Her publications include Neo-Baroque Aesthetics and Contemporary Entertainment (MIT Press 2004), Science Fiction Experiences (New Academia 2010), The Horror Sensorium: Media and the Senses (McFarland 2012) and The Contemporary Comic Book Superhero (editor, Routledge 2008). She is currently completing two books: Batman: Myth and Superhero and Robots and Entertainment Culture. She is also a Fellow of the Futures of Entertainment Network (U.S.), and is the Hans Christian Andersen Academy’s Visiting Professor (2015-17), a position also affiliated with the University of Southern Denmark.

It Is What It Is, Not What It Was – Henry Lowood

Abstract: The preservation of digital media in the context of heritage work is both seductive and daunting. The potential replication of human experiences afforded by computation and realised in virtual environments is the seductive part. The work involved in realising this potential is the daunting side of digital collection, curation, and preservation. In this lecture, I will consider two questions. First, is the lure of perfect capture of data, or the reconstruction of “authentic” experiences of historical software, an attainable goal? Second, if not, how might reconsidering the project as moments of enacting rather than re-enacting provide a different impetus for making born digital heritage?

Keynote address originally delivered at the Born Digital and Cultural Heritage Conference, Melbourne, 19 June 2014

Let’s begin with a question. When did libraries, archives, and museums begin to think about software history collections? The answer: In the late 1970s. The Charles Babbage Institute (CBI) and the History of Computing Committee of the American Federation of Information Processing Societies (AFIPS), soon to be a sponsor of CBI, were both founded in 1978. The AFIPS committee produced a brochure called “Preserving Computer-Related Source Materials.” Distributed at the National Computer Conference in 1979, it is the earliest statement I have found about preserving software history. It says,

If we are to fully understand the process of computer and computing developments as well as the end results, it is imperative that the following material be preserved: correspondence; working papers; unpublished reports; obsolete manuals; key program listings used to debug and improve important software; hardware and componentry engineering drawings; financial records; and associated documents and artifacts. (“Preserving …” 4)

Mostly paper records. The recommendations say nothing about data files or executable software, only nodding to the museum value of hardware artefacts for “esthetic and sentimental value.” The brochure says that artefacts provide “a true picture of the mind of the past, in the same way as the furnishings of a preserved or restored house provides a picture of past society.” One year later, CBI received its first significant donation of books and archival documents from George Glaser, a former president of AFIPS. Into the 1980s history of computing collections meant documentation: archival records, publications, ephemera and oral histories.

Software preservation trailed documentation and historical projects by a good two decades. The exception was David Bearman, who left the Smithsonian in 1986 to create a company called Archives & Museum Informatics (AHI). He began publishing the Archival Informatics Newsletter in 1987 (later called Archives & Museum Informatics). As one of its earliest projects, AHI drafted policies and procedures for a “Software Archives” at the Computer History Museum (CHM) then located in Boston. By the end of 1987, Bearman published the first important study of software archives under the title Collecting Software: A New Challenge for Archives & Museums. (Bearman, Collecting Software; see also Bearman, “What Are/Is Informatics?”)

In his report, Bearman alternated between frustration and inspiration. Based on a telephone survey of companies and institutions, he wrote that “the concept of collecting software for historical research purposes had not occurred to the archivists surveyed; perhaps, in part, because no one ever asks for such documentation!” (Bearman, Collecting Software 25-26.) He learned that nobody he surveyed was planning software archives. Undaunted, he produced a report that carefully considered software collecting as a multi-institutional endeavor, drafting collection policies and selection criteria, use cases, a rough “software thesaurus” to provide terms for organizing a software collection, and a variety of practices and staffing models. Should some institution accept the challenge, here were tools for the job.

Well, here we are, nearly thirty years later. We can say that software archives and digital repositories finally exist. We have made great progress in the last decade with respect to repository technology and collection development. Looking back to the efforts of the 1980s, one persistent issue raised as early as the AFIPS brochure in 1978 is the relationship between collections of historical software and archival documentation about that software. This is an important issue. Indeed, it is today, nearly forty years later, still one of the key decision points for any effort to build research collections aiming to preserve digital heritage or serve historians of software. Another topic that goes back to Bearman’s report is a statement of use cases for software history. Who is interested in historical software and what will they do with it? Answers to this fundamental question must continue to drive projects in digital preservation and software history.

As we consider the potential roles to be played by software collections in libraries and museums, we immediately encounter vexing questions about how researchers of the future will use ancient software. Consider that using historical software now in order to experience it in 2014 and running that software in 2014 to learn what it was like when people operated it thirty years ago are two completely different use cases. This will still be true in 2050. This may seem like an obvious point, but it is important to understand its implications. An analogy might help. I am not just talking about the difference between watching “Gone with the Wind” at home on DVD versus watching it in a vintage movie house in a 35mm print – with or without a live orchestra. Rather I mean the difference between my experience in a vintage movie house today – when I can find one – and the historical experience of, say, my grandfather during the 1930s. My experience is what it is, not what his was. So much of this essay will deal with the complicated problem of enacting a contemporary experience to re-enact a historical experience and what it has to do with software preservation. I will consider three takes on this problem: the historian’s, the media archaeologist’s, and the re-enactor’s.

Take 1. The Historian

Take one. The historian. Historians enact the past by writing about it. In other words, historians tell stories. This is hardly a revelation. Without meaning to trivialize the point, I cannot resist pointing out that “story” is right there in “hi-story” or that the words for story and history are identical in several languages, including French and German. The connections between story-telling and historical narrative have long been a major theme in writing about the methods of history, that is, historiography. In recent decades, this topic has been mightily influenced by the work of Hayden White, author of the much-discussed Metahistory: The Historical Imagination in Nineteenth-Century Europe, published in 1973.

White’s main point about historians is that History is less about subject matter and source material and more about how historians write.

He tells us that historians do not simply arrange events culled from sources in correct chronological order. Such arrangements White calls Annals or Chronicles. The authors of these texts merely compile lists of events. The work of the historian begins with the ordering of these events in a different way. White writes in The Content of the Form that in historical writing, “the events must be not only registered within the chronological framework of their original occurrence but narrated as well, that is to say, revealed as possessing a structure, an order of meaning, that they do not possess as mere sequence.” (White, Content of the Form 5) How do historians do this? They create narrative discourses out of sequential chronicles by making choices. These choices involve the form, effect and message of their stories. White puts choices about form, for example, into categories such as argument, ideology and emplotment. There is no need in this essay to review all of the details of every such choice. The important takeaway is that the result of these choices by historians is sense-making through the structure of story elements, use of literary tropes and emphasis placed on particular ideas. In a word, plots. White thus gives us the enactment of history as a form of narrative or emplotment that applies established literary forms such as comedy, satire, and epic.

In his book Figural Realism: Studies in the Mimesis Effect, White writes about the “events, persons, structures and processes of the past” that “it is not their pastness that makes them historical. They become historical only in the extent to which they are represented as subjects of a specifically historical kind of writing.” (White, Figural Realism 2.) It is easy to take away from these ideas that history is a kind of literature. Indeed, this is the most controversial interpretation of White’s historiography.

My purpose in bringing Hayden White to your attention is to insist that there is a place in game and software studies for this “historical kind of writing.” I mean writing that offers a narrative interpretation of something that happened in the past. Game history and software history need more historical writing that has a point beyond adding events to the chronicles of game development or putting down milestones of the history of the game industry. We are only just beginning to see good work that pushes game history forward into historical writing and produces ideas about how these historical narratives will contribute to allied works in fields such as the history of computing or the history of technology more generally.

Allow me one last point about Hayden White as a take on enactment. Clearly, history produces narratives that are human-made and human-readable. They involve assembling story elements and choosing forms. How then do such stories relate to actual historical events, people, and artifacts? Despite White’s fondness for literary tropes and plots, he insists that historical narrative is not about imaginary events. If historical methods are applied properly, the resulting narrative according to White is a “simulacrum.” He writes in his essay on “The Question of Narrative in Contemporary Historical Theory,” that history is a “mimesis of the story lived in some region of historical reality, and insofar as it is an accurate imitation, it is to be considered a truthful account thereof.” (White, “The Question of Narrative …” 3.) Let’s keep this idea of historical mimesis in mind as we move on to takes two and three.

Take 2. The Media Archaeologist

My second take is inspired by the German media archaeologist Wolfgang Ernst. As with Hayden White, my remarks will fall far short of a critical perspective on Ernst’s work. I am looking for what he says to me about historical software collections and the enactment of media history.

Hayden White put our attention on narrative; enacting the past is storytelling. Ernst explicitly opposes Media Archaeology to historical narrative. He agrees in Digital Memory and the Archive, that “Narrative is the medium of history.” By contrast, “the technological reproduction of the past … works without any human presence because evidence and authenticity are suddenly provided by the technological apparatus, no longer requiring a human witness and thus eliminating the irony (the insight into the relativity) of the subjective perspective.” (Ernst, Loc. 1053-1055.) Irony, it should be noted, is one of White’s favourite tropes for historical narrative.

White tells us that historical enactment is given to us as narrative mimesis, with its success given as the correspondence of history to some lived reality. Ernst counters by giving us enactment in the form of playback.

In an essay called “Telling versus Counting: A Media-Archaeological Point of View,” Ernst plays with the notion that, “To tell as a transitive verb means ‘to count things’.” The contrast with White here relates to the difference in the German words erzählen (narrate) and zählen (count), but you also find it in English: recount and count. Ernst describes historians as recounters: “Modern historians … are obliged not just to order data as in antiquaries but also to propose models of relations between them, to interpret plausible connections between events.” (Ernst, Loc. 2652-2653) In another essay, aptly subtitled “Method and Machine versus the History and Narrative of Media,” Ernst adds that mainstream histories of technology and mass media as well as their counter-histories are textual performances that follow “a chronological and narrative ordering of events.” He observes succinctly that, “It takes machines to temporarily liberate us from such limitations.” (Ernst, Loc. 1080-1084)

Where do we go with Ernst’s declaration in “Telling versus Counting,” that “There can be order without stories”? We go, of course, directly to the machines. For Ernst, media machines are transparent in their operation, an advantage denied to historians. We play back historical media on historical machines, and “all of a sudden, the historian’s desire to preserve the original sources of the past comes true at the sacrifice of the discursive.” We are in that moment directly in contact with the past.

In “Method and Machine”, Ernst offers the concept of “media irony” as a response to White’s trope of historical irony. He says,

Media irony (the awareness of the media as coproducers of cultural content, with the medium evidently part of the message) is a technological modification of Hayden White’s notion that “every discourse is always as much about discourse itself as it is about the objects that make up its subject matter.” (Ernst, Loc. 1029-1032)

As opposed to recounting, counting in Ernst’s view has to do with the encoding and decoding of signals by media machines. Naturally, humans created these machines. This might be considered as another irony, because humans have thereby “created a discontinuity with their own cultural regime.” We are in a realm that replaces narrative with playback as a form of direct access to a past defined by machine sequences rather than historical time. (Ernst, Loc. 1342-1343)

Ernst draws implications from media archaeology for his closely connected notion of the multimedia archive. In “Method and Machine,” he says, “With digital archives, there is, in principle, no more delay between memory and the present but rather the technical option of immediate feedback, turning all present data into archival entries and vice versa.” In “Telling versus Counting,” he portrays “a truly multimedia archive that stores images using an image-based method and sound in its own medium … And finally, for the first time in media history, one can archive a technological dispositive in its own medium.” (Ernst, Loc. 1745-1746; 2527-2529.) Not only is the enactment of history based on playback inherently non-discursive, but the very structure of historical knowledge is written by machines.

With this as background, we can turn to the concrete manifestation of Ernst’s ideas about the Multimedia Archive. This is the lab he has created in Berlin. The website for Ernst’s lab describes The Media Archaeological Fundus (MAF) as “a collection of various electromechanical and mechanical artefacts as they developed throughout time. Its aim is to provide a perspective that may inspire modern thinking about technology and media within its epistemological implications beyond bare historiography.” (Media Archaeological Fundus) Ernst explained the intention behind the MAF in an interview with Lori Emerson as deriving from the need to experience media “in performative ways.” So he created an assemblage of media and media technologies that could be operated, touched, manipulated and studied directly. He said in this interview, “such items need to be displayed in action to reveal their media essentiality (otherwise a medium like a TV set is nothing but a piece of furniture).” (Owens) Here is media archaeology’s indirect response to the 1979 AFIPS brochure’s suggestion that historical artifacts serve a purpose similar to furnishings in a preserved house.

The media-archaeological take on enacting history depends on access to artifacts and, in its strongest form, on their operation. Even when its engagement with media history is reduced to texts, these must be “tested against the material evidence.” This is the use case for Playback as an enactment of software history.

Take 3. The Re-enactor

Authenticity is an important concept for digital preservation. As in any archive, a key feature of a digital archive over the preservation life-cycle of its documents and software objects is auditing and verification of authenticity. Access also involves authenticity: any discussion of emulation or virtualization will bring up the question of fidelity to an historical experience of using software.

John Walker (of Autodesk and Virtual Reality fame) created a workshop called Fourmilab to work on personal projects such as an on-line museum “celebrating” Charles Babbage’s Analytical Engine. This computer programming heritage work includes historical documents and a Java-based emulator of the Engine. Walker says, “Since we’re fortunate enough to live in a world where Babbage’s dream has been belatedly realised, albeit in silicon rather than brass, we can not only read about The Analytical Engine but experience it for ourselves.” The authenticity of this experience – whatever that means for a machine that never existed – is important to Walker. In a 4500-word essay titled “Is the Emulator Authentic?,” he tells us that, “In order to be useful, an emulator program must be authentic—it must faithfully replicate the behaviour of the machine it is emulating.” By extension, the authenticity of a preserved version of the computer game DOOM in a digital repository could be audited by verifying that it can properly run a DOOM demo file. The same is true for Microsoft Word and a historical document in the Word format. This is a machine-centered notion of authenticity; we used it in the second Preserving Virtual Worlds project as a solution to the significant properties problem for software. (Walker, “Introduction;” Walker, “Analytical Engine.”)

All well and good. However, I want to address a different authenticity. Rather than judging authenticity in terms of playback, I would like to ask what authenticity means for the experience of using software. Another way of putting this question is to ask what we are looking for in the re-enactment of historical software use. So we need to think about historical re-enactment.

I am not a historical re-enactor, at least not the kind you are thinking of. I have never participated in the live recreation or performance of a historical event. Since I have been playing historical simulations – a category of boardgames – for most of my life, perhaps you could say that I re-enact being a historical military officer by staring at maps and moving units around on them. It’s not the same thing as wearing period uniforms and living the life, however.

Anyway, I need a re-enactor. In his 1998 book Confederates in the Attic, Tony Horwitz described historical re-enactment in its relationship to lived heritage. (Horwitz) His participant-journalist reportage begins with a chance encounter with a group of “hard-core” Confederate re-enactors. Their conversation leads Horwitz on a year-long voyage through the American South. A featured character in Confederates in the Attic is the re-enactor Robert Lee Hodge, a waiter turned Confederate officer. He took Horwitz under his wing and provided basic training in re-enactment. Hodge even became a minor celebrity due to his role in the book.

Hodge teaches Horwitz the difference between hard-core and farby (i.e., more casual) re-enactment. He tells Horwitz about dieting to look sufficiently gaunt and malnourished, the basics of “bloating” to resemble a corpse on the battlefield, what to wear, what not to wear, what to eat, what not to eat, and so on. It’s remarkable how little time he spends on martial basics. One moment sticks out for me. During the night after a hard day of campaigning, Horwitz finds himself in the authentic situation of being wet, cold and hungry. He lacks a blanket, so he is given basic instruction in the sleeping technique of the Confederate infantryman: “spooning.” According to the re-enactor Scott Cross, “Spooning is an old term for bundling up together in bed like spoons placed together in the silver chest.” (Horwitz) Lacking adequate bedding and exposed to the elements, soldiers bunched up to keep warm. So that’s what Horwitz does, not as an act of mimesis or performance per se, but in order to re-experience the reality of Civil War infantrymen.

It interested me that of all the re-enactment activities Horwitz put himself through, spooning reveals a deeper commitment to authenticity than any of the combat performances he describes. It’s uncomfortable and awkward, so it requires dedication and persistence. Sleep becomes self-conscious, not just in order to stick with the activity, but because the point of it is to recapture a past experience of sleeping on the battlefield. Since re-enacting a battle requires greater numbers of participants than sleeping does, more farbs (the less dedicated re-enactors) show up and thus the general level of engagement declines. During staged battles, spectators, scripting, confusion and accidents all interfere with the experience. Immersion breaks whenever dead soldiers pop up on the command, “resurrect.” In other words, performance takes primacy over the effort to re-experience. It is likely that many farbs dressed up for battle are content to find a hotel to sleep in.

Specific attention to the details of daily life might be a reflection of recent historical work that emphasizes social and cultural histories of the Civil War period, rather than combat histories. But that’s not my takeaway from the spooning re-enactors. Rather, it’s the standard of authenticity that goes beyond performance of a specific event (such as a battle) to include life experience as a whole. Horwitz recalled that,

Between gulps of coffee—which the men insisted on drinking from their own tin cups rather than our ceramic mugs—Cool and his comrades explained the distinction. Hardcores didn’t just dress up and shoot blanks. They sought absolute fidelity to the 1860s: its homespun clothing, antique speech patterns, sparse diet and simple utensils. Adhered to properly, this fundamentalism produced a time travel high, or what hardcores called a ‘period rush.’ (Horwitz, Loc. 153-157)

Stephen Gapps, an Australian curator, historian, and re-enactor, has spoken of the “extraordinary lengths” re-enactors go to “acquire and animate the look and feel of history.” Hard-core is not just about marching, shooting and swordplay. I wonder what a “period rush” might be for the experience of playing Pitfall! in the mid-21st century. Shag rugs? Ambient New Wave radio? Caffeine-free cola? Will future re-enactors of historical software seek this level of experiential fidelity? Gapps, again: “Although reenactors invoke the standard of authenticity, they also understand that it is elusive – worth striving for, but never really attainable.” (Gapps 397)

Re-enactment offers a take on born-digital heritage that proposes a commitment to lived experience. I see some similarity here with the correspondence to lived historical experience in White’s striving for a discursive mimesis. Yet, like media archaeology, re-enactment puts performance above discourse, though it is the performance of humans rather than machines.

Playing Pitfalls

We now have three different ways to think about potential uses of historical software and born digital documentation. I will shift my historian’s hat to one side of my head now and slide up my curator’s cap. If we consider these takes as use cases, do they help us decide how to allocate resources to acquire, preserve, describe and provide access to digital collections?

In May 2013, the National Digital Information Infrastructure and Preservation Program (NDIIPP) of the U.S. Library of Congress (henceforth: LC) held a conference called Preserving.exe. The agenda was to articulate the “problems and opportunities of software preservation.” In my contribution to the LC conference report issued a few months later, I described three “lures of software preservation.” (Lowood) These are potential pitfalls as we move from software collections to digital repositories and from there to programs of access to software collections. The second half of this paper will be an attempt to introduce the three lures of software preservation to the three takes on historical enactment.

  1. The Lure of the Screen

Let’s begin with the Lure of the Screen. This is the idea that what counts in digital media is what is delivered to the screen. This lure pops up in software preservation when we evaluate significant properties of software as surface properties (graphics, audio, haptics, etc.).

This lure of the screen is related to what media studies scholars such as Nick Montfort, Mark Sample and Matt Kirschenbaum have dubbed (in various but related contexts) “screen essentialism.” If the significant properties of software are all surface properties, then our perception of interaction with software tells us all we need to know. We check graphics, audio, responses to our use of controllers, etc., and if they look and act as they should, we have succeeded in preserving an executable version of historical software. These properties are arguably the properties that designers consider as the focus of user interaction and they are the easiest to inspect and verify directly.

The second Preserving Virtual Worlds project was concerned primarily with identifying significant properties of interactive game software. On the basis of several case sets and interviews with developers and other stakeholders, we concluded that isolating surface properties such as image colourspace – while significant for other media such as static images – is not a particularly useful approach to take for game software. With interactive software, significance appears to be variable and contextual, as one would expect from a medium in which content is expressed through a mixture of design and play, procedurality and emergence. Especially important is the fact that software abstraction levels are not “visible” on the surface of play. It is difficult if not impossible to monitor procedural aspects of game design and mechanics, programming and technology by inspecting properties expressed on the screen.

The preservation lifecycle for software is likely to include data migration. Access to migrated software will probably occur through emulation. How do we know when our experience of this software is affected by these practices? One answer is that we audit significant properties, and as we now know, it will be difficult to predict which characteristics are significant. An alternative or companion approach for auditing the operation of historical software is to verify the execution of data files. The integrity of the software can be evaluated by comparison to documented disk images or file signatures such as hashes or checksums. However, when data migration or delivery environments change the software or its execution environment, this method is inadequate. We must evaluate software performance. Instead of asking whether the software “looks right,” we can check if it runs verified data-sets that meet the specifications of the original software. Examples range from word processing documents to saved game and replay files. Of course, visual inspection of the content plays a role in verifying execution by the software engine; failure will not always be clearly indicated by crashes or error messages. Eliminating screen essentialism does not erase surface properties altogether.
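The fixity half of the audit described above – comparing collection items against documented disk images or file signatures – can be sketched in a few lines. This is a minimal illustration, not any repository's actual workflow; the function names and the idea of a "documented hash" from a catalogue record are assumptions for the example:

```python
import hashlib

def file_sha256(path, chunk_size=65536):
    """Compute the SHA-256 digest of a file, reading in chunks
    so that large disk images do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path, documented_hash):
    """Compare a freshly computed digest against the hash recorded
    in the repository's documentation for this object."""
    return file_sha256(path) == documented_hash.lower()
```

As the surrounding paragraph notes, a check like this only establishes that the bits are unchanged; once migration or a new delivery environment alters the software or its execution environment, verification must shift from file signatures to performance, e.g. confirming that the engine still runs verified data-sets such as saved games or replay files.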

The three takes compel us to think about the screen problem in different ways. First, the Historian is not troubled by screen essentialism. His construction of a narrative mimesis invokes a selection of source materials that may or may not involve close reading of personal gameplay, let alone focus on surface properties. On the other hand, The Re-enactor’s use of software might lead repositories to fret about what the user sees, hears and feels. It makes sense with this use case to think about the re-enactment as occurring at the interface. If a repository aims to deliver a re-enacted screen experience, it will need to delve deeply into questions of significant properties and their preservation.

Screen essentialism is also a potential problem for repositories that follow the path of Media Archaeology. It is unclear to me how a research site like the MAF would respond to digital preservation practices based on data migration and emulation. Can repositories meet the requirements of media archaeologists without making a commitment to preservation of working historical hardware to enable playback from original media? It’s not just that correspondence to surface characteristics is a significant property for media archaeologists. Nor is the Lure of the Screen a criticism of Media Archaeology. I propose instead that it is a research problem. Ernst’s vision of a Multimedia Archive is based on the idea that media archaeology moves beyond playback to reveal mechanisms of counting. This machine operation clearly is not a surface characteristic. Ernst would argue, I think, that this counting is missed by an account of what is seen on the screen. So let’s assign the task of accounting for counting to the Media Archaeologist, which means showing us how abstraction layers in software below the surface can be revealed, audited and studied.

  2. The Lure of the Authentic Experience

I have already said quite a bit about authenticity. Let me explain now why I am sceptical about an authentic experience of historical software, and why this is an important problem for software collections.

Everyone in game or software studies knows about emulation. Emulation projects struggle to recreate an authentic experience of operating a piece of software such as playing a game. Authenticity here means that the use experience today is like it was then. The Lure of the Authentic Experience tells digital repositories at minimum not to preserve software in a manner that would interfere with the production of such experiences. At maximum, repositories deliver authentic experiences, whether on-site or on-line. A tall order. In the minimum case, the repository provides software and collects documentation of hardware specifications, drivers and support programs. Researchers use this documentation to reconstruct the historical look-and-feel of software to which they have access. In the maximum case, the repository designs and builds access environments. Using the software authentically would then probably mean a trip to the library or museum with historical or bespoke hardware. The reading room becomes the site of the experience.

I am not happy to debunk the Authentic Experience. Authenticity is a concept fraught not just with intellectual issues, but with registers ranging from nostalgia and fandom to immersion and fun. It is a minefield. The first problem is perhaps an academic point, but nonetheless important: Authenticity is always constructed. Whose lived experience counts as “authentic” and how has it been documented? Is the best source a developer’s design notes? The memory of someone who used the software when it was released? A marketing video? The researcher’s self-reflexive use in a library or museum? If a game was designed for kids in 1985, do you have to find a kid to play it in 2050? In the case of software with a long history, such as Breakout or Microsoft Word, how do we account for the fact that the software was used on a variety of platforms – do repositories have to account for all of them? For example, does the playing of DOOM “death match” require peer-to-peer networking on a local area network, a mouse-and-keyboard control configuration and a CRT display? There are documented cases of different configurations of hardware: track-balls, hacks that enabled multiplayer via TCP/IP, monitors of various shapes and sizes, and so on. Which differences matter?

A second problem is that the Authentic Experience is not always that useful to the researcher, especially the researcher studying how historical software executes under the hood. The emulated version of a software program often compensates for its lack of authenticity by offering real-time information about system states and code execution. A trade-off for losing authenticity thus occurs when the emulator shows the underlying machine operation, the counting, if you will. What questions will historians of technology, practitioners of code studies or game scholars ask about historical software? I suspect that many researchers will be as interested in how the software works as in a personal experience deemed authentic. As for more casual appreciation, the Guggenheim’s Seeing Double exhibition and Margaret Hedstrom’s studies of emulation suggest that exhibition visitors actually prefer reworked or updated experiences of historical software. (Hedstrom, Lee, et al.; Jones)

This is not to say that original artefacts – both physical and “virtual” – will not be a necessary part of the research process. Access to original technology provides evidence regarding its constraints and affordances. I put this to you not as a “one size fits all” decision but as an area of institutional choice based on objectives and resources.

The Re-enactor, of course, is deeply committed to the Authentic Experience. If all we offer is emulation, what do we say to him, besides “sorry”? Few digital repositories will be preoccupied with delivering authentic experiences as part of their core activity. The majority are likely to consider a better use of limited resources to be ensuring that validated software artefacts and contextual information are available on a case-by-case basis to researchers who do the work of re-enactment. Re-enactors will make use of documentation. Horwitz credits Robert Lee Hodge with an enormous amount of research time spent at the National Archives and Library of Congress. Many hours of research with photographs and documents stand behind his re-enactments. In short, repositories should let re-enactors be the re-enactors.

Consider this scenario for software re-enactment. You are playing an Atari VCS game with the open-source Stella emulator. It bothers you that viewing the game on your LCD display differs from the experience with a 1980s-era television set. You are motivated by this realization to contribute code to the Stella project for emulating a historical display. It is theoretically possible that you could assemble everything needed to create an experience that satisfies you – an old television, adapters, an original VCS, the software, etc. (Let’s not worry about the shag rug and the lava lamp.) You can create this personal experience on your own, then write code that matches it. My question: Is the result less “authentic” if you relied on historical documentation such as video, screenshots, technical specifications, and other evidence available in a repository to describe the original experience? My point is that repositories can cooperatively support research by re-enactors who create their version of the experience. Digital repositories should consider the Authentic Experience as more of a research problem than a repository problem.

  3. The Lure of the Executable

The Lure of the Executable evaluates software preservation in terms of success at building collections of software that can be executed on-demand by researchers.

Why do we collect historical software? Of course, the reason is that computers, software, and digital data have had a profound impact on virtually every aspect of recent history. What should we collect? David Bearman’s answer in 1987 was the “software archive.” He distinguished this archive from what I will call the software library. The archive assembles documentation; the library provides historical software. The archive was a popular choice in the early days. Margaret Hedstrom reported that attendees at the 1990 Arden Conference on the Preservation of Microcomputer Software “debated whether it was necessary to preserve software itself in order to provide a sense of ‘touch and feel’ or whether the history of software development could be documented with more traditional records.” (Hedstrom and Bearman) In 2002, the Smithsonian’s David Allison wrote about collecting historical software in museums that, “supporting materials are often more valuable for historical study than code itself. They provide contextual information that is critical to evaluating the historical significance of the software products.” He concluded that operating software is not a high priority for historical museums. (Allison 263-65; cf. Shustek)

Again, institutional resources are not as limitless as the things we would like to do with software. Curators must prioritize among collections and services. The choice between software archive and library is not strictly binary, but choices still must be made.

I spend quite a bit of my professional life in software preservation projects. The end-product of these projects is at least in part the library of executable historical software. I understand the Lure of the Executable and the reasons that compel digital repositories to build collections of verified historical software that can be executed on-demand by researchers. This is the Holy Grail of digital curation with respect to software history. What could possibly be wrong with this mission, if it can be executed? As I have argued on other occasions, there are several problems to consider. Let me give you two. The first is that software does not tell the user very much about how it has previously been used. In the best case, application software in its original use environment might display a record of files created by previous users, such as a list of recently opened files found in many productivity titles like Microsoft Office. The more typical situation is that software is freshly installed from data files in the repository and thus completely lacks information about its biography, for want of a better term.

The second, related problem is fundamental. Documentation that is a prerequisite for historical studies of software is rarely located in software. It is more accurate to say that this documentation surrounds software in development archives (including source code) and records of use and reception. It is important to understand that this is not just a problem for historical research. Documentation is also a problem for repositories. If contextual information such as software dependencies or descriptions of relationships among objects is not available to the repository and all the retired software engineers who knew the software inside-and-out are gone – it may be impossible to get old software to run.

Historians, of course, will usually be satisfied with the Archive. Given limited resources, is it reasonable to expect that the institutions responsible for historical collections of documentation will be able to reconcile such traditional uses with other methods of understanding historical computing systems? The Re-enactor will want to run software, and the Media Archaeologist will not just want access to a software library, but to original media and hardware in working order. These are tall orders for institutional repositories such as libraries and archives, though possibly a better fit to the museum or digital history centre.

In Best Before: Videogames, Supersession and Obsolescence, James Newman is not optimistic about software preservation and he describes how the marketing of software has in some ways made this a near impossibility. He is not as pessimistic about video game history, however. In a section of his book provocatively called “Let Videogames Die,” he argues that a documentary approach to gameplay might be a more pragmatic enterprise than the effort to preserve playable games. He sees this as a “shift away from conceiving of play as the outcome of preservation to a position that acknowledges play as an indivisible part of the object of preservation.” (Newman 160) In other words, what happens when we record contemporary use of software to create historical documentation of that use? Does this activity potentially reduce the need for services that provide for use at any given time in the future? This strikes me as a plausible historical use case, but not one for re-enactment or media archaeology.

Software archives or software libraries? That is the question. Is it nobler to collect documentation or to suffer the slings and arrows of outrageous software installations? The case for documentation is strong. The consensus among library and museum curators (including myself) is almost certainly that documents from source code to screenshots are a clear win for historical studies of software. Historians, however, will not be the only visitors to the archive, and there are other reasons to collect documentation. One of the most important, which I briefly noted above, is that software preservation requires such documentation. In other words, successful software preservation activities are dependent upon technical, contextual and rights documentation. And of course, documents tell re-enactors how software was used and can help media archaeologists figure out what their machines are showing or telling them. But does documentation replace the software library? Is it sufficient to build archives of software history without libraries of historical software? As we have seen, this question was raised nearly forty years ago and remains relevant today. My wish is that this question of the relationship between documentation and software as key components of digital heritage work stir conversation among librarians, historians, archivists and museum curators. This conversation must consider that there is likely to be a broad palette of use cases such as the historian, media archaeologist and re-enactor, as well as many others not mentioned here. It is unlikely that any one institution can respond to every one of these use cases. Instead, the more likely result is a network of participating repositories, each of which will define priorities and allocate resources according to both their specific institutional contexts and an informed understanding of the capabilities of partner institutions.

 

References

Allison, David K. “Preserving Software in History Museums: A Material Culture Approach.” History of Computing: Software Issues. Ed. Ulf Hashagen, Reinhard Keil-Slawik and Arthur L. Norberg. Berlin: Springer, 2002. 263-272.

Bearman, David. Collecting Software: A New Challenge for Archives and Museums. Archival Informatics Technical Report #2 (Spring 1987).

— “What Are/Is Informatics? And Especially, What/Who is Archives & Museum Informatics?” Archival Informatics Newsletter 1:1 (Spring 1987): 8.

Cross, Scott. “The Art of Spooning.” Atlantic Guard Soldiers’ Aid Society. 13 July 2016. Web. http://www.agsas.org/howto/outdoor/art_of_spooning.shtml. Originally published in The Company Wag 2, no. 1 (April 1989).

Ernst, Wolfgang. Digital Memory and the Archive. (Minneapolis: Univ. Minnesota Press, 2012). Kindle edition.

Gapps, Stephen. “Mobile monuments: A view of historical reenactment and authenticity from inside the costume cupboard of history.” Rethinking History: The Journal of Theory and Practice, 13:3 (2009): 395-409.

Hedstrom, Margaret L., Christopher A. Lee, Judith S. Olson and Clifford A. Lampe, “‘The Old Version Flickers More’: Digital Preservation from the User’s Perspective.” The American Archivist, 69: 1 (Spring – Summer 2006): 159-187.

Hedstrom, Margaret L., and David Bearman, “Preservation of Microcomputer Software: A Symposium,” Archives and Museum Informatics 4:1 (Spring 1990): 10.

Horwitz, Tony. Confederates in the Attic: Dispatches from the Unfinished Civil War. New York: Pantheon Books, 1998. Kindle Edition.

Jones, Caitlin. “Seeing Double: Emulation in Theory and Practice. The Erl King Study.” Paper presented to the Electronic Media Group, 14 June 2004. Electronic Media Group. Web. http://cool.conservation-us.org/coolaic/sg/emg/library/pdf/jones/Jones-EMG2004.pdf

Lowood, Henry. “The Lures of Software Preservation.” Preserving.exe: Toward a National Strategy for Software Preservation (October 2013): 4-11. Web. http://www.digitalpreservation.gov/multimedia/documents/PreservingEXE_report_final101813.pdf

Media Archaeological Fundus. Web. 21 Jan. 2016. http://www.medienwissenschaft.hu-berlin.de/medientheorien/fundus/media-archaeological-fundus

Newman, James. Best Before: Videogames, Supersession and Obsolescence. London: Routledge, 2012.

Owens, Trevor. “Archives, Materiality and the ‘Agency of the Machine’: An Interview with Wolfgang Ernst.” The Signal: Digital Preservation. Web. 8 February 2013. http://blogs.loc.gov/digitalpreservation/2013/02/archives-materiality-and-agency-of-the-machine-an-interview-with-wolfgang-ernst/

“Preserving Computer-Related Source Materials.” IEEE Annals of the History of Computing 1 (Jan.-March 1980): 4-6.

Shustek, Len. “What Should We Collect to Preserve the History of Software?” IEEE Annals of the History of Computing, 28 (Oct.-Dec. 2006): 110-12.

Walker, John. “Introduction.” The Analytical Engine: The First Computer. Fourmilab, 21 March 2016. Web. http://www.fourmilab.ch/babbage/

— “The Analytical Engine: Is the Emulator Authentic?,” Fourmilab, 21 March 2016. Web. http://www.fourmilab.ch/babbage/authentic.html

White, Hayden. The Content of the Form: Narrative Discourse and Historical Representation. Baltimore: Johns Hopkins Univ. Press, 1987.

— Figural Realism: Studies in the Mimesis Effect. Baltimore: Johns Hopkins Univ. Press, 2000.

— “The Question of Narrative in Contemporary Historical Theory.” History and Theory 23: 1 (Feb. 1984): 1-33.

 

Bio

Henry Lowood is Curator for History of Science & Technology Collections and for Film & Media Collections at Stanford University. He has led the How They Got Game project at Stanford University since 2000 and is the co-editor of The Machinima Reader and Debugging Game History, both published by MIT Press. Contact: lowood@stanford.edu