Blog Archive for the ‘Digital Dark Age’ Category


Our Digital Afterlives

Posted on Monday, April 22nd, 02013 by Charlotte Hajer
Categories: Digital Dark Age, Technology


In 02006, Long Now Board Member David Eagleman wrote in Nature:

There is no afterlife, but a version of us lives on nonetheless.

At the beginning of the computer era, people died with passwords in their heads and no one could access their files. When access to these files was critical, companies could grind to a halt. That’s when programmers invented death switches.

With a death switch, the computer prompts you for your password once a week to make sure you are still alive. When you don’t enter your password for some period of time, the computer deduces you are dead, and your passwords are automatically e-mailed to the second-in-command. Individuals began to use death switches to reveal Swiss bank account numbers to their heirs, to get the last word in an argument, and to confess secrets that were unspeakable during a lifetime.

In other words, a “death switch” is a way for us to pre-program an afterlife for our digital selves. Despite the relatively short lifespan of software platforms, it is likely that the data we post on the internet will live on – somewhere – after we ourselves expire.
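The mechanism Eagleman describes is simple enough to sketch in code. Below is a minimal, hypothetical death switch in Python – the check-in file, grace period, addresses and mail server are placeholders of our own invention, not anyone’s actual implementation – meant to be run on a schedule and to fire only after a long silence:

```python
# A minimal "death switch" sketch. The file path, grace period, addresses and
# SMTP host below are hypothetical placeholders. Run periodically (e.g. from
# cron); if the owner hasn't checked in for too long, the message goes out.

import os
import time
import smtplib
from email.message import EmailMessage

CHECKIN_FILE = "last_checkin.txt"     # updated whenever the owner proves they're alive
GRACE_PERIOD = 60 * 60 * 24 * 30      # thirty days of silence before the switch fires
HEIR_ADDRESS = "heir@example.com"     # placeholder recipient
SMTP_HOST = "smtp.example.com"        # placeholder mail server

def owner_checked_in_recently() -> bool:
    """Return True if the owner has checked in within the grace period."""
    if not os.path.exists(CHECKIN_FILE):
        return False
    last = float(open(CHECKIN_FILE).read().strip())
    return (time.time() - last) < GRACE_PERIOD

def fire_death_switch() -> None:
    """Send the pre-written message to the designated heir."""
    msg = EmailMessage()
    msg["Subject"] = "A message from a digital afterlife"
    msg["From"] = "owner@example.com"
    msg["To"] = HEIR_ADDRESS
    msg.set_content("The passwords, the last word, the unspeakable secrets go here.")
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)

if __name__ == "__main__":
    if not owner_checked_in_recently():
        fire_death_switch()
```

A real service would of course need secure storage for whatever it passes on, and a gentler way of confirming that silence really does mean death.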

Eagleman, along with several others, is urging us to think about what will happen to our digital legacy after death: to decide where we want our data to live, and who will have the privilege to engage with it. Do we want to place our legacy in the hands of an heir, or do we want our online presence to be erased? Alternatively, do we perhaps want to designate our own computers as executors of our estate, and have it send out friendly messages to our descendants every once in a while?

Over the past two years, a series of Digital Death Day “unconferences” has brought people together to talk about these kinds of questions. Evan Carroll and John Romano published a book and host an accompanying blog about ways to shape our digital afterlives. And most recently, Google introduced its Inactive Account Manager: a new tool that allows you to decide what will happen to your emails, photo albums, posted videos and personal profiles when your account becomes inactive.

Planning for our digital beyond is a way to save our own lives from receding into a digital dark age – and as such, it may be a way to keep something of ourselves alive after our bodies die. Eagleman muses:

This situation allows us to forever revisit shared jokes, to remedy lost opportunities for a kind word, to recall stories about delightfully earthly experiences that can no longer be felt. Memories now live on their own, and no one forgets them or grows tired of telling them. We are quite satisfied with this arrangement, because reminiscing about our glory days of existence is perhaps all that would have happened in an afterlife anyway.

O’Reilly Talks about Digital Preservation

Posted on Tuesday, April 2nd, 02013 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Thinking

Former SALT speaker Tim O’Reilly recently shared the video of a talk he gave on digital preservation at the Library of Congress in 02011.

Discussing some of his own “personal failures” to archive O’Reilly Media’s early projects, O’Reilly here emphasizes the importance of preserving digital information and resources in a world where printed matter may eventually become obsolete. We risk slipping into a digital dark age if we continue to treat digital archiving as an “afterthought.” Citing the examples of Wikipedia and GitHub, O’Reilly therefore urges us to recognize the long-term relevance of what we create on the web. He suggests we change the way we engage with online systems, and build preservation into the very core of our online activity.

Danny Hillis: We need a backup internet

Posted on Tuesday, March 19th, 02013 by Austin Brown
Categories: Digital Dark Age, Long Term Thinking, Technology


Speaking at TED earlier this year, Long Now co-founder Danny Hillis described the early days of networked computing – a time when one could register “think.com” on a whim and everyone with an email address or a domain could be listed in a single book.

He explained that the design of the Internet Protocol and the early community using it were infused with communistic values – ironic, he notes, as the tech grew out of Cold War militarism.

Since then, of course, the internet, its users and its uses have expanded far beyond the wildest dreams of its creators. In so doing, it has become an essential societal infrastructure – without having been designed as such. As another Long Now Board Member, David Eagleman, points out, the internet is not invulnerable. Emergency communications and other high-priority services must be possible without the internet, but increasingly depend on it.

Hillis says a separate backup internet would not be hard to build and would dramatically increase our resilience to disaster and malfeasance.

Reviving and Restoring Digital Art

Posted on Wednesday, February 27th, 02013 by Charlotte Hajer
Categories: Digital Dark Age


With the ever-accelerating evolution of hardware and software, we stand to lose much more than reels of data. A vast collection of computer art risks slipping into digital darkness, as well.

Concerned about this impending loss, NYU student Matthew Epler recently founded the ReCode project: a community-driven effort to create an active archive for the products of “creative computing.”

But lest the idea of an “archive” should call to mind anything stuffy or rigid, the project involves much more than preservation alone. Epler hopes to revive computer art by translating it into a contemporary programming language, and thereby making it available for others to learn from and use in their own creative pursuits. (This isn’t storage – it’s “movage.”)
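To make the idea concrete, here is a hypothetical sketch of what such a translation might look like – not one of ReCode’s actual contributions, just a few lines of Python in the spirit of Georg Nees’s 1968 plotter piece Schotter, re-imagined as a script that writes a grid of increasingly disordered squares to an SVG file:

```python
# A hypothetical "translation" of an early plotter-style piece into a modern
# language: a grid of squares whose rotation and displacement grow row by row,
# written out as a plain SVG file. Inspired by, not copied from, Georg Nees.

import random

ROWS, COLS, SIZE = 22, 12, 30   # grid dimensions and square size in pixels

def schotter_svg(path: str = "schotter.svg") -> None:
    parts = ['<svg xmlns="http://www.w3.org/2000/svg" '
             f'width="{COLS * SIZE + 60}" height="{ROWS * SIZE + 60}">']
    for row in range(ROWS):
        disorder = row / ROWS                 # disorder increases down the page
        for col in range(COLS):
            cx = 30 + col * SIZE + SIZE / 2   # centre of this grid cell
            cy = 30 + row * SIZE + SIZE / 2
            angle = random.uniform(-45, 45) * disorder
            dx = random.uniform(-SIZE / 2, SIZE / 2) * disorder
            dy = random.uniform(-SIZE / 2, SIZE / 2) * disorder
            parts.append(
                f'<rect x="{-SIZE / 2}" y="{-SIZE / 2}" width="{SIZE}" height="{SIZE}" '
                f'fill="none" stroke="black" '
                f'transform="translate({cx + dx},{cy + dy}) rotate({angle})"/>')
    parts.append("</svg>")
    with open(path, "w") as f:
        f.write("\n".join(parts))

if __name__ == "__main__":
    schotter_svg()
```

The algorithm survives the move between languages; what changes is the platform it depends on – which is exactly the tension Epler wants us to notice.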

Furthermore, the archaeological endeavor of deciphering the obsolescent code in which works of digital art were written invites us to learn something about the history of computer languages. As such, Epler hopes that the ReCode project will generate awareness of the historical and cultural context in which computer art is created. In a blog post on Wired’s web page, Epler writes:

“… this project aims to start a larger discussion about the transitory nature of not only our work but also the languages and platforms on which we create it. Can we see a common conceptual thread through these pieces that speaks to the digital art practice as a whole over decades? Or will our work always be limited by our machines? … I feel these discussions are necessary to push through the “Tumblr-era” of digital art works and into one that legitimizes itself through critical reflection and historical grounding.”

If you’re interested in getting involved, Epler invites you to join in on the process of translation. Here’s a guide to get you started.

TimesMachine: “All the News That Was Fit to Print”

Posted on Wednesday, February 13th, 02013 by Austin Brown
Categories: Digital Dark Age, Long Term Thinking

If you’re a fan of this video showing a year’s worth of front pages for NYTimes.com, or the Wayback Machine, which allows you to browse the internet of the past, you might also love a project released by the New York Times today: The TimesMachine.

TimesMachine can take you back to any issue from Volume 1, Number 1 of The New-York Daily Times, on September 18, 1851, through The New York Times of December 30, 1922. Choose a date in history and flip electronically through the pages, displayed with their original look and feel.

You’ve got to be a subscriber to access most of the content, but issues from several important days in history are freely available in their entirety:

[Image: historic New York Times front pages]

Preserving Virtual Worlds

Posted on Thursday, November 1st, 02012 by Catherine Borgeson
Categories: Digital Dark Age

“This is our history, and just a handful of people are saving it.”

– PixelVixen707, screen name of “Rachael Webster,” a fictional character in the alternate reality game Personal Effects: Dark Art

Virtual games are becoming cultural artifacts. Yes, they are commodities (the global market for video games is forecast to hit $82 billion by 02017), but beyond entertainment they also facilitate complex exchanges between many members of society. It would be impossible to provide an accurate record of much of our existing popular culture without archiving games.

Take the online world EA Land, formerly The Sims Online, for example. The service was generating very little revenue despite the re-branding, resulting in its sudden demise (a common phenomenon in the digital realm). Electronic Arts pulled the plug just weeks after its debut as EA Land – its designer became its destroyer. Amidst avatars exchanging hugs and tearful goodbyes, a virtual world winked out of existence with a whimper – a whimper that might have been witnessed only by the players had it not been for Stanford archival researcher Henry Lowood. He captured the virtual apocalypse in the 11-minute radio piece “Game Over.” Roman Mars (now with 99% Invisible) originally produced this episode for Snap Judgment along with Robert Ashley’s A Life Well Wasted (an Internet radio program about everything video games).

Preserving video games presents formidable challenges. These nebulous artifacts consist of platforms, operating systems and network communities. The digital content is interactive, which is just as defining as the code itself. These challenges were the subject of Preserving Virtual Worlds (PVW), a two-year (02008–02010) collaborative research venture geared towards preserving and exploring the history and cultural impact of interactive simulations and video games – saving them for future generations. In addition to Henry Lowood of Stanford University, the Rochester Institute of Technology, the University of Maryland, the University of Illinois at Urbana-Champaign and Linden Lab collaborated on the project.

PVW focused its investigation on a case set of eight different games, interactive fictions and virtual worlds. The case studies were selected from different time periods in gaming history, different platforms and different degrees of player involvement, in order to surface as many of the problems that might prevent games from being preserved over the long term as possible. Based on these investigations, the team developed a set of requirements for game preservation.

Unlike a book in a library, computer games have very poorly defined boundaries that make it difficult to determine exactly what the object of preservation should be. Is it the source code for the program? The binary executable version of the program? Is it the executable program along with the operating system under which the program runs? Should the hardware on which the operating system runs be included? Ultimately, a computer game cannot be played without a complex and interconnected set of programs and hardware. Is the preservationist’s job to maintain a particular, operating combination of elements, or is it to preserve the capability to produce an operating combination using existing software and hardware? Is it both? Once these questions of the boundaries of preservation have been addressed, there are a host of other difficulties confronting the would-be preservationist. What information, beyond the game itself, will we need to ensure continuing access to the game? How should librarians, archivists and preservationists go about organizing the body of information needed to preserve a game? What strategy should we adopt to preserve software in a technological environment in which computing hardware and operating systems are undergoing constant and rapid evolution? Given the costs of preserving normal library and archival materials, how can we possibly sustain the additional costs of preserving these complex and fragile technological artifacts?
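As a thought experiment – and emphatically not the project’s actual schema – the breadth of those questions can be suggested by sketching the kind of record a would-be preservationist might keep alongside the bits themselves:

```python
# A hypothetical preservation record for a single game. Every title, file name
# and platform below is invented; the point is how far the record has to reach
# beyond the game's own code.

game_record = {
    "title": "Example Adventure",                        # placeholder title
    "released": 1987,
    "artifacts": {
        "source_code": "example-adventure-src.tar",      # if it survives at all
        "binary": "ADVENT.EXE",
        "data_files": ["LEVELS.DAT", "SOUNDS.DAT"],
    },
    "environment": {
        "operating_system": "MS-DOS 3.3",
        "hardware": "IBM PC/AT, CGA graphics",
        "known_emulators": ["DOSBox"],
    },
    "context": {
        "manuals": ["example-adventure-manual.pdf"],
        "box_art": ["front.png", "back.png"],
        "community": ["fan walkthroughs", "archived forum threads"],
    },
}
```

Each branch of the record answers one of the questions above: what counts as the game, what it needs in order to run, and what a future player will need in order to make sense of it.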

Preserving Virtual Worlds 2 follows the initial report. PVW2 will focus on providing a set of best practices for preserving educational games and game franchises, such as Oregon Trail, Carmen Sandiego and the Super Mario Brothers series.

Decoding Long-Term Data Storage

Posted on Friday, October 12th, 02012 by Charlotte Hajer
Categories: Digital Dark Age, Rosetta, Technology

If human societies are founded on the accumulation of knowledge through the ages, then the long-term transmission of information must be the cornerstone of a durable civilization. And as we accelerate ever more rapidly in our expansion of knowledge and technological capability, the development of durable storage methods becomes ever more important.

In the process of brainstorming such methods, two central questions emerge. The first of these concerns the type of storage media you might use: what kind of material is likely to last long enough to convey a message to generations thousands of years into the future? Throughout much of history, people carved important messages into stone, bone, or other hard materials. So far, we don’t seem to have come up with anything better: most of us are familiar with the limited lifespan of CDs, vinyl, and computer hard drives. Faced with this lack of suitable options, several organizations and companies around the world have re-embraced the long-term durability of hard natural substances. The Long Now’s Rosetta disk, for example, is made of nickel. Arnano, a French technology start-up, has developed a disk of sapphire on which to micro-etch information – civic records, perhaps, or important messages about the storage of nuclear waste. And most recently, Japanese electronics giant Hitachi announced a new data storage technology that uses quartz glass.

The second – and perhaps even more intriguing – question concerns the language of your message. What kind of ‘code’ will be most easily accessible to future generations, and what technologies will they have available to help them decrypt a message from the past?

The storage and transmission of data often require multiple levels of encoding. When we think of ‘code’ we often think of computers – but in fact, we routinely go through two layers of encryption before we can even begin to digitize information. Spoken human language is itself a code, in which sounds are used to signify things or ideas. The use of a writing system adds a further layer of encryption: sequences of letters or pictographs signify the sounds that represent things or ideas. Yet another layer of encryption can then be applied by translating a writing system into binary numbers (and numeric systems are a kind of code, as well!) or perhaps even DNA.
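A toy example makes the layering visible. The short Python sketch below takes a written phrase – itself already an encoding of speech – and pushes it through one more layer into binary, then recovers it again:

```python
# One extra layer of encoding and back again: characters -> UTF-8 bytes ->
# a string of 1s and 0s -> characters.

phrase = "long now"   # writing: one layer of encoding already

# encode: each byte of the written form becomes eight binary digits
bits = "".join(f"{byte:08b}" for byte in phrase.encode("utf-8"))

# decode: undo the extra layer to get the written form back
recovered = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode("utf-8")

print(bits[:32])   # '01101100011011110110111001100111' -- the word "long"
print(recovered)   # 'long now'
```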

These extra layers of encoding offer the advantage of information density: they can help you pack lots of information into a very small format. However, each layer also further complicates the decodability and readability of a message. Because the Rosetta Disk is itself intended to be a tool for decryption – a primer of human language meant to help future archaeologists unlock entire worlds of culture, just like the Rosetta stone did in the 19th century – Long Now has chosen to store its data in the analog form of human alphabets, rather than add an extra layer of encryption by a digital code of 1’s and 0’s.

Arnano, the makers of the sapphire disk, have made a similar choice. The added advantage is that this analog information is readable by the human eye (aided by a microscope or magnifying glass).

It’s safe to assume that the languages future generations will speak – and the technologies they’ll have available – will most likely be very different from what we use today. This brings up an important third question: how do you include ‘instructions’ for decoding and reading with your message? Following the example of its namesake predecessor, the content on Long Now’s Rosetta Disk is its own primer: if you know at least one of the 1,500 languages included on the disk, all other information can be decoded. Perhaps a similar kind of parallel multiplicity of codes is possible for other storage methods as well.

These questions of language and code are inevitably more difficult to answer than that of the storage medium. You can subject your chosen material to stress tests to make sure that it will stand up to acid, erosion, or any other kind of potential natural disaster. But there’s no similar test for language; it’s impossible to predict what codes will be interpretable by the people of the future, or what technology they’ll have available to decrypt a message. Nevertheless, these conundrums are no less important to grapple with, and any proposal for long-term storage worth its salt must offer some potential answers to these questions.

Decaying Web

Posted on Sunday, September 30th, 02012 by Alexander Rose - Twitter: @zander
Categories: Digital Dark Age

 

Tom Chatfield from BBC online writes about the newest flavor of the digital dark age…  Lost tweets and social media:

On January 28 2011, three days into the fierce protests that would eventually oust the Egyptian president Hosni Mubarak, a Twitter user called Farrah posted a link to a picture that supposedly showed an armed man as he ran on a “rooftop during clashes between police and protesters in Suez”. I say supposedly, because both the tweet and the picture it linked to no longer exist.  Instead they have been replaced with error messages that claim the message – and its contents – “doesn’t exist”.  [Continue to Article]

Cultural Memories in the Digital World

Posted on Wednesday, September 19th, 02012 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Thinking, Technology

A book is much more than a collection of information. It is also a physical object, and this materiality plays an important role in shaping the way we relate to literature. Think of how the pages of your favorite story feel between your fingers, and the way its spine creases as you immerse yourself further and further in the plot. The weight of a thick novel reflects the depth of its story, an illustrated cover helps to seed your imagination, while the font in which the text is printed might convey a certain emotional tone. Perhaps you like to record your own thoughts in the white margins of a book’s pages; perhaps you prefer to leave those edges clean and allow the story to stand on its own. Either way, the materiality of a book shapes our relationship to its content; it becomes a physical souvenir of our engagement with a story.

So what happens, then, as our media and the culture they convey move increasingly into the digital world? How does the emergence of e-books change the way we experience a story?

These questions were the subject of a brief talk by publisher and technologist James Bridle, broadcast recently on BBC Radio’s Four Thought. Bridle suggests that the digitization and globalization of our cultural world not only transforms the nature of our cultural artifacts; it changes us, as well.

People are changed by these encounters with the network as much as our cultural objects are. That’s fundamentally important. Even though we’ve always been connected [to the world] in all these ways, the visibility of that connection that the network brings, is deeply strange. You can reach out across space, and you can reach out across time, as well; the network has this extraordinary flattening effect on time, so things that look distant are just as accessible to us as things that are near. And you can see this process happening in the ways that we write, in the ways that we read, and the things that it’s doing to the texts themselves.

Whereas it was the physicality of a book that brought its narrative to life in our experience, it is now the instantaneousness and interactivity of information that facilitate our connection to a story. But while this necessarily changes the way we engage with cultural artifacts, Bridle suggests that this need not entail a loss of value. Our collective cultural memory is not in the process of disappearing; it is simply being transformed – embodied no longer by physical objects, but rather by the process of sharing.

Storing Digital Data in DNA

Posted on Thursday, August 16th, 02012 by Laura Welcher
Categories: Digital Dark Age, Long Term Science, Rosetta

[Image: schematic of DNA information storage]

In a paper published in Science today, scientists George Church, Yuan Gao and Sriram Kosuri report that they have written a 5.27-megabit “book” in DNA – encoding far more digital data in DNA than has ever been achieved before.

Writing messages in DNA was first demonstrated in 1988, and the largest amount of data previously written in DNA was 7,920 bits. The challenge in writing more information than this has been creating long perfect sequences. The current project uses shorter sequences, each encoding a 96-bit data block along with a 19-bit address that specifies the location of the data block within the larger data set. Redundancy then reduces errors: each base encodes only a single bit (A and C both represent “0”, G and T both represent “1”), and each data block has several molecular copies.
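A rough sketch of that mapping (ours, not the researchers’ actual code) shows how little machinery it needs: each bit becomes one base, the choice between the two possible bases for that bit is left free, and every fragment carries its address up front:

```python
# Sketch of the encoding described above: "0" -> A or C, "1" -> G or T,
# one bit per base, with a 19-bit address followed by a 96-bit data block.
# This illustrates the scheme; it is not the researchers' actual code.

import random

ZERO_BASES, ONE_BASES = "AC", "GT"

def bits_to_dna(bits: str) -> str:
    """One bit per base; which of the two allowed bases is used is arbitrary."""
    return "".join(random.choice(ZERO_BASES if b == "0" else ONE_BASES) for b in bits)

def dna_to_bits(seq: str) -> str:
    return "".join("0" if base in ZERO_BASES else "1" for base in seq)

def make_fragment(address: int, data_block: str) -> str:
    """Build one synthetic fragment: 19-bit address + 96-bit data block."""
    assert len(data_block) == 96
    return bits_to_dna(f"{address:019b}") + bits_to_dna(data_block)

block = "".join(random.choice("01") for _ in range(96))
fragment = make_fragment(5, block)
print(fragment)                      # 115 bases
print(dna_to_bits(fragment)[:19])    # the 19-bit address, recovered
```

The several molecular copies of each block supply the redundancy mentioned above, and the freedom to choose either base for a given bit helps the encoder avoid long runs of the same base, which are error-prone to synthesize and sequence.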

DNA has several advantages for archival data storage – information density, energy efficiency, and stability. With regard to stability, DNA offers readability “despite degradation in non-ideal conditions over millennia” – by which they mean 400,000 years! (See Church and Regis, in their forthcoming book on the subject.)

If we wish to intentionally use this technology for active long-term information storage (imagine some crucial message we need to convey to the future), we should probably anticipate the possibility of a discontinuity in technological knowledge and access to tools that could read the information. This raises questions of discoverability, decodability, and readability.

Ubiquity aids discoverability – if the information is everywhere, it is easier to find, even stumble upon by accident. Still, clear signals / signposts could aid discovery (neon green cockroaches, anyone?). With regard to decodability, I’ll simply mention that there are several layers of encoding to be unraveled here: spoken human language > written language in text form > digital / binary > DNA. And presumably readability requires tools on the order of at least what we have available today, unless you can make the expression of the information obvious in some biological way.

Wonderfully exciting new stuff to conjure with from the perspective of technologies for the Long Now Library. We are also delighted to be working with Dr. George Church to provide Rosetta / PanLex data that may be written in a new “edition” of the DNA book, so check back for updates!