Blog Archive for the ‘Digital Dark Age’ Category


Digital Dark Age On The Media

Posted on Monday, June 29th, 02015 by Alexander Rose - Twitter: @zander
Categories: Digital Dark Age


On this week’s episode of On the Media, the hosts dive into the digital preservation issue: what would happen if we, as a species, lost access to our electronic records? What if, whether by the slow creep of technological obsolescence or by sudden cosmic disaster, we could no longer draw from the well of knowledge accrued through the ages? What if we fell into…a digital dark age?


New Horizons Probe to Send Message to Interstellar Space

Posted on Tuesday, April 28th, 02015 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Thinking, Rosetta, The Big Here

If you could tell the universe about planet Earth, what would you say?

The One Earth Message Initiative is sending a missive to the stars, and they want your input.

The initiative’s goal is to create a message that will be digitally uploaded to a spacecraft currently making its way to the outer reaches of our solar system. Launched in 02006, the New Horizons probe will fly by Pluto, its primary target, later this summer. Once it completes this mission and sends its data back to Earth, the One Earth Message team hopes to use the space thus freed up on the probe’s on-board computer for a message that intelligent extraterrestrial life may one day intercept. They’ve petitioned NASA with more than 10,000 signatures of support from people all over the world, and received the agency’s encouragement to move forward with the project.

The effort is headed by Jon Lomberg, a long-time collaborator of the late astronomer Carl Sagan, who has decades of experience in the aesthetic design of communications both to and about the distant reaches of our universe. He was design director for the Golden Records that have been traveling aboard the Voyager crafts since the late 01970s, and has collaborated on numerous documentaries, films, and blogs about space exploration.

This new project unites his interest in outreach to the earthbound public with his passion for communicating with the universe. The One Earth Message team hopes to crowd-source their message to the fullest extent possible. They intend to create an internet platform where people from all over the globe can submit images for inclusion in the message and review submissions sent in by others. An advisory board of 86 specialists in a variety of fields – among them Long Now’s own Laura Welcher – will help curate submissions and put together a message that represents the diversity of our global community.

People from every country will have the opportunity to submit photos and other content. Everyone will have the chance to view and vote online for the ones they think should be sent. It will be a global project that brings the people of the world together to speak as one. Who will speak for Earth? YOU WILL! So we are asking for your support to make it so. (Fiat Physica Campaign page)

The team is currently in the midst of a fundraising campaign to build the message website and spread word of the project around the globe. If the campaign is successful, stretch goals include the development of educational material to encourage creative engagement with One Earth Message, and expeditions to the remotest corners of Earth to make sure even the voices living there are included in the New Horizons message.

While there is a possibility that the message could one day reach alien recipients, The One Earth Message organization sees its project primarily as a way to inspire a sense of global unity, much like the Golden Records did – and like Stewart Brand once thought a picture of Earth from space might do.

For almost 40 years, people have been inspired by the Voyager record, a portrait of the Earth in 1977 … The world is very different now, and this new message will reflect the hopes and dreams of the second decade in the 21st century. It will inspire young people’s interest in science and ignite the imagination of all ages. We hope it will be an example of global creativity and cooperation, something that the entire planet can share as a cooperative venture … (space.com)

Artist's impression of the Rosetta spacecraft flying past an asteroid

In other words, the New Horizons message is a way to start a conversation – with alien life, but also with ourselves. Aside from being a form of communication, it can also be thought of as a self-portrait. Like the Rosetta Disk aboard the European Space Agency’s Rosetta probe, the New Horizons message will be a record of who we are as a global community. As Laura Welcher said of the Rosetta mission,

It’s interesting to think why people do this, why we send messages into space. I think partly we’re trying to commemorate special events … partly we’re also trying to communicate with ourselves; our current selves, and perhaps our future selves. … These messages that we’re sending into space are proxies for us. They are our ambassadors, and they go where we physically cannot go.

The creation of a self-portrait requires reflection on who we are, and who we want to be. It holds us accountable to the image we present to the world. Like any self-portrait, the One Earth Message is at least partly aspirational – it’s meant to compel continual engagement with ourselves and our own betterment; to inspire us always to strive to be our best selves.

To learn more about One Earth Message and ways to contribute, please visit the project’s fundraising page, or follow the project on Twitter.

The Front Line of Language Extinction

Posted on Friday, April 17th, 02015 by Andrew Warner
Categories: Digital Dark Age, PanLex, Rosetta

We live in an era of mass extinction of linguistic heritage. Thousands of years of ancestral knowledge and stories are vanishing with the last speakers of hundreds of languages. Come and find out how mobile devices and social media are being used to preserve the “wisdom of the tribe” for generations far into the future.

Linguists worldwide are engaged in the urgent task of recording the world’s languages while there is still time. Oral cultures are in particular jeopardy because they lack a written record. But languages are disappearing more quickly than they can be preserved, and so a new project is trying to ramp up the pace using mobile technologies.

Steven Bird, a linguist and computer scientist who spoke for us at The Interval in November 02014, has been testing a new mobile app in Amazonia, Melanesia, and Central Asia. The app, called Aikuma, has been designed by Steven and his team to let speakers of endangered languages record and translate their stories and songs. When Steven visited The Interval, he ran a hands-on demonstration of the app, facilitated a discussion of some thorny issues it raised, and shared some of his ingenious solutions. In this recent interview with the Australian Broadcasting Corporation, Steven Bird explains how the app works and how it can be used to save endangered languages.


The above photos are from the village of Terra Preta, near Manaus, in the heart of the Brazilian Amazon. Steven’s team worked with local speakers of the Nhengatu language to record, translate, and transcribe the stories of the rainforest. One of the products is a story book illustrated by the children of the village, which has been uploaded to the Internet Archive where anyone can access it.

Steven Bird is a Senior Research Associate at the Linguistic Data Consortium at UPenn and Associate Professor of Computing and Information Systems at the University of Melbourne, Australia. He travels extensively to remote indigenous communities and through a variety of projects he works to bring the power of technology to bear on efforts to preserve the world’s endangered languages.

The Near and Far Future of Libraries

Posted on Monday, March 2nd, 02015 by Andrew Warner
Categories: Digital Dark Age, Futures


“The Near and Far Future of Libraries”, an article in the new publication “Hopes & Fears”, includes an interview with Long Now’s Dr. Laura Welcher on the dangers of the “digital dark age”.

Laura Welcher is Director of the Rosetta Project, The Long Now Foundation’s language-preservation effort that explores storage media that will last thousands of years.

 

Keeping The Net’s Long Memory Stocked: Jason Scott @ The Interval, February 24, 02015

Posted on Wednesday, February 18th, 02015 by Mikl Em
Categories: Digital Dark Age, Events, Long Term Thinking, Technology, The Interval

Jason Scott of Archive Team and Archive.org

February 24, 02015
Jason Scott (archivist, historian, filmmaker)
The Web in an Eyeblink at The Interval

Tickets are now on sale: space is limited and we expect this talk to sell out

If you are reading this, then Jason Scott has probably backed up bits that matter to you – whether you are an ex-SysOp or only use the Web to read this blog. Jason works on no fewer than three important online archives, each of which is invaluable in preserving our digital history. He’s also made two documentaries about pre-Web networked-computer culture: The BBS Documentary and Get Lamp.

Jason Scott with Internet Archive founder Brewster Kahle at ROFLCon Summit

Jason created textfiles.com in 01998 to make thousands of files once found on online bulletin board systems (BBSs) available again after they had become scarce in the era of the World Wide Web. He founded Archive Team in 02009 to rescue user data from popular sites that shut down with little notice; this loose collective has launched “Distributed Preservation of Service Attacks” to save content from Friendster, GeoCities and Google Reader amongst others. In 02011 Jason began curating the Internet Archive‘s software collection: the largest online library of vintage and historical software in the world.

Long Now welcomes Jason Scott to our Conversations at The Interval series:
“The Web in an Eye Blink” on Tuesday, February 24, 02015

The Internet is a network of networks that can bring the far reaches of the world closer, seemingly in real time. A Big Here in a short Now. But there is also a Long Now of the Internet, in that it connects us through time: a shared memory of readily accessible information – accessible, that is, as long as the resource exists somewhere on the system. So the Internet should give worldwide access to our long memory, all the content we’ve ever put online, right? Unfortunately there are some challenges. But happily we have allies.

The network path to a specific document is a technological chain. And it can be a brittle one. The chain’s components include servers, cables, protocols, programming code, and even human decisions. If one connection in the chain fails – whether it’s the hardware, software, or just a hyperlink – the information becomes inaccessible. And perhaps it’s lost forever. This problem is an aspect of what we call the Digital Dark Age.
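That fragility can be made concrete in a few lines of code. Below is a minimal sketch of a link checker; the names `link_alive` and `classify_status` are illustrative, not drawn from any archiving tool. A HEAD request exercises the whole chain – DNS, TCP, HTTP – and any failed link means the document is, for now, unreachable.

```python
import urllib.error
import urllib.request


def link_alive(url, timeout=10):
    """Follow the technological chain for one URL: DNS, TCP, then HTTP.
    Returns True only if every link in the chain holds and the server
    answers with a success status."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except OSError:
        # Any broken link -- bad hostname, dead server, refused
        # connection -- surfaces here and breaks the chain.
        return False


def classify_status(code):
    """Map an HTTP status code to a rough archival verdict."""
    if 200 <= code < 300:
        return "accessible"
    if code in (301, 302, 307, 308):
        return "moved"      # chain re-routed; content may survive elsewhere
    if code in (404, 410):
        return "missing"    # classic link rot
    return "uncertain"
```

Note that `classify_status(404)` and `classify_status(410)` both count as link rot: the server is alive, but the document at the end of the chain is gone.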

The Dilemma of Modern Media

The high-technology industry is, by its nature, driven by innovation and obsolescence; new models and software updates often undermine that network chain in the name of progress. But the tech industry’s competitive business environment poses another threat to our online memory. Mergers and acquisitions shift product offerings regardless of customer sentiment, let alone the historical importance of a site or service. Users who have invested time and effort in creating content and customizing their accounts often get little notice of a site’s impending demise. And it’s rare that companies provide tools or guidance to help customers preserve their own data. That’s why Jason started Archive Team: to save digital assets for users not empowered to do so themselves. Initially reactive, the team now keeps a “Death Watch” to warn users – and to keep themselves alert ahead of time – about sites that don’t look long for this world.

There is no FDIC for user data. Jason Scott and the Archive Team are all we’ve got.

Jason Scott at ROFLCon II (photo by Scott Beale)

Jason’s advice is to help ourselves. As he said about the strange case of Twitpic.com:

Assuming that your photographs, writing, email, or other data is important to you … you should always be looking for an export function or a way to save a local backup. If a company claims it’s too hard, they are lying. If they claim that they have everything under control, ignore that and demand your export and local backup again.

The broken link may be the most pernicious of the many breaks that can occur in the network chain. When a linked file is removed or renamed, even if it has been available for years, all access is instantly cut off. The Wayback Machine, a database of 452 billion previously downloaded web pages, addresses that problem, and it is the main feature of archive.org that most people use today.
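The Wayback Machine also exposes a simple availability API at archive.org/wayback/available, which returns the closest archived snapshot for a URL as JSON. Here is a minimal sketch of building a query and parsing the reply; the helper names are illustrative, and the response handling is a best-effort reading of the API’s documented shape:

```python
import json
from urllib.parse import urlencode

WAYBACK_API = "https://archive.org/wayback/available"


def availability_query(url, timestamp=None):
    """Build a query URL for the Wayback Machine's availability API."""
    params = {"url": url}
    if timestamp:
        # YYYYMMDD: ask for the snapshot closest to this date
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urlencode(params)


def closest_snapshot(response_text):
    """Pull the closest archived snapshot (if any) out of the JSON reply.
    Returns the snapshot URL, or None when nothing is archived."""
    data = json.loads(response_text)
    snap = data.get("archived_snapshots", {}).get("closest")
    if snap and snap.get("available"):
        return snap["url"]
    return None
```

So a dead link on the live web need not be a dead end: one API call can often redirect a reader to the page as it once was.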

But the Internet Archive is much more than a database of web pages. It is a non-profit library whose ever-growing collection of books, movies, software, music, and more is available online for free. The Archive preserves and expands our collective memory. And Jason’s work with archiving vintage software is especially groundbreaking.

Thousands of computer and arcade games have been added to the Archive in the last year; many have been saved from complete extinction in the process. But Jason and team have done more than that. They have built a website on which this code can run, so that the games are playable again. They did it with JSMESS, JavaScript code that can emulate nearly 1,000 vintage computing and gaming platforms. The physical components these programs once required will never be manufactured again, but that has ceased to be a limitation: hardware (computers, gaming consoles, disk drives) is now a much weaker link in the technological chain, if it is one at all.
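Emulation of this sort works by re-implementing a platform’s hardware in software: a loop that fetches each instruction, decodes it, and executes it against simulated registers and memory. None of the following is JSMESS code; it is a toy four-instruction machine, sketched only to show the shape of the fetch-decode-execute loop at an emulator’s heart:

```python
def run(program, memory=None):
    """A toy fetch-decode-execute loop -- the core idea behind emulators,
    reduced to a four-instruction machine with one accumulator register.
    'program' is a list of (opcode, operand) pairs; returns the final
    accumulator value and memory contents."""
    mem = list(memory or [0] * 8)
    acc = 0   # accumulator register
    pc = 0    # program counter
    while pc < len(program):
        op, arg = program[pc]   # fetch
        pc += 1
        if op == "LOAD":        # decode + execute
            acc = mem[arg]      # load memory cell into the accumulator
        elif op == "ADD":
            acc += mem[arg]     # add memory cell to the accumulator
        elif op == "STORE":
            mem[arg] = acc      # write the accumulator back to memory
        elif op == "HALT":
            break
    return acc, mem
```

A real emulator does the same thing for an actual CPU, cycle by cycle, alongside models of the video, sound, and disk hardware, which is why a 01979 game can run unmodified inside a modern browser.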

And these games, which ran on Apple IIs, TRS-80s, Atari 2600s, and other machines from the historically important and nostalgia-rich era of the 01980s and 01990s, will now run in many 21st-century web browsers.

Which browsers? Maybe your browser. Can you see a black box below? If so, click the button and you’ll have a chance to play Lemonade Stand, a 01979 game for the Apple ][. Have fun. And be patient. Things were slower then.

You can thank Jason Scott for uploading this and thousands of other fun, useful, or just old pieces of software.

In fact, you can thank him in person when he speaks at The Interval at Long Now this Tuesday, February 24, 02015: “The Web in an Eye Blink”. Jason will talk about his work in the frame of the Long Now.

Get your tickets soon, we expect this talk to sell out.

See Andy Baio’s piece on Medium for more thoughts on Jason’s work and the implications of the Archive’s software collection.

Photos by Jason Scott unless otherwise noted

The Cosmological Limits of Information Storage

Posted on Thursday, February 12th, 02015 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Science


An important part of long-term thinking is the never-ending search for very long-lived methods of information storage. A perfect, eternal storage medium still eludes us; most of the ones we’ve invented and used over the course of civilization have had their limitations – even stone, nickel, and sapphire have a shelf life.

But new research by a team of physicists now suggests that searching for a storage medium that lives forever may be a waste of energy, because the laws of physics themselves limit the amount of time that any information can be kept.

In a paper recently published by the New Journal of Physics, the researchers review how spacetime dynamics might influence the storage of information by asking how much data we can reliably hold on to from the beginning to the end of time.

In order to answer that question, the team combined Einsteinian cosmology with quantum theories about the nature of matter and reality. They worked with a standard model of the universe, called the Friedmann-Lemaître-Robertson-Walker metric: based on Einstein’s theory of general relativity, it describes a universe that is homogeneous and isotropic, and therefore expands (or contracts) uniformly in all directions.

Working with this metric, the researchers modeled what would happen to stored data over the course of universe expansion. When you encode information into some kind of matter and then track what happens to your storage medium throughout the life course of the universe, you’ll find that the quantum state of its matter (in other words, its properties: its position, momentum, and spin) will eventually and inevitably change. The research team was able to prove that this change in state creates ‘noise’ that dampens the stored information. One of the research physicists explains the process in this video abstract of the paper:

The faster the universe expands, the team argues, the more ‘noise’ interferes with stored data. Looking at the storage of both classical information (anything encoded in bits) and quantum information (anything encoded by the quantum state of a given particle), they conclude that not very much data will last from the beginning to the end of time.

In other words, it seems as though we may be doomed to an eventual quantum dark age. Unless, of course, we always take care to anticipate these state changes, and continuously forward migrate our data.
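Forward migration has a standard archival form: record a checksum (a “fixity” value) when an object is stored, then re-verify it before every copy to fresh media, so accumulated noise is caught while a clean replica still exists. A minimal sketch, assuming SHA-256 as the fixity algorithm; the function names are illustrative, not any particular archive’s code:

```python
import hashlib


def fixity(data: bytes) -> str:
    """Checksum a stored object so later copies can be verified bit-for-bit."""
    return hashlib.sha256(data).hexdigest()


def migrate(data: bytes, recorded_digest: str) -> bytes:
    """Forward-migrate one object: verify it against its recorded fixity
    value before copying it to a fresh medium. Raises if noise has
    crept into the stored bits."""
    if fixity(data) != recorded_digest:
        raise ValueError("bit rot detected; restore from a replica first")
    return bytes(data)  # in practice: write this to the new medium
```

Repeated on a schedule shorter than the medium’s expected lifetime, this loop lets data outlive any single storage technology, at least until the physics described above has the final word.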

Software as Language, as Object, as Art

Posted on Tuesday, November 25th, 02014 by Chia Evers
Categories: Digital Dark Age, Rosetta, Technology

Rosetta Disk
 

When The Long Now Foundation first began thinking about long-term archives, we drew inspiration from the Rosetta Stone, a 2000-year-old stele containing a Ptolemaic decree in Ancient Egyptian hieroglyphics, Demotic script, and Ancient Greek. Our version of the Rosetta Stone, the Rosetta Disk, includes parallel texts in more than 1,500 languages. Since creating the Disk (a copy of which is now orbiting Comet 67P/Churyumov-Gerasimenko on board the European Space Agency’s Rosetta probe), we have also partnered with the Internet Archive to create an online supplement that currently contains information on some 2,500 languages.

One of our purposes in creating The Rosetta Project was to encourage the preservation of endangered human languages. In a recent event at The Interval, The Future of Language, we explored the role these languages play in carrying important cultural information, and their correlation with biodiversity worldwide.

While we have focused our efforts on spoken languages and their written analogues, other organizations have begun preserving software—not just the end results, but the software itself. This is not only a way of archiving useful information and mitigating the risks of a digital dark age, but also a path to better understand the world we live in. As Paul Ford (a writer and programmer who digitized the full archive of Harper’s Magazine) wrote in The Great Works of Software, “The greatest works of software are not just code or programs, but social, expressive, human languages. They give us a shared set of norms and tools for expressing our ideas about words, or images, or software development.”

Matthew Kirschenbaum, the Associate Director of the Maryland Institute for Technology in the Humanities, made a similar point in the opening address of the Digital Preservation 2014 Meeting at the Library of Congress. In discussing George R. R. Martin’s idiosyncratic choice to write his blockbuster, doorstopper series A Song of Ice and Fire on an air-gapped machine running DOS and WordStar, Kirschenbaum notes that “WordStar is no toy or half-baked bit of code: on the contrary, it was a triumph of both software engineering and what we would nowadays call user-centered design.”

In its heyday, WordStar appealed to many writers because its central metaphor was that of the handwritten, not the typewritten, page. Robert J. Sawyer, whose novel Calculating God is a candidate for the Manual of Civilization, described the difference like this:

Consider: On a long-hand page, you can jump back and forth in your document with ease. You can put in bookmarks, either actual paper ones, or just fingers slipped into the middle of the manuscript stack. You can annotate the manuscript for yourself with comments like ‘Fix this!’ or ‘Don’t forget to check these facts’ without there being any possibility of you missing them when you next work on the document. And you can mark a block, either by circling it with your pen, or by physically cutting it out, without necessarily having to do anything with it right away. The entire document is your workspace.

Screenshot of the WordStar interface

If WordStar does offer a fundamentally different way of approaching digital text, then it’s reasonable to believe that authors using it may produce different work than they would with the mass-market behemoth, Microsoft Word, or one of the more modern, artisanal writing programs like Scrivener or Ulysses III, just as multi-lingual authors find that changing languages changes the way they think.

Speak, Memory

Samuel Beckett famously wrote certain plays in French, because he found that it made him choose his words more carefully and think more clearly; in the preface to Speak, Memory, Vladimir Nabokov said that the “re-Englishing of a Russian re-version of what had been an English re-telling of Russian memories in the first place, proved a diabolical task.” Knowing that A Game of Thrones was written in WordStar or that Waiting for Godot was originally titled “En attendant Godot” may nuance our appreciation of the texts, but we can go even deeper into the relationship between software and the results it produces by examining its source code.

This was the motivation for the Cooper Hewitt, Smithsonian Design Museum’s recent acquisition of the code for Planetary, a music player for iOS that envisions each artist in the music library as a sun orbited by album-planets, each of which is orbited in turn by a collection of song-moons. In explaining its decision to acquire not only a physical representation of the code, such as an iPad running the app, but the code itself, Cooper-Hewitt said,

With Planetary, we are hoping to preserve more than simply the vessel, more than an instantiation of software and hardware frozen at a moment in time: Commit message fd247e35de9138f0ac411ea0b261fab21936c6e6 authored in 2011 and an iPad2 to be specific.

Cooper-Hewitt’s Planetary announcement also touches on another challenge in archiving software.

[P]reserving large, complex and interdependent systems whose component pieces are often simply flirting with each other rather than holding hands is uncharted territory. Trying to preserve large, complex and interdependent systems whose only manifestation is conceptual—interaction design say or service design—is harder still.

One of the ways the Museum has chosen to meet this challenge is to open-source the software, inviting the public to examine the code, modify it, or build new applications on top of it.

The open-source approach has the advantage of introducing more people to a particular piece of software—people who may be able to port it to new systems, or simply maintain their own copies of it. As we have said in reference to the Rosetta Project, “One of the tenets of the project is that for information to last, people have to care about and engage it.” However, generations of software have already been lost, abandoned, or forgotten, like the software that controls the International Cometary Explorer. Other software has been preserved, but locked into black boxes like the National Software Reference Library at NIST, which includes some 20 million digital signatures, but is available only to law enforcement.

The International Cometary Explorer, a spacecraft we are no longer able to talk to

While there is no easy path to archiving software over the long term, the efforts of researchers like Kirschenbaum, projects like the Internet Archive’s Software Collection, and enthusiastic hackers like the Carnegie Mellon Computer Club, who recently recovered Andy Warhol’s digital artwork, are helping create awareness of the issues and develop potential solutions.

Original Warhol artwork, created on an Amiga 1000 in 01985

 

New Book Explores the Legacy of Paul Otlet’s Mundaneum

Posted on Tuesday, September 23rd, 02014 by Charlotte Hajer
Categories: Digital Dark Age, Technology


In 02007, SALT speaker Alex Wright introduced us to Paul Otlet, the Belgian visionary who spent the first half of the twentieth century building a universal catalog of human knowledge, and who dreamed of creating a global information network that would allow anyone virtual access to this “Mundaneum.”

In June of this year, Wright released a new monograph that examines the impact of Otlet’s work and dreams within the larger history of humanity’s attempts to organize and archive its knowledge. In Cataloging The World: Paul Otlet and the Birth of the Information Age, Wright traces the visionary’s legacy from its idealistic mission through the Mundaneum’s destruction by the Nazis, to the birth of the internet and the data-driven world of the 21st century.

Otlet’s work on his Mundaneum went beyond a simple wish to collect and catalog knowledge: it was driven by a deeply idealistic vision of a world brought into harmony through the free exchange of information.

An ardent “internationalist,” Otlet believed in the inevitable progress of humanity towards a peaceful new future, in which the free flow of information over a distributed network would render traditional institutions – like state governments – anachronistic. Instead, he envisioned a dawning age of social progress, scientific achievement, and collective spiritual enlightenment. At the center of it all would stand the Mundaneum, a bulwark and beacon of truth for the whole world. (Wright 02014)

Otlet imagined a system of interconnected “electric telescopes” with which people could easily access the Mundaneum’s catalog of information from the comfort of their homes – a ‘world wide web’ that would bring the globe together in shared reverence for the power of knowledge. But sadly, his vision was thwarted before it could reach its full potential. Brain Pickings’ Maria Popova writes,

At the peak of Otlet’s efforts to organize the world’s knowledge around a generosity of spirit, humanity’s greatest tragedy of ignorance and cruelty descended upon Europe. As the Nazis seized power, they launched a calculated campaign to thwart critical thought by banning and burning all books that didn’t agree with their ideology … and even paved the muddy streets of Eastern Europe with such books so the tanks would pass more efficiently.

Otlet’s dream of open access to knowledge obviously clashed with the Nazis’ effort to control the flow of information, and his Mundaneum was promptly shut down to make room for a gallery displaying Third Reich art. Nevertheless, Otlet’s vision survived, and in many ways inspired the birth of the internet.

While Otlet did not by any stretch of the imagination “invent” the Internet — working as he did in an age before digital computers, magnetic storage, or packet-switching networks — nonetheless his vision looks nothing short of prophetic. In Otlet’s day, microfilm may have qualified as the most advanced information storage technology, and the closest thing anyone had ever seen to a database was a drawer full of index cards. Yet despite these analog limitations, he envisioned a global network of interconnected institutions that would alter the flow of information around the world, and in the process lead to profound social, cultural, and political transformations. (Wright 02014)

Still, Wright argues, some characteristics of today’s internet fly in the face of Otlet’s ideals even as they celebrate them. The modern world wide web is predicated on an absolute individual freedom to consume and contribute information, resulting in an amorphous and decentralized network of information whose provenance can be difficult to trace. In many ways, this defies Otlet’s idealistic belief in a single repository of absolute and carefully verified truths, open access to which would lead the world to collective enlightenment. Wright wonders,

Would the Internet have turned out any differently had Paul Otlet’s vision come to fruition? Counterfactual history is a fool’s game, but it is perhaps worth considering a few possible lessons from the Mundaneum. First and foremost, Otlet acted not out of a desire to make money — something he never succeeded at doing — but out of sheer idealism. His was a quest for universal knowledge, world peace, and progress for humanity as a whole. The Mundaneum was to remain, as he said, “pure.” While many entrepreneurs vow to “change the world” in one way or another, the high-tech industry’s particular brand of utopianism almost always carries with it an underlying strain of free-market ideology: a preference for private enterprise over central planning and a distrust of large organizational structures. This faith in the power of “bottom-up” initiatives has long been a hallmark of Silicon Valley culture, and one that all but precludes the possibility of a large-scale knowledge network emanating from anywhere but the private sector.

Nevertheless, Wright sees in Otlet’s vision a useful ideal to keep striving for:

Otlet’s Mundaneum will never be. But it nonetheless offers us a kind of Platonic object, evoking the possibility of a technological future driven not by greed and vanity, but by a yearning for truth, a commitment to social change, and a belief in the possibility of spiritual liberation. Otlet’s vision for an international knowledge network—always far more expansive than a mere information retrieval tool—points toward a more purposeful vision of what the global network could yet become. And while history may judge Otlet a relic from another time, he also offers us an example of a man driven by a sense of noble purpose, who remained sure in his convictions and unbowed by failure, and whose deep insights about the structure of human knowledge allowed him to peer far into the future…

Wright summarizes Otlet’s legacy with a simple question: are we better off when we safeguard the absolute individual freedom to consume and distribute information as we see fit, or should we be making a more careful effort to curate the information we are surrounded by? It’s a question that we see emerging with growing urgency in contemporary debates about privacy, data sharing, and regulation of the internet – and our answer to it is likely to play an important role in shaping the future of our information networks.

To learn more about Cataloging the World, please take a look at Maria Popova’s thoughtful review, or visit the book’s website.

 

Retrocomputing Brings Warhol’s Lost Digital Art Back to Life

Posted on Friday, May 16th, 02014 by Catherine Borgeson
Categories: Digital Dark Age, Long Term Art

In 01985, Andy Warhol used an Amiga 1000 personal computer and the GraphiCraft software to create a series of digital works. Warhol’s early computer artworks are now viewable after 30 years of dormancy.

Commodore International commissioned Warhol to appear at the product launch and produce a few public pieces showing off the Amiga’s multimedia capabilities. According to a report by the Studio for Creative Inquiry, “Warhol’s presence was intended to convey the message that this was a highly sophisticated yet accessible machine that acted as a tool for creativity.”

In addition to the public pieces, Warhol made digital works on his own time. Commodore gave him a variety of pre-release hardware and software, which eventually led him to experiment with digital photography and video capture, edit animation, and compose digital audio pieces. The Studio for Creative Inquiry’s report states:

All of this (digital photography, video capturing, animation editing, and audio composition) had been done to limited extents earlier, but Warhol was an incredibly early adopter in this arena and may be the first major artist to explore many of these mediums of computer art. He almost certainly was the earliest (if not the only, given several pre-release statuses) possessor of some of this hardware and software and, given their steep later sale prices, possibly the only person to have such a collection.

Andy2

Decades later, artist Cory Arcangel learned of Warhol’s Amiga experiments from a YouTube clip (shown in the video above) of Warhol at the launch altering a photo of Deborah Harry with what would nowadays be considered basic digital art tools, such as flood fills. This scene set in motion a three-year quest of technological feats and multidisciplinary collaboration to recover and catalog the previously unknown Warhol artworks stored on degrading 30-year-old Amiga floppy disks.

According to his contract with Commodore, Warhol owned the rights to any hardware given to him and to all the work he created with the machines. After his death, his files and machines were stashed away, unpublished, in the archives at the Warhol Museum. The collection contained two Amiga 1000 computers (one of which was never used), parts of a video-capture hardware setup, a drawing tablet, and an assortment of floppy disks, mostly commercial software in its original boxes:

It was immediately clear that this was an exciting window into history given that several pieces of Amiga hardware had shipping labels directly from Commodore, others had internal Commodore labels warning that the components were not yet for sale lacking FCC approval, and that the drawing tablet appeared hand-made.

 Amiga computer equipment used by Andy Warhol. Photo: Andy Warhol Museum

In December 02011, Arcangel approached the Warhol Museum with a proposal to restore the Amiga hardware and archive the contents of the associated disks. In April 02012, he teamed up with the Carnegie Mellon Computer Club, a team of experts in obsolete computer maintenance and software preservation, to retrocompute and forensically extract data from roughly 40 Amiga disks. Ten of those disks were found to contain at least 13 graphics files thought to have been created or altered by Warhol.

warhol-amiga-recovery-sm
Cory Arcangel (Center), and CMU Computer Club members Michael Dille (Left), Keith A. Bare (Right) during the data recovery process at The Andy Warhol Museum. Photo: Hillman Photography Initiative, CMOA.

With a lot of hacking, sleuthing, and extensive Amiga knowledge, the CMU Computer Club figured out how to examine the contents of Warhol’s disks. Simplified, it boils down to a two-tier process: first creating copies of the disks as standard filesystem-level disk images, then looking through the files on those images to see whether any were in known graphic formats. Some were, some weren’t. It took months of retrocomputing with the GraphiCraft software to convert the unknown graphic formats into files that can be opened today.
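The second tier of that process can be sketched in a few lines: once a disk image is mounted, each recovered file is checked against the "magic bytes" that open known formats. The signature table and function below are purely illustrative assumptions, not the Computer Club's actual tooling.

```python
# Illustrative sketch of format sniffing: compare each recovered file's
# leading bytes against signatures of formats modern tools can open.
# The signature table is an assumption for illustration only.
MAGICS = {
    b"FORM": "IFF container (possibly an ILBM image)",  # generic IFF header
    b"\x89PNG\r\n\x1a\n": "PNG image",
}

def sniff_format(data: bytes) -> str:
    """Return a human-readable guess at a file's format, or flag it as unknown."""
    for magic, name in MAGICS.items():
        if data.startswith(magic):
            return name
    return "unknown -- needs reverse engineering"
```

Files that come back "unknown" — like Warhol's .pic files — are the ones that demand the deeper reverse-engineering work described below.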

To extract data and generate an archival dump, the Computer Club used a USB device called KryoFlux, which attaches a floppy drive to a modern PC and reads and writes standard-format floppy disks. Its real advantage, though, is its ability to capture a very low-level picture of a disk. The KryoFlux produced both raw dumps as close as possible to the original floppies and standard filesystem disk images (ADF files) that work with Amiga emulators. It also allowed for gentler handling of the fragile, degrading disks (many had magnetic material flaking off the substrate) and could generate standard ADF files even from floppies with non-standard encodings or copy-protection schemes.
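For context on what a "standard filesystem disk image" contains: an ADF file is simply a raw, in-order dump of every sector on the disk, so its size is fully determined by the floppy's geometry. A quick sanity check (the geometry figures are standard double-density Amiga values, not taken from the article):

```python
# A standard ADF is a raw, in-order dump of every sector on a
# double-density Amiga floppy: 80 cylinders x 2 heads x 11 sectors x 512 bytes.
CYLINDERS, HEADS, SECTORS, SECTOR_BYTES = 80, 2, 11, 512

adf_size = CYLINDERS * HEADS * SECTORS * SECTOR_BYTES
print(adf_size, adf_size // 1024)  # 901120 bytes, i.e. 880 KiB
```

Anything the KryoFlux couldn't resolve into that fixed sector layout — copy-protected or non-standard tracks — is where the raw flux dumps become essential.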

The day after copying Warhol’s disks, the disk images were loaded into an Amiga emulator in the basement of CMU Computer Club member Michael Dille. The disk hand-labeled “GraphiCraft” contained files with names like “flower.pic,” “campbells.pic,” and “marilyn1.pic” – a clear sign that something was on that disk. The .pic files were unrecognizable by modern software and would later require the club’s resident Amiga expert, Keith Bare, to do deeper hacking and reverse engineering to crack the GraphiCraft format. But on that day, two files on the same disk, named “tut” and “venus,” were in a common format used on Amigas, the Interchange File Format (IFF). These two files became readable once ImageMagick was used to convert them to PNG – a format modern software understands. On the evening of March 2, 02013, Warhol’s Venus displayed on a screen for the first time in 30 years.
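The IFF format that made “tut” and “venus” readable is a simple chunked container: a FORM header, a type tag such as ILBM, then tagged chunks padded to even lengths. As a hedged sketch — the chunk layout follows the published EA IFF 85/ILBM specification, and the helper name is my own — the image dimensions can be pulled from an ILBM's BMHD chunk like this:

```python
import struct

def read_ilbm_header(data: bytes):
    """Parse the BMHD chunk of an IFF ILBM file and return (width, height, planes).

    Minimal sketch: real files also carry CMAP, CAMG, BODY, and other chunks,
    which a full decoder would need to interpret.
    """
    if data[0:4] != b"FORM" or data[8:12] != b"ILBM":
        raise ValueError("not an IFF ILBM file")
    pos = 12
    while pos + 8 <= len(data):
        chunk_id = data[pos:pos + 4]
        size = struct.unpack(">I", data[pos + 4:pos + 8])[0]
        if chunk_id == b"BMHD":
            # width, height (unsigned), x, y origin (signed), bitplane count
            w, h, _x, _y, planes = struct.unpack(">HHhhB", data[pos + 8:pos + 17])
            return w, h, planes
        pos += 8 + size + (size & 1)  # chunks are padded to an even length
    raise ValueError("no BMHD chunk found")
```

That self-describing structure is exactly why off-the-shelf tools like ImageMagick could open the IFF files immediately, while the undocumented GraphiCraft .pic format took months to crack.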

3_Andy_Warhol_Venus_1985_AWF_475px

Warhol’s digital works are proof that the Amiga 1000 was highly impressive for its time. The first of the Amigas, it already had better sound and graphics than its competitors: 4-channel stereo audio, up to 4,096 colors, and a 640×200-pixel display. In comparison, PCs of the era had “beepers” and up to 16 colors, while Macintoshes had 22.5 kHz mono audio and monochrome displays (per the Studio for Creative Inquiry’s report).

These recovered images give insight into the workflow and capabilities of early computer art. In the image above, Warhol used clip art to create the three identical eyes on Venus. The digital reinterpretation of Warhol’s Campbell’s Soup, pictured below, was modified with a line tool. Both show Warhol’s willingness to experiment and adapt to a new medium.

2_Andy_Warhol_Campbells_1985_AWF_475px

Earlier this week, the Carnegie Museum of Art released Trapped, the second of a five-part documentary series by the Hillman Photography Initiative to “investigate the world of images that are guarded, stashed away, barely recognizable or simply forgotten.” The short documentary gives a detailed look at the techno-archaeologists’ process of decoding the obsolete file types. (Minute 8:47 shows the copying of Warhol’s disks.)

The forensic effort of studying the disks’ contents sheds light on the impermanent nature of digital material and the need for digital preservation. “In a way, a lot of the data and things we work with almost seem like it’s imaginary,” explained Bare in Trapped. “It’s electrons in a machine. You can turn it into photons if you use a display, but in some sense it’s almost like it’s not even there.”

ICE/ISEE-3 To Return To An Earth No Longer Capable of Speaking To It

Posted on Monday, February 24th, 02014 by Charlotte Hajer
link   Categories: Digital Dark Age, Long Term Science   chat 0 Comments

International Cometary Explorer (NASA)

This August, a pioneer in space exploration returns to Earth after more than 30 years of service. The spacecraft is still in good, functioning condition, and could possibly be assigned to another mission. Sadly, however, we seem to have forgotten how to speak its language.

The probe, a collaboration between NASA and ESA, was one of three craft launched in 01978 to study the interaction between the solar wind and Earth’s magnetosphere. Named the International Sun-Earth Explorer-3 (ISEE-3), it was the first object ever sent into orbit around the first Lagrangian point – a location between the Earth and Sun where our planet’s gravity offsets just enough of the Sun’s pull that a satellite there circles the Sun in lockstep with Earth.

Upon completion of its mission in 01983, the probe was repurposed and re-christened: now called the International Cometary Explorer (ICE), it circled the moon a few times to gather speed, and then flew off to chase after two comets. ICE intercepted comet Giacobini-Zinner in 01985 before catching up with Halley’s comet in 01986, and making history as the first spacecraft to study two comets directly.

After a brief third mission to study coronal mass ejections, NASA officially decommissioned the probe and shut down communications with its systems. Nevertheless, the agency discovered in 02008 not only that ICE had failed to power off, but also that 12 of its 13 instruments were still functioning. They entertained the idea of sending ICE off to study another corner of the Solar System – only to learn that the equipment needed to communicate with ICE is no longer available, and prohibitively expensive to rebuild. The Planetary Society’s Emily Lakdawalla explains:

Several months of digging through old technical documents has led a group of NASA engineers to believe they will indeed be able to understand the stream of data coming from the spacecraft. NASA’s Deep Space Network (DSN) can listen to the spacecraft, a test in 2008 proved that it was possible to pick up the transmitter carrier signal, but can we speak to the spacecraft? Can we tell the spacecraft to turn back on its thrusters and science instruments after decades of silence and perform the intricate ballet needed to send it back to where it can again monitor the Sun? The answer to that question appears to be no.

For the past 15 years, ICE has been patiently orbiting the Sun slightly faster than Earth does. Now that it’s catching up with us again from behind, researchers realize there’s much more exploration ICE could have helped us with. Unfortunately, we simply can’t muster the resources needed to communicate with it. Lakdawalla muses,

I wonder if ham radio operators will be able to pick up its carrier signal – it’s meaningless, I guess, but it feels like an honorable thing to do, a kind of salute to the venerable ship as it passes by.