Blog Archive for the ‘Digital Dark Age’ Category


Keeping The Net’s Long Memory Stocked: Jason Scott @ The Interval— February 24, 02015

Posted on Wednesday, February 18th, 02015 by Mikl Em
Categories: Digital Dark Age, Events, Long Term Thinking, Technology, The Interval | 0 Comments

Jason Scott of Archive Team and Archive.org

February 24, 02015
Jason Scott (archivist, historian, filmmaker)
The Web in an Eye Blink at The Interval

Tickets are now on sale: space is limited and we expect this talk to sell out

If you are reading this, then Jason Scott has probably backed up bits that matter to you, whether you are an ex-SysOp or only use the Web to read this blog. Jason works on no fewer than three important online archives, each of which is invaluable in preserving our digital history. He has also made two documentaries about pre-Web networked-computer culture: The BBS Documentary and Get Lamp.

Jason Scott with Internet Archive founder Brewster Kahle at ROFLCon Summit

Jason created textfiles.com in 01998 to make thousands of files once found on online bulletin board systems (BBSs) available again after they had become scarce in the era of the World Wide Web. He founded Archive Team in 02009 to rescue user data from popular sites that shut down with little notice; this loose collective has launched “Distributed Preservation of Service Attacks” to save content from Friendster, GeoCities, and Google Reader, among others. In 02011 Jason began curating the Internet Archive’s software collection: the largest online library of vintage and historical software in the world.

Long Now welcomes Jason Scott to our Conversations at The Interval series:
“The Web in an Eye Blink” on Tuesday, February 24, 02015

The Internet is a network of networks that has the ability to bring the far reaches of the world closer, seemingly in real time. A Big Here in a short Now. But there’s a Long Now of the Internet also in that it connects us through time: a shared memory of readily accessible information. Accessible as long as the resource exists somewhere on the system. So the Internet should give worldwide access to our long memory, all the content we’ve ever put online, right? Unfortunately there are some challenges. But happily we have allies.

The network path to a specific document is a technological chain. And it can be a brittle one. The chain’s components include servers, cables, protocols, programming code, and even human decisions. If one connection in the chain fails–whether it’s the hardware, software, or just a hyperlink–the information becomes inaccessible. And perhaps it’s lost forever. This problem is an aspect of what we call the Digital Dark Age.

The Dilemma of Modern Media

The high-technology industry is driven by innovation and obsolescence by its nature, so new models and software updates often undermine that network chain in the name of progress. But the tech industry’s competitive business environment creates another threat to our online memory. Mergers and acquisitions shift product offerings regardless of customer sentiment, let alone the historical importance of a site or service. Users who have invested time and effort in creating content and customizing their accounts often get little notice about a site’s impending demise. And it’s rare that companies provide tools or guidance to enable customers to preserve their own data. That’s why Jason started Archive Team: to save digital assets for users not empowered to do so themselves. Initially reactive, the team now keeps a “Death Watch” to warn users, and to keep itself alert, about sites that don’t look long for this world.

There is no FDIC for user data. Jason Scott and the Archive Team are all we’ve got.

Jason Scott at ROFLCon II. Photo by Scott Beale

Jason’s advice is to help ourselves. As he said about the strange case of Twitpic.com:

Assuming that your photographs, writing, email, or other data is important to you … you should always be looking for an export function or a way to save a local backup. If a company claims it’s too hard, they are lying. If they claim that they have everything under control, ignore that and demand your export and local backup again.

The broken link may be the most pernicious of the many breaks that can occur in the network chain. When a linked file is removed or renamed, even if it has been available for years, all access is instantly cut off. The Wayback Machine, a database of 452 billion previously downloaded web pages, is a solution to that problem, and it’s the main feature of archive.org that most people use today.

But the Internet Archive is much more than a database of web pages. It is a non-profit library with an ever-growing collection of books, movies, software, music, and more, all available online for free. The archive preserves and expands our collective memory. And Jason’s work archiving vintage software is especially groundbreaking.

Thousands of computer and arcade games have been added to the Archive in the last year. Many games have been saved from complete extinction in the process. But Jason and team have done more than that. They have built a website on which this code can run, so that the games are playable again. They did it with JSMESS, JavaScript code that can emulate nearly 1,000 vintage computing and gaming platforms. The fact that the physical components these programs once required will never be manufactured again has ceased to be a limitation: hardware (computers, gaming consoles, disk drives) is no longer the weak link in the technological chain, or at least much less of one.

And these games, which ran on Apple IIs, TRS-80s, Atari 2600s, and other machines from the historically important and nostalgia-rich era of the 01980s and 01990s, will now run in many 21st-century web browsers.

Which browsers? Maybe your browser. Can you see a black box below? If so, click the button and you’ll have a chance to play Apple’s 01979 “Apple ][” game Lemonade Stand. Have fun. And be patient. Things were slower then.

You can thank Jason Scott for uploading this and thousands of other fun, useful, or just old pieces of software.
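If you are curious what else is in that collection, here is a minimal sketch of a query against the Internet Archive’s public advanced-search endpoint, written in Python with the requests library. The collection identifier “softwarelibrary” and the fields requested are assumptions for illustration, not something taken from this post.

```python
# Minimal sketch: list a few items from an Internet Archive collection via the
# public advancedsearch endpoint. The collection identifier "softwarelibrary"
# and the fields requested are assumptions for illustration.
import requests

def list_items(collection: str = "softwarelibrary", rows: int = 10) -> None:
    params = {
        "q": f"collection:{collection}",
        "fl[]": ["identifier", "title"],  # fields to return
        "rows": rows,
        "page": 1,
        "output": "json",
    }
    resp = requests.get("https://archive.org/advancedsearch.php",
                        params=params, timeout=30)
    resp.raise_for_status()
    for doc in resp.json()["response"]["docs"]:
        print(doc.get("identifier"), "-", doc.get("title"))

if __name__ == "__main__":
    list_items()
```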

In fact, you can thank him in person when he speaks at The Interval at Long Now this Tuesday, February 24, 02015: “The Web in an Eye Blink”. Jason will talk about his work in the frame of the Long Now.

Get your tickets soon; we expect this talk to sell out.

See Andy Baio’s piece on Medium for more thoughts on Jason’s work and the implications of the Archive’s software collection.

Photos by Jason Scott unless otherwise noted

The Cosmological Limits of Information Storage

Posted on Thursday, February 12th, 02015 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Science | 0 Comments


An important part of long-term thinking is the never-ending search for very long-lived methods of information storage. A perfect, eternal storage medium still eludes us; most of the ones we’ve invented and used over the course of civilization have had their limitations – even stone, nickel, and sapphire have a shelf life.

But new research by a team of physicists now suggests that searching for a storage medium that lives forever may be a waste of energy, because the laws of physics themselves limit the amount of time that any information can be kept.

In a paper recently published in the New Journal of Physics, the researchers examine how spacetime dynamics might influence the storage of information by asking how much data we can reliably hold on to from the beginning to the end of time.

In order to answer that question, the team combined Einsteinian cosmology with quantum theories about the nature of matter and reality. They worked with a standard model of the universe, the Friedmann-Lemaître-Robertson-Walker metric: based on Einstein’s theory of general relativity, it describes a universe that is homogeneous and isotropic, and that therefore expands (or contracts) uniformly in all directions.
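For reference, the FLRW line element has the standard textbook form (nothing here is specific to the paper; a(t) is the scale factor that encodes the uniform expansion or contraction, and k sets the spatial curvature):

$$ ds^2 = -c^2\,dt^2 + a(t)^2\left[\frac{dr^2}{1 - kr^2} + r^2\left(d\theta^2 + \sin^2\theta\, d\phi^2\right)\right] $$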

Working with this metric, the researchers modeled what would happen to stored data over the course of the universe’s expansion. When you encode information into some kind of matter and then track what happens to your storage medium throughout the life of the universe, you’ll find that the quantum state of its matter (in other words, its properties: its position, momentum, and spin) will eventually and inevitably change. The research team was able to prove that this change in state creates ‘noise’ that degrades the stored information. One of the research physicists explains the process in this video abstract of the paper:

The faster the universe expands, the team argues, the more ‘noise’ interferes with stored data. Looking at the storage of both classical information (anything encoded in bits) and quantum information (anything encoded by the quantum state of a given particle), they conclude that not very much data will last from the beginning to the end of time.

In other words, it seems as though we may be doomed to an eventual quantum dark age. Unless, of course, we always take care to anticipate these state changes, and continuously forward migrate our data.

Software as Language, as Object, as Art

Posted on Tuesday, November 25th, 02014 by Chia Evers
Categories: Digital Dark Age, Rosetta, Technology | 0 Comments

Rosetta Disk
 

When The Long Now Foundation first began thinking about long-term archives, we drew inspiration from the Rosetta Stone, a 2000-year-old stele containing a Ptolemaic decree in Ancient Egyptian hieroglyphics, Demotic script, and Ancient Greek. Our version of the Rosetta Stone, the Rosetta Disk, includes parallel texts in more than 1,500 languages. Since creating the Disk (a copy of which is now orbiting Comet 67P/Churyumov-Gerasimenko on board the European Space Agency’s Rosetta probe), we have also partnered with the Internet Archive to create an online supplement that currently contains information on some 2,500 languages.

One of our purposes in creating The Rosetta Project was to encourage the preservation of endangered human languages. In a recent event at The Interval, The Future of Language, we explored the role these languages play in carrying important cultural information, and their correlation with biodiversity worldwide.

While we have focused our efforts on spoken languages and their written analogues, other organizations have begun preserving software—not just the end results, but the software itself. This is not only a way of archiving useful information and mitigating the risks of a digital dark age, but also a path to better understand the world we live in. As Paul Ford (a writer and programmer who digitized the full archive of Harper’s Magazine) wrote in The Great Works of Software, “The greatest works of software are not just code or programs, but social, expressive, human languages. They give us a shared set of norms and tools for expressing our ideas about words, or images, or software development.”

Matthew Kirschenbaum, the Associate Director of the Maryland Institute for Technology in the Humanities, made a similar point in the opening address of the Digital Preservation 2014 Meeting at the Library of Congress. Discussing George R. R. Martin’s idiosyncratic choice to write his blockbuster doorstopper series A Song of Ice and Fire on an air-gapped machine running DOS and WordStar, Kirschenbaum noted that “WordStar is no toy or half-baked bit of code: on the contrary, it was a triumph of both software engineering and what we would nowadays call user-centered design.”

In its heyday, WordStar appealed to many writers because its central metaphor was that of the handwritten, not the typewritten, page. Robert J. Sawyer, whose novel Calculating God is a candidate for the Manual of Civilization, described the difference like this:

Consider: On a long-hand page, you can jump back and forth in your document with ease. You can put in bookmarks, either actual paper ones, or just fingers slipped into the middle of the manuscript stack. You can annotate the manuscript for yourself with comments like ‘Fix this!’ or ‘Don’t forget to check these facts’ without there being any possibility of you missing them when you next work on the document. And you can mark a block, either by circling it with your pen, or by physically cutting it out, without necessarily having to do anything with it right away. The entire document is your workspace.

Screenshot of the WordStar interface

If WordStar does offer a fundamentally different way of approaching digital text, then it’s reasonable to believe that authors using it may produce different work than they would with the mass-market behemoth Microsoft Word, or with one of the more modern, artisanal writing programs like Scrivener or Ulysses III, just as multilingual authors find that changing languages changes the way they think.

Speak, Memory

Samuel Beckett famously wrote certain plays in French because he found that it made him choose his words more carefully and think more clearly; in the preface to Speak, Memory, Vladimir Nabokov said that the “re-Englishing of a Russian re-version of what had been an English re-telling of Russian memories in the first place, proved a diabolical task.” Knowing that A Game of Thrones was written in WordStar or that Waiting for Godot was originally titled “En attendant Godot” may nuance our appreciation of the texts, but we can go even deeper into the relationship between software and the results it produces by examining its source code.

This was the motivation for the Cooper Hewitt, Smithsonian Design Museum’s recent acquisition of the code for Planetary, a music player for iOS that envisions each artist in the music library as a sun orbited by album-planets, each of which is orbited in turn by a collection of song-moons. In explaining its decision to acquire not only a physical representation of the code, such as an iPad running the app, but the code itself, Cooper-Hewitt said,

With Planetary, we are hoping to preserve more than simply the vessel, more than an instantiation of software and hardware frozen at a moment in time: Commit message fd247e35de9138f0ac411ea0b261fab21936c6e6 authored in 2011 and an iPad2 to be specific.

Cooper-Hewitt’s Planetary announcement also touches on another challenge in archiving software.

[P]reserving large, complex and interdependent systems whose component pieces are often simply flirting with each other rather than holding hands is uncharted territory. Trying to preserve large, complex and interdependent systems whose only manifestation is conceptual—interaction design say or service design—is harder still.

One of the ways the Museum has chosen to meet this challenge is to open-source the software, inviting the public to examine the code, modify it, or build new applications on top of it.

The open-source approach has the advantage of introducing more people to a particular piece of software—people who may be able to port it to new systems, or simply maintain their own copies of it. As we have said in reference to the Rosetta Project, “One of the tenets of the project is that for information to last, people have to care about and engage it.” However, generations of software have already been lost, abandoned, or forgotten, like the software once used to communicate with the International Cometary Explorer. Other software has been preserved but locked into black boxes like the National Software Reference Library at NIST, which holds some 20 million digital signatures yet is available only to law enforcement.

The International Cometary Explorer, a spacecraft we are no longer able to talk to

While there is no easy path to archiving software over the long term, the efforts of researchers like Kirschenbaum, projects like the Internet Archive’s Software Collection, and enthusiastic hackers like the Carnegie Mellon Computer Club, who recently recovered Andy Warhol’s digital artwork, are helping create awareness of the issues and develop potential solutions.

Original Warhol, created on an Amiga 1000 in 01985

 

New Book Explores the Legacy of Paul Otlet’s Mundaneum

Posted on Tuesday, September 23rd, 02014 by Charlotte Hajer
Categories: Digital Dark Age, Technology | 0 Comments


In 02007, SALT speaker Alex Wright introduced us to Paul Otlet, the Belgian visionary who spent the first half of the twentieth century building a universal catalog of human knowledge, and who dreamed of creating a global information network that would allow anyone virtual access to this “Mundaneum.”

In June of this year, Wright released a new monograph that examines the impact of Otlet’s work and dreams within the larger history of humanity’s attempts to organize and archive its knowledge. In Cataloging The World: Paul Otlet and the Birth of the Information Age, Wright traces the visionary’s legacy from its idealistic mission through the Mundaneum’s destruction by the Nazis, to the birth of the internet and the data-driven world of the 21st century.

Otlet’s work on his Mundaneum went beyond a simple wish to collect and catalog knowledge: it was driven by a deeply idealistic vision of a world brought into harmony through the free exchange of information.

An ardent “internationalist,” Otlet believed in the inevitable progress of humanity towards a peaceful new future, in which the free flow of information over a distributed network would render traditional institutions – like state governments – anachronistic. Instead, he envisioned a dawning age of social progress, scientific achievement, and collective spiritual enlightenment. At the center of it all would stand the Mundaneum, a bulwark and beacon of truth for the whole world. (Wright 02014)

Otlet imagined a system of interconnected “electric telescopes” with which people could easily access the Mundaneum’s catalog of information from the comfort of their homes – a ‘world wide web’ that would bring the globe together in shared reverence for the power of knowledge. But sadly, his vision was thwarted before it could reach its full potential. Brain Pickings’ Maria Popova writes,

At the peak of Otlet’s efforts to organize the world’s knowledge around a generosity of spirit, humanity’s greatest tragedy of ignorance and cruelty descended upon Europe. As the Nazis seized power, they launched a calculated campaign to thwart critical thought by banning and burning all books that didn’t agree with their ideology … and even paved the muddy streets of Eastern Europe with such books so the tanks would pass more efficiently.

Otlet’s dream of open access to knowledge obviously clashed with the Nazis’ effort to control the flow of information, and his Mundaneum was promptly shut down to make room for a gallery displaying Third Reich art. Nevertheless, Otlet’s vision survived, and in many ways inspired the birth of the internet.

While Otlet did not by any stretch of the imagination “invent” the Internet — working as he did in an age before digital computers, magnetic storage, or packet-switching networks — nonetheless his vision looks nothing short of prophetic. In Otlet’s day, microfilm may have qualified as the most advanced information storage technology, and the closest thing anyone had ever seen to a database was a drawer full of index cards. Yet despite these analog limitations, he envisioned a global network of interconnected institutions that would alter the flow of information around the world, and in the process lead to profound social, cultural, and political transformations. (Wright 02014)

Still, Wright argues, some characteristics of today’s internet fly in the face of Otlet’s ideals even as they celebrate them. The modern world wide web is predicated on an absolute individual freedom to consume and contribute information, resulting in an amorphous and decentralized network of information whose provenance can be difficult to trace. In many ways, this defies Otlet’s idealistic belief in a single repository of absolute and carefully verified truths, open access to which would lead the world to collective enlightenment. Wright wonders,

Would the Internet have turned out any differently had Paul Otlet’s vision come to fruition? Counterfactual history is a fool’s game, but it is perhaps worth considering a few possible lessons from the Mundaneum. First and foremost, Otlet acted not out of a desire to make money — something he never succeeded at doing — but out of sheer idealism. His was a quest for universal knowledge, world peace, and progress for humanity as a whole. The Mundaneum was to remain, as he said, “pure.” While many entrepreneurs vow to “change the world” in one way or another, the high-tech industry’s particular brand of utopianism almost always carries with it an underlying strain of free-market ideology: a preference for private enterprise over central planning and a distrust of large organizational structures. This faith in the power of “bottom-up” initiatives has long been a hallmark of Silicon Valley culture, and one that all but precludes the possibility of a large-scale knowledge network emanating from anywhere but the private sector.

Nevertheless, Wright sees in Otlet’s vision a useful ideal to keep striving for:

Otlet’s Mundaneum will never be. But it nonetheless offers us a kind of Platonic object, evoking the possibility of a technological future driven not by greed and vanity, but by a yearning for truth, a commitment to social change, and a belief in the possibility of spiritual liberation. Otlet’s vision for an international knowledge network—always far more expansive than a mere information retrieval tool—points toward a more purposeful vision of what the global network could yet become. And while history may judge Otlet a relic from another time, he also offers us an example of a man driven by a sense of noble purpose, who remained sure in his convictions and unbowed by failure, and whose deep insights about the structure of human knowledge allowed him to peer far into the future…

Wright summarizes Otlet’s legacy with a simple question: are we better off when we safeguard the absolute individual freedom to consume and distribute information as we see fit, or should we be making a more careful effort to curate the information we are surrounded by? It’s a question that we see emerging with growing urgency in contemporary debates about privacy, data sharing, and regulation of the internet – and our answer to it is likely to play an important role in shaping the future of our information networks.

To learn more about Cataloging the World, please take a look at Maria Popova’s thoughtful review, or visit the book’s website.

 

Retrocomputing Brings Warhol’s Lost Digital Art Back to Life

Posted on Friday, May 16th, 02014 by Catherine Borgeson
Categories: Digital Dark Age, Long Term Art | 0 Comments

In 01985, Andy Warhol used an Amiga 1000 personal computer and the GraphiCraft software to create a series of digital works. Warhol’s early computer artworks are now viewable after 30 years of dormancy.

Commodore International commissioned Warhol to appear at the product launch and produce a few public pieces showing off the Amiga’s multimedia capabilities. According to the Studio for Creative Inquiry’s report, “Warhol’s presence was intended to convey the message that this was a highly sophisticated yet accessible machine that acted as a tool for creativity.”

In addition to creating a series of public pieces, Warhol made digital works on his own time. He was given a variety of pre-release hardware and software. This led him to eventually experiment with digital photography and videography, edit animation and compose digital audio pieces. The Studio for Creative Inquiry’s report states:

All of this (digital photography, video capturing, animation editing, and audio composition) had been done to limited extents earlier, but Warhol was an incredibly early adopter in this arena and may be the first major artist to explore many of these mediums of computer art. He almost certainly was the earliest (if not the only, given several pre-release statues) possessor of some of this hardware and software and, given their steep later sale prices, possibly the only person to have such a collection.


Decades later, artist Cory Arcangel learned of Warhol’s Amiga experiments from a YouTube clip showing Warhol at the launch altering a photo of Deborah Harry using what would nowadays be considered basic digital art tools, such as flood fills (shown in the video above). This scene set in motion what would become a three-year quest of technological feats and multidisciplinary collaboration to recover and catalog the previously unknown Warhol artworks living on degrading 30-year-old Amiga floppy disks.

According to the contract with Commodore, Warhol owned the rights to any hardware given to him and to all the work he created with the machines. After his death, his files and machines were stashed away, unpublished, in the archives at the Warhol Museum. The collection contained two Amiga 1000 computers, one of which was never used, parts of a video-capture hardware setup, a drawing tablet, and an assortment of floppy disks, mostly commercial software in its original boxes:

It was immediately clear that this was an exciting window into history given that several pieces of Amiga hardware had shipping labels directly from Commodore, others had internal Commodore labels warning that the components were not yet for sale lacking FCC approval, and that the drawing tablet appeared hand-made.

 Amiga computer equipment used by Andy Warhol. Photo: Andy Warhol Museum

In December 02011, Arcangel approached the Warhol Museum with a proposal to restore the Amiga hardware and archive the contents of the associated disks. In April 02012, he teamed up with the Carnegie Mellon Computer Club, a team of experts in obsolete computer maintenance and software preservation, to retrocompute and forensically extract data from the roughly 40 Amiga disks. Ten of those disks were found to contain a total of at least 13 graphic files thought to have been created or altered by Warhol.

Cory Arcangel (center) and CMU Computer Club members Michael Dille (left) and Keith A. Bare (right) during the data recovery process at The Andy Warhol Museum. Photo: Hillman Photography Initiative, CMOA.

With a lot of hacking, sleuthing, and extensive Amiga knowledge, the CMU Computer Club figured out how to examine the contents of Warhol’s disks. Simplified, it boils down to a two-step process: first creating copies of the disks as standard filesystem-level disk images, then looking through those files to see whether any were in known graphic formats. Some were, some weren’t. It took months of further work with the GraphiCraft software to convert the unknown graphic formats into files that can be opened today.

To extract data and generate an archival dump, the Computer Club used a USB device called KryoFlux. The device attaches a floppy drive to a modern PC and reads and writes standard-format floppy disks. But its real advantage is its ability to capture a very low-level picture of the disks. The KryoFlux created raw dumps as close as possible to the original floppies, as well as standard filesystem disk images (ADF files) that work with Amiga emulators. Using the KryoFlux also allowed for better handling of degrading and fragile disks (many had magnetic material coming off the substrate), and it could still generate standard ADF files for floppy disks containing non-standard encodings or copy-protection schemes.
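To make “standard filesystem disk image” concrete: a plain double-density Amiga ADF is simply the raw sector data laid out in order (80 cylinders × 2 heads × 11 sectors × 512 bytes = 901,120 bytes), and an AmigaDOS volume begins with a “DOS” signature in its boot block. Here is a minimal sanity-check sketch in Python; the file name is hypothetical.

```python
# Minimal sketch: sanity-check a double-density Amiga ADF disk image.
# 80 cylinders x 2 heads x 11 sectors x 512 bytes = 901,120 bytes, and an
# AmigaDOS boot block starts with the ASCII signature "DOS".
ADF_SIZE = 80 * 2 * 11 * 512  # 901,120 bytes

def check_adf(path: str) -> None:
    with open(path, "rb") as f:
        data = f.read()
    ok = "ok" if len(data) == ADF_SIZE else "unexpected for a DD ADF"
    print(f"size: {len(data)} bytes ({ok})")
    if data[:3] == b"DOS":
        print("boot block carries an AmigaDOS 'DOS' signature")
    else:
        print("no 'DOS' signature; possibly NDOS, copy-protected, or damaged")

if __name__ == "__main__":
    check_adf("graphicraft_disk.adf")  # hypothetical file name
```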

The day after Warhol’s disks were copied, the disk images were loaded into an Amiga emulator in the basement of CMU Computer Club member Michael Dille. The disk hand-labeled “GraphiCraft” contained files with names like “flower.pic,” “campbells.pic” and “marilyn1.pic,” a clear sign that something was on that disk. The .pic files were unrecognizable by modern software and would later require the club’s resident Amiga expert, Keith Bare, to do deeper hacking and reverse engineering in order to crack the GraphiCraft format. But on that day, two files on the same disk named “tut” and “venus” turned out to be in a common format used on Amigas, the Interchange File Format (IFF). These two files became readable by using the software ImageMagick to convert the .iff files to PNG, a format modern software can understand. On the evening of March 2, 02013, Warhol’s Venus appeared on a screen for the first time in nearly 30 years.
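That IFF-to-PNG step is a one-line ImageMagick conversion; a minimal Python wrapper might look like the sketch below. The file names are hypothetical, and it assumes an ImageMagick install whose convert tool can read IFF/ILBM images, as the club’s evidently could.

```python
# Minimal sketch: convert Amiga IFF/ILBM images to PNG by shelling out to
# ImageMagick, as described above. File names are hypothetical; on
# ImageMagick 7 the command may be "magick" rather than "convert".
import subprocess

def iff_to_png(src: str, dst: str) -> None:
    subprocess.run(["convert", src, dst], check=True)

if __name__ == "__main__":
    iff_to_png("venus.iff", "venus.png")
    iff_to_png("tut.iff", "tut.png")
```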

Andy Warhol, Venus, 01985

Warhol’s digital works are proof that the Amiga 1000 was highly impressive for its time. The first of the Amigas, it already had better sound and graphics capabilities than its competitors: 4-channel stereo sound, up to 4,096 colors, and a 640×200-pixel display. In comparison, PCs had “beepers” and up to 16 colors, while Macintoshes had 22.5 kHz mono audio and monochrome displays (Studio for Creative Inquiry report).

These recovered images give insight into the workflow and capabilities of early computer art. In the image above, Warhol used clip art to create the three identical eyes on Venus. The digital reinterpretation of Warhol’s Campbell’s Soup, pictured below, was modified with a line tool. It shows Warhol’s willingness to experiment and adapt to a new medium.

Andy Warhol, Campbell's, 01985

Earlier this week, the Carnegie Museum of Art released Trapped, the second of a five-part documentary series by the Hillman Photography Initiative to “investigate the world of images that are guarded, stashed away, barely recognizable or simply forgotten.” The short documentary gives a detailed look at the techno-archaeologists’ process of decoding the obsolete file types. (Minute 8:47 shows the copying of Warhol’s disks.)

The forensic effort of studying the disks’ contents sheds light on the impermanent nature of digital material and the need for digital preservation. “In a way, a lot of the data and things we work with almost seem like it’s imaginary,” explained Bare in Trapped. “It’s electrons in a machine. You can turn it into photons if you use a display, but in some sense it’s almost like it’s not even there.”

ICE/ISEE-3 To Return To An Earth No Longer Capable of Speaking To It

Posted on Monday, February 24th, 02014 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Science | 0 Comments

International Cometary Explorer (NASA)

This August, a pioneer in space exploration returns to Earth after more than 30 years of service. The spacecraft is still in good, functioning condition, and could possibly be assigned to another mission. Sadly, however, we seem to have forgotten how to speak its language.

The probe, a collaboration between NASA and ESA, was one of three spacecraft launched in 01978 to study the interaction between the solar wind and Earth’s magnetosphere. Named the International Sun-Earth Explorer-3 (ISEE-3), it was the first object ever sent into a heliocentric orbit at the first Lagrangian point – a location between Earth and the Sun where the combined gravitational pull of the two bodies keeps a satellite orbiting the Sun in lockstep with Earth.
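For a rough sense of where that point sits, the distance from Earth to the Sun-Earth L1 point is well approximated by a standard back-of-the-envelope formula (this is textbook celestial mechanics, not something from the article):

$$ r \approx R\left(\frac{M_\oplus}{3M_\odot}\right)^{1/3} \approx 1.5\times10^{6}\ \text{km} $$

where R ≈ 1.5 × 10^8 km is the Earth-Sun distance and M⊕ and M☉ are the masses of the Earth and the Sun, putting L1 about one one-hundredth of the way to the Sun.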

Upon completion of its mission in 01983, the probe was repurposed and re-christened: now called the International Cometary Explorer (ICE), it circled the moon a few times to gather speed, and then flew off to chase after two comets. ICE intercepted comet Giacobini-Zinner in 01985 before catching up with Halley’s comet in 01986, and making history as the first spacecraft to study two comets directly.

After a brief third mission to study coronal mass ejections, NASA officially decommissioned the probe and shut down communications with its systems. Nevertheless, the agency discovered in 02008 not only that ICE had failed to power off, but also that 12 of its 13 instruments were still functioning. They entertained the idea of sending ICE off to study another corner of the Solar System – only to learn that the equipment needed to communicate with ICE is no longer available and would be too costly to rebuild. The Planetary Society’s Emily Lakdawalla explains:

Several months of digging through old technical documents has led a group of NASA engineers to believe they will indeed be able to understand the stream of data coming from the spacecraft. NASA’s Deep Space Network (DSN) can listen to the spacecraft, a test in 2008 proved that it was possible to pick up the transmitter carrier signal, but can we speak to the spacecraft? Can we tell the spacecraft to turn back on its thrusters and science instruments after decades of silence and perform the intricate ballet needed to send it back to where it can again monitor the Sun? The answer to that question appears to be no.

For the past 15 years, ICE has been patiently orbiting the Sun at a speed slightly higher than Earth’s. Now that it’s catching up with us again from behind, researchers realize there is much more exploration that ICE could have helped us with. Unfortunately, we simply don’t seem capable of mustering the resources needed to communicate with ICE. Lakdawalla muses,

I wonder if ham radio operators will be able to pick up its carrier signal – it’s meaningless, I guess, but it feels like an honorable thing to do, a kind of salute to the venerable ship as it passes by.

Laura Welcher Speaks at Contemporary Jewish Museum This Sunday

Posted on Thursday, February 13th, 02014 by Charlotte Hajer
Categories: Announcements, Digital Dark Age, Long Term Art | 0 Comments


How do public archives, as collections of cultural artifacts, shape our collective memory? And how is this changing as new digital tools make it ever easier for scholars and artists to access these repositories?

This Sunday, Long Now’s Laura Welcher joins a group of archivists and artists to discuss these questions and more at the Contemporary Jewish Museum in San Francisco. Entitled Finders and Keepers: Archives in the Digital Age, the panel discussion accompanies an exhibit by Chicago-based photographer Jason Lazarus, who creates collaborative installations with pictures and texts submitted by others.

The panel discussion starts this Sunday, February 16th, at 3 PM; the event is free with museum admission.

 

Lost century-old Antarctic images found and conserved

Posted on Friday, January 10th, 02014 by Catherine Borgeson
Categories: Digital Dark Age, Long News, Long Term Art | 0 Comments

Photo: Antarctic Heritage Trust (NZ)

A small box of 22 exposed but unprocessed photographic negatives, left nearly a century ago in an Antarctic exploration hut, has been discovered and conserved by New Zealand’s Antarctic Heritage Trust.

“It’s the first example that I’m aware of, of undeveloped negatives from a century ago from the Antarctic heroic era,” Antarctic Heritage Trust Executive Director Nigel Watson said in a press release. “There’s a paucity of images from that expedition.”

Photo: Antarctic Heritage Trust (NZ)

The team of conservationists discovered the clumped-together negatives preserved in a solid block of ice in Robert Falcon Scott’s hut at Cape Evans on Ross Island. The hut served as one of the many supply depots of Captain Scott’s doomed Terra Nova Expedition to the South Pole (01910-01913). While the expedition made it to the Pole, Scott and his polar party died on the return trip from starvation and extreme conditions. Today, preserved jars of Heinz Tomato Ketchup, John Burgess & Sons French olives, and blocks of New Zealand butter can still be found in the hut, as well as a darkroom intact with chemicals and plates.

Two years after Scott’s expedition, the hut was inhabited by the Ross Sea Party of Ernest Shackleton’s Imperial Trans-Antarctic Expedition (01914-01917). Ten marooned men lived there after being stranded on the ice for nearly two years when their ship, the SY Aurora, broke free from her moorings during a blizzard and drifted out to sea. By the time of their rescue, three men had died, including the team’s photographer, Arnold Patrick Spencer-Smith. While the identity of the photographer of these negatives cannot be proven, someone in the Ross Sea Party left the undeveloped images behind.

Chief Scientist Alexander Stevens looking south on the deck of Aurora. Hut Point Peninsula in the background. Photo: Antarctic Heritage Trust (NZ)

Photo: Antarctic Heritage Trust (NZ)

These never-before-seen images bear witness to the Heroic Age of Antarctic Exploration. And only in a place like Antarctica could they have survived. The photographer used cellulose nitrate film, which, according to Kodak, is a relatively unstable base. The film breaks down in humidity and higher temperatures, giving off powerful oxidizing agents. If conditions are right, however, the film may last for decades, or, as the Antarctic Heritage Trust discovered, a century.

The photographs found in Captain Scott’s expedition base at Cape Evans, Antarctica, required specialist conservation treatment. The Antarctic Heritage Trust (NZ) engaged photographic conservator Mark Strange to undertake the painstaking task of separating, cleaning (including removing mould), and consolidating the cellulose nitrate image layers. Twenty-two separate sheets were revealed and sent to New Zealand Micrographic Services for scanning on a Lanovia pre-press scanner. The scans were then converted to digital positives.

via io9

Reviving and Restoring Lost Sounds

Posted on Thursday, December 26th, 02013 by Catherine Borgeson
Categories: Digital Dark Age, Rosetta | 0 Comments

In 02008 Kevin Kelly called for movage (as opposed to storage) as the only way to archive digital information:

“Proper movage means transferring the material to current platforms on a regular basis— that is, before the old platform completely dies, and it becomes hard to do. This movic rhythm of refreshing content should be as smooth as a respiratory cycle — in, out, in, out. Copy, move, copy, move.”

Five years later, Berkeley physicist Carl Haber received a MacArthur “Genius Grant” for doing just this: moving audio recordings off obsolete platforms and decaying storage media by imaging them in two or three dimensions and digitally restoring them. These long-lost analog sounds can essentially be played with a virtual needle.

Haber already had the technology in place from his research on imaging radiation. He had cameras precise enough to image and measure the patterns of particles and debris that emerged from subatomic particle collisions. He and his colleagues at the Lawrence Berkeley National Laboratory applied this noninvasive image processing to develop IRENE (Image, Reconstruct, Erase Noise, Etc.). They derived the acronym from the first recording used to demonstrate the concept of IRENE–The Weavers performing “Goodnight Irene.”

Just as measuring radiation requires a precise imaging technique, mapping the surface of an audio recording requires pictures taken at great magnification. One pixel covers about one micron of the disc or cylinder surface, so to acquire a sufficient digital map the camera scans the object slowly enough to synthesize a gigapixel image:

A disc or cylinder is placed in a precision optical metrology system, where a camera following the path of the grooves on the object takes thousands of images that are then cleaned to compensate for physical damage; the resulting data are mathematically interpolated to determine how a stylus would course through the undulations, and the stylus motion is converted into a standard digital sound file.

So IRENE uses image processing to take a picture, then mathematically breaks down the information in that image to calculate the motion of the groove and determine what sound would actually be played. It’s all done with an algorithm on a computer, without ever touching the recording in the process.
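As a toy illustration of just that last step (and only that step; this is nothing like the real optical pipeline), suppose you already had the groove’s lateral displacement sampled at audio rate. Converting displacement to stylus velocity and writing the result out as a WAV file might look like the Python sketch below; the sample rate, the fabricated test signal, and the file name are all assumptions.

```python
# Toy sketch of the final step described above: given a groove's lateral
# displacement over time (here a fabricated 440 Hz test signal standing in
# for real measurements), differentiate it to approximate stylus velocity
# and write the result to a 16-bit WAV file. This is an illustration only,
# not the IRENE pipeline; sample rate and file name are assumptions.
import wave

import numpy as np

SAMPLE_RATE = 44_100  # assumed output sample rate, in Hz

def displacement_to_wav(displacement: np.ndarray, path: str) -> None:
    velocity = np.gradient(displacement)            # velocity ~ d(displacement)/dt
    velocity /= np.max(np.abs(velocity)) or 1.0     # normalize to [-1, 1]
    samples = (velocity * 32767).astype(np.int16)   # scale to 16-bit PCM
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(samples.tobytes())

if __name__ == "__main__":
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE        # one second of "groove"
    fake_groove = np.sin(2 * np.pi * 440 * t)       # fabricated displacement trace
    displacement_to_wav(fake_groove, "reconstructed.wav")
```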

Haber has collaborated with archivists and researchers around the world to test IRENE on a variety of audio recordings. The Smithsonian has about 200 experimental recordings from Volta Laboratory Associates, a collection of some of the earliest audio recordings ever made. It is “a reflection on the intense competition between (Alexander Graham) Bell, Thomas Edison and Emile Berliner for patents following the invention of the phonograph by Edison in 1877.”

Little-Lamb-glass-plate-no-3-analysis
A glass disc recording from Bell’s Volta Lab containing the audio of a male voice repeating “Mary had a little lamb.” Photo: The Smithsonian

Now, in collaboration with the Smithsonian and the Library of Congress, IRENE has started to recover these fragile recordings made of rubber, beeswax, glass, tin foil, and brass. The cryptic recordings on such delicate and damaged storage media can now be heard, over a century later:

Early experimental recordings are not the only lost voices being recovered. Anthropologists, linguists, and ethnographers were among the first to use recording as a research tool and to document cultural heritage. IRENE has restored some of these field recordings, including wax cylinders from the Alfred L. Kroeber collection at the Phoebe A. Hearst Museum of Anthropology in Berkeley. There are over 3,000 cylinder recordings in the collection documenting California Native American culture from 01900 to 01919. Around 300 of these cylinders are two- to three-minute recordings of Ishi, the only surviving member of the Yahi at the time. Fifty-three of the 300 cylinders are of Ishi telling the story of the “Wood Duck” in 01912.

Earlier this month, the Smithsonian’s National Museum of Natural History received a $1 million grant from the Arcadia Fund to digitize its endangered-language materials, including an estimated 3,000 hours of sound recordings. The recordings will then be available electronically to the public through the Smithsonian’s catalog system:

Digitization of these materials within the NAA (National Anthropological Archives) will give both scholars and local communities new access to documentation of endangered languages and cultural knowledge about threatened environments around the world, ranging from southern California to small Micronesian atolls.

In September 02010, the Library of Congress released the first comprehensive national study examining the preservation of sound recordings in the United States. It found that many historical recordings have already deteriorated or are inaccessible to the public due to their experimental and fragile nature.

“Those audio cassettes are just time bombs,” the study’s co-author Sam Brylawski said. “They’re just not going to be playable.”

But maybe this is not the case after all. Perhaps Haber has taken on an archaeological endeavor that postpones the detonation of these media “time bombs.” It comes back to Kevin Kelly’s idea of movage: Haber has taken big technological steps to digitally play these long-lost analog voices; now the key is to keep moving them.

The Cure for Broken Links and Dead Dot-Coms

Posted on Friday, November 1st, 02013 by Catherine Borgeson
Categories: Digital Dark Age | 0 Comments


“The Internet echoes with the empty spaces where data used to be.”
- Alexis Rossi from the Wayback Machine

The Internet Archive recently unveiled a new plan to fix broken links using the Wayback Machine.

The Wayback Machine provides digital captures of URLs to create stable access to websites that might otherwise vanish. The service launched in 02001 with 10 billion pages. Today it archives 10 billion pages every 10 weeks and currently contains more than 360 billion URL snapshots.

We have been serving archived web pages to the public via the Wayback Machine for twelve years now, and it is gratifying to see how this service has become a medium of record for so many.  Wayback pages are cited in papers, referenced in news articles and submitted as evidence in trials.  Now even the U.S. government relies on this web archive.

Steady improvements have been made to the Wayback Machine over the past year to keep pace with the ever-evolving landscape of the Internet. Content that used to be a year out of date now appears in the Wayback Machine within an hour of a site being crawled. Anyone can create a permanent URL to cite a page, and the Wayback Machine supports a number of different APIs.
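One of those APIs is the availability endpoint, which reports whether the Wayback Machine holds a snapshot of a given URL. Here is a minimal sketch in Python using the requests library; the example URL is arbitrary.

```python
# Minimal sketch: ask the Wayback Machine's availability API whether it holds
# a snapshot of a URL, and print the closest archived copy if so.
from typing import Optional

import requests

def closest_snapshot(url: str) -> Optional[str]:
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url}, timeout=30)
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None

if __name__ == "__main__":
    print(closest_snapshot("longnow.org") or "no snapshot found")
```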

Part of what makes the web so great is its churning and ephemeral nature, but as more and more of our culture and history is built on these dunes of ever-shifting silicates, we stand to stumble forward without a clear sense of where we’ve come from, like Guy Pearce’s amnesiac in Memento. The Wayback Machine improves the web’s memory and, in a way, our own.

In becoming better equipped to keep up with the growing Internet, the Wayback Machine has also become a well-suited solution to the broken-link epidemic. It is starting with individual webmasters and a couple of larger sites, such as WordPress and Wikipedia:

Webmasters can add a short snippet of code to their 404 page that will let users know if the Wayback Machine has a copy of the page in our archive – your web pages don’t have to die!

We started with a big goal — to archive the Internet and preserve it for history.  This year we started looking at the smaller goals — archiving a single page on request, making pages available more quickly, and letting you get information back out of the Wayback in an automated way.  We have spent 17 years building this amazing collection, let’s use it to make the web a better place.