Blog Archive for the ‘Digital Dark Age’ Category


New Book Explores the Legacy of Paul Otlet’s Mundaneum

Posted on Tuesday, September 23rd, 02014 by Charlotte Hajer
Categories: Digital Dark Age, Technology


In 02007, SALT speaker Alex Wright introduced us to Paul Otlet, the Belgian visionary who spent the first half of the twentieth century building a universal catalog of human knowledge, and who dreamed of creating a global information network that would allow anyone virtual access to this “Mundaneum.”

In June of this year, Wright released a new monograph that examines the impact of Otlet’s work and dreams within the larger history of humanity’s attempts to organize and archive its knowledge. In Cataloging The World: Paul Otlet and the Birth of the Information Age, Wright traces the visionary’s legacy from its idealistic mission through the Mundaneum’s destruction by the Nazis, to the birth of the internet and the data-driven world of the 21st century.

Otlet’s work on his Mundaneum went beyond a simple wish to collect and catalog knowledge: it was driven by a deeply idealistic vision of a world brought into harmony through the free exchange of information.

An ardent “internationalist,” Otlet believed in the inevitable progress of humanity towards a peaceful new future, in which the free flow of information over a distributed network would render traditional institutions – like state governments – anachronistic. Instead, he envisioned a dawning age of social progress, scientific achievement, and collective spiritual enlightenment. At the center of it all would stand the Mundaneum, a bulwark and beacon of truth for the whole world. (Wright 02014)

Otlet imagined a system of interconnected “electric telescopes” with which people could easily access the Mundaneum’s catalog of information from the comfort of their homes – a ‘world wide web’ that would bring the globe together in shared reverence for the power of knowledge. But sadly, his vision was thwarted before it could reach its full potential. Brain Pickings’ Maria Popova writes,

At the peak of Otlet’s efforts to organize the world’s knowledge around a generosity of spirit, humanity’s greatest tragedy of ignorance and cruelty descended upon Europe. As the Nazis seized power, they launched a calculated campaign to thwart critical thought by banning and burning all books that didn’t agree with their ideology … and even paved the muddy streets of Eastern Europe with such books so the tanks would pass more efficiently.

Otlet’s dream of open access to knowledge obviously clashed with the Nazis’ effort to control the flow of information, and his Mundaneum was promptly shut down to make room for a gallery displaying Third Reich art. Nevertheless, Otlet’s vision survived, and in many ways inspired the birth of the internet.

While Otlet did not by any stretch of the imagination “invent” the Internet — working as he did in an age before digital computers, magnetic storage, or packet-switching networks — nonetheless his vision looks nothing short of prophetic. In Otlet’s day, microfilm may have qualified as the most advanced information storage technology, and the closest thing anyone had ever seen to a database was a drawer full of index cards. Yet despite these analog limitations, he envisioned a global network of interconnected institutions that would alter the flow of information around the world, and in the process lead to profound social, cultural, and political transformations. (Wright 02014)

Still, Wright argues, some characteristics of today’s internet fly in the face of Otlet’s ideals even as they celebrate them. The modern world wide web is predicated on an absolute individual freedom to consume and contribute information, resulting in an amorphous and decentralized network of information whose provenance can be difficult to trace. In many ways, this defies Otlet’s idealistic belief in a single repository of absolute and carefully verified truths, open access to which would lead the world to collective enlightenment. Wright wonders,

Would the Internet have turned out any differently had Paul Otlet’s vision come to fruition? Counterfactual history is a fool’s game, but it is perhaps worth considering a few possible lessons from the Mundaneum. First and foremost, Otlet acted not out of a desire to make money — something he never succeeded at doing — but out of sheer idealism. His was a quest for universal knowledge, world peace, and progress for humanity as a whole. The Mundaneum was to remain, as he said, “pure.” While many entrepreneurs vow to “change the world” in one way or another, the high-tech industry’s particular brand of utopianism almost always carries with it an underlying strain of free-market ideology: a preference for private enterprise over central planning and a distrust of large organizational structures. This faith in the power of “bottom-up” initiatives has long been a hallmark of Silicon Valley culture, and one that all but precludes the possibility of a large-scale knowledge network emanating from anywhere but the private sector.

Nevertheless, Wright sees in Otlet’s vision a useful ideal to keep striving for:

Otlet’s Mundaneum will never be. But it nonetheless offers us a kind of Platonic object, evoking the possibility of a technological future driven not by greed and vanity, but by a yearning for truth, a commitment to social change, and a belief in the possibility of spiritual liberation. Otlet’s vision for an international knowledge network—always far more expansive than a mere information retrieval tool—points toward a more purposeful vision of what the global network could yet become. And while history may judge Otlet a relic from another time, he also offers us an example of a man driven by a sense of noble purpose, who remained sure in his convictions and unbowed by failure, and whose deep insights about the structure of human knowledge allowed him to peer far into the future…

Wright summarizes Otlet’s legacy with a simple question: are we better off when we safeguard the absolute individual freedom to consume and distribute information as we see fit, or should we be making a more careful effort to curate the information we are surrounded by? It’s a question that we see emerging with growing urgency in contemporary debates about privacy, data sharing, and regulation of the internet – and our answer to it is likely to play an important role in shaping the future of our information networks.

To learn more about Cataloging the World, please take a look at Maria Popova’s thoughtful review, or visit the book’s website.

 

Retrocomputing Brings Warhol’s Lost Digital Art Back to Life

Posted on Friday, May 16th, 02014 by Catherine Borgeson
Categories: Digital Dark Age, Long Term Art

In 01985, Andy Warhol used an Amiga 1000 personal computer and the GraphiCraft software to create a series of digital works. Warhol’s early computer artworks are now viewable after 30 years of dormancy.

Commodore International commissioned Warhol to appear at the product launch and produce a few public pieces showing off the Amiga’s multimedia capabilities. According to a report by Carnegie Mellon’s Studio for Creative Inquiry, “Warhol’s presence was intended to convey the message that this was a highly sophisticated yet accessible machine that acted as a tool for creativity.”

In addition to creating a series of public pieces, Warhol made digital works on his own time. He was given a variety of pre-release hardware and software, which eventually led him to experiment with digital photography and videography, animation editing, and digital audio composition. The Studio for Creative Inquiry’s report states:

All of this (digital photography, video capturing, animation editing, and audio composition) had been done to limited extents earlier, but Warhol was an incredibly early adopter in this arena and may be the first major artist to explore many of these mediums of computer art. He almost certainly was the earliest (if not the only, given several pre-release statuses) possessor of some of this hardware and software and, given their steep later sale prices, possibly the only person to have such a collection.


Decades later, artist Cory Arcangel learned of Warhol’s Amiga experiments from a YouTube clip showing Warhol at the launch, altering a photo of Deborah Harry with what would nowadays be considered basic digital art tools, such as flood fills. This scene set in motion what would become a three-year quest of technological feats and multidisciplinary collaboration to recover and catalog previously unknown Warhol artworks stored on degrading 30-year-old Amiga floppy disks.

According to the contract with Commodore, Warhol owned the rights to any hardware given to him and all the work he created with the machines. After his death, his files and machines were stashed away, unexamined, in the archives of the Warhol Museum. The collection contained two Amiga 1000 computers (one of which was never used), parts of a video-capture hardware setup, a drawing tablet, and an assortment of floppy disks, mostly commercial software in its original boxes:

It was immediately clear that this was an exciting window into history given that several pieces of Amiga hardware had shipping labels directly from Commodore, others had internal Commodore labels warning that the components were not yet for sale lacking FCC approval, and that the drawing tablet appeared hand-made.

 Amiga computer equipment used by Andy Warhol. Photo: Andy Warhol Museum

In December 02011, Arcangel approached the Warhol Museum with a proposal to restore the Amiga hardware and archive the contents of the associated disks. In April 02012, he teamed up with the Carnegie Mellon Computer Club, a team of experts in obsolete computer maintenance and software preservation, to retrocompute and forensically extract data from roughly 40 Amiga disks. Ten of those disks were found to contain at least 13 graphic files believed to have been created or altered by Warhol.

Cory Arcangel (Center), and CMU Computer Club members Michael Dille (Left), Keith A. Bare (Right) during the data recovery process at The Andy Warhol Museum. Photo: Hillman Photography Initiative, CMOA.

With a lot of hacking, sleuthing and extensive Amiga knowledge, the CMU Computer Club figured out how to examine the contents of Warhol’s disks. Simplified, it boils down to a two-step process: first creating copies of the disks as standard filesystem-level disk images, then looking through those files to see if any were in known graphic formats. Some were, some weren’t. It took months of further work with the GraphiCraft software before the unknown graphic formats could be converted into files that can be opened today.
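The second step, checking recovered files against known graphic formats, amounts to a magic-byte scan. The signatures below are standard (IFF files begin with the ASCII tag FORM; PNG files with a fixed 8-byte prefix), but the sketch itself is illustrative, not the Computer Club’s actual tooling, and the file names are hypothetical:

```python
import os

# Well-known file signatures ("magic bytes").
# IFF files (the Amiga's Interchange File Format) begin with ASCII "FORM";
# PNG files begin with a fixed 8-byte prefix.
MAGICS = {
    b"FORM": "IFF (Interchange File Format)",
    b"\x89PNG\r\n\x1a\n": "PNG",
}

def identify(path):
    """Return a format name if the file starts with a known signature, else None."""
    with open(path, "rb") as f:
        head = f.read(16)
    for magic, name in MAGICS.items():
        if head.startswith(magic):
            return name
    return None

def scan(directory):
    """Map each file in `directory` to its detected format (or None)."""
    return {
        name: identify(os.path.join(directory, name))
        for name in sorted(os.listdir(directory))
    }
```

Files whose headers match nothing, like GraphiCraft’s .pic format, are exactly the ones that needed the months of reverse engineering described below.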

To extract data and generate an archival dump, the Computer Club used a USB device called KryoFlux, which connects a floppy drive to a modern PC and reads and writes standard-format floppy disks. Its real advantage, though, is its ability to capture a very low-level picture of a disk. The KryoFlux created both raw dumps as close as possible to the original floppies and standard filesystem disk images (ADF files), which work with Amiga emulators. Using the KryoFlux also allowed for gentler handling of degrading and fragile disks (many had magnetic material coming off the substrate), and it could generate standard ADF files even from floppies with non-standard encodings or copy-protection schemes.
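An ADF file is just a raw sector dump of a standard Amiga floppy: 80 cylinders × 2 heads × 11 sectors × 512 bytes, or 901,120 bytes in all, and bootable AmigaDOS disks carry the ASCII marker DOS at the start of the boot block. A minimal, illustrative sanity check of such an image:

```python
ADF_SIZE = 80 * 2 * 11 * 512  # 901,120 bytes: cylinders x heads x sectors x sector size

def check_adf(data: bytes) -> dict:
    """Basic sanity checks on a raw ADF (Amiga Disk File) image."""
    return {
        # A standard double-density Amiga floppy dumps to exactly 901,120 bytes.
        "standard_size": len(data) == ADF_SIZE,
        # Bootable AmigaDOS disks start with the ASCII marker "DOS";
        # the fourth byte encodes filesystem flags (0 = OFS, 1 = FFS).
        "amigados_boot_block": data[:3] == b"DOS",
    }
```

Copy-protected or non-standard disks fail checks like these at the filesystem level, which is why the KryoFlux’s lower-level raw flux dumps matter for preservation.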

The day after making copies of Warhol’s disks, the disk images were loaded into an Amiga emulator in the basement of CMU Computer Club member Michael Dille. The disk hand-labeled “GraphiCraft” contained files with names like “flower.pic,” “campbells.pic” and “marilyn1.pic” – a clear sign that something was on that disk. The .pic files were unrecognizable by modern software; it would later take deeper hacking and reverse engineering by the club’s resident Amiga expert, Keith Bare, to crack the GraphiCraft format. But on that day, two files on the same disk, named “tut” and “venus,” were in a common Amiga format, the Interchange File Format (IFF). These were readable: the software ImageMagick converted the .iff files to PNG, a format modern software can understand. On the evening of March 2, 02013, Warhol’s Venus appeared on a screen for the first time in 30 years.
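What made “tut” and “venus” recognizable is IFF’s self-describing layout: a file opens with the ASCII tag FORM, a 32-bit big-endian byte count, and a form type such as ILBM (InterLeaved BitMap), followed by tagged chunks padded to even lengths. A minimal chunk walker, as an illustration of the format rather than the club’s actual tools:

```python
import struct

def iff_chunks(data: bytes):
    """Yield (chunk_id, payload) pairs from an IFF file.

    An IFF file starts with "FORM", a 32-bit big-endian byte count,
    and a 4-byte form type (e.g. "ILBM"); tagged chunks follow, each
    padded to an even length.
    """
    if data[:4] != b"FORM":
        raise ValueError("not an IFF file")
    (size,) = struct.unpack(">I", data[4:8])
    yield (b"FORM", data[8:12])  # first yield reports the form type
    pos, end = 12, 8 + size
    while pos + 8 <= end:
        cid = data[pos:pos + 4]
        (length,) = struct.unpack(">I", data[pos + 4:pos + 8])
        yield (cid, data[pos + 8:pos + 8 + length])
        pos += 8 + length + (length & 1)  # chunks are padded to even offsets
```

A converter like ImageMagick walks exactly this structure to find the bitmap header and body chunks before rendering the image.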

[Image: Andy Warhol, Venus, 01985]

Warhol’s digital works are proof that the Amiga 1000 was highly impressive for its time. The first of the Amigas, it already had better sound and graphics capabilities than its competitors: 4-channel stereo sound, up to 4,096 colors, and a 640×200-pixel display. In comparison, PCs had “beepers” and up to 16 colors, while Macintoshes had 22.5kHz mono audio and monochrome displays (Studio for Creative Inquiry’s report).

These recovered images give insight into the workflow and capabilities of early computer art. In the above photo, Warhol used clip art to create the three identical eyes on Venus. The digital reinterpretation of Warhol’s Campbell’s Soup, pictured below, was modified with a line tool. It shows Warhol’s willingness to experiment and adapt to a new medium.

[Image: Andy Warhol, Campbell’s, 01985]

Earlier this week, the Carnegie Museum of Art released Trapped, the second of a five-part documentary series by the Hillman Photography Initiative to “investigate the world of images that are guarded, stashed away, barely recognizable or simply forgotten.” The short documentary gives a detailed look at the techno-archaeologists’ process of decoding the obsolete file types. (Minute 8:47 shows the copying of Warhol’s disks.)

The forensic effort and the process of studying the disks’ contents shed light on the impermanent nature of digital material and the need for digital preservation. “In a way, a lot of the data and things we work with almost seem like it’s imaginary,” explained Bare in Trapped. “It’s electrons in a machine. You can turn it into photons if you use a display, but in some sense it’s almost like it’s not even there.”

ICE/ISEE-3 To Return To An Earth No Longer Capable of Speaking To It

Posted on Monday, February 24th, 02014 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Science

International Cometary Explorer (NASA)

This August, a pioneer in space exploration returns to Earth after more than 30 years of service. The spacecraft is still in good, functioning condition, and could possibly be assigned to another mission. Sadly, however, we seem to have forgotten how to speak its language.

The probe, a collaboration between NASA and ESA, was one of three craft launched in 01978 to study the interaction between the solar wind and Earth’s magnetosphere. Named the International Sun-Earth Explorer-3 (ISEE-3), it was the first object ever sent into orbit around the first Lagrangian point – a location between Earth and Sun where Earth’s gravity offsets just enough of the Sun’s pull that a satellite there orbits the Sun in lockstep with Earth.

Upon completion of its mission in 01983, the probe was repurposed and re-christened: now called the International Cometary Explorer (ICE), it circled the moon a few times to gather speed, and then flew off to chase after two comets. ICE intercepted comet Giacobini-Zinner in 01985 before catching up with Halley’s comet in 01986, and making history as the first spacecraft to study two comets directly.

After a brief third mission to study coronal mass ejections, NASA officially decommissioned the probe and shut down communications with its systems. Nevertheless, the agency discovered in 02008 not only that ICE had never powered off, but also that 12 of its 13 instruments were still functioning. They entertained the idea of sending ICE off to study another corner of the Solar System – only to learn that the equipment needed to communicate with ICE is no longer available, and prohibitively expensive to rebuild. The Planetary Society’s Emily Lakdawalla explains:

Several months of digging through old technical documents has led a group of NASA engineers to believe they will indeed be able to understand the stream of data coming from the spacecraft. NASA’s Deep Space Network (DSN) can listen to the spacecraft, a test in 2008 proved that it was possible to pick up the transmitter carrier signal, but can we speak to the spacecraft? Can we tell the spacecraft to turn back on its thrusters and science instruments after decades of silence and perform the intricate ballet needed to send it back to where it can again monitor the Sun? The answer to that question appears to be no.

For the past 15 years, ICE has been patiently orbiting the Sun at a speed slightly higher than Earth’s. Now that it’s catching up with us again from behind, researchers realize there’s much more exploration ICE could have helped us with. Unfortunately, we simply can’t muster the resources needed to communicate with it. Lakdawalla muses,

I wonder if ham radio operators will be able to pick up its carrier signal – it’s meaningless, I guess, but it feels like an honorable thing to do, a kind of salute to the venerable ship as it passes by.

Laura Welcher Speaks at Contemporary Jewish Museum This Sunday

Posted on Thursday, February 13th, 02014 by Charlotte Hajer
Categories: Announcements, Digital Dark Age, Long Term Art


How do public archives, as collections of cultural artifacts, shape our collective memory? And how is this changing as new digital tools make it ever easier for scholars and artists to access these repositories?

This Sunday, Long Now’s Laura Welcher joins a group of archivists and artists to discuss these questions and more at the Contemporary Jewish Museum in San Francisco. Entitled Finders and Keepers: Archives in the Digital Age, the panel discussion accompanies an exhibit by Chicago-based photographer Jason Lazarus, who creates collaborative installations with pictures and texts submitted by others.

The panel discussion starts this Sunday, February 16th, at 3 PM; the event is free with museum admission.

 

Lost century-old Antarctic images found and conserved

Posted on Friday, January 10th, 02014 by Catherine Borgeson
Categories: Digital Dark Age, Long News, Long Term Art

Photo: Antarctic Heritage Trust (NZ)

A small box of 22 exposed but unprocessed photographic negatives, left nearly a century ago in an Antarctic exploration hut, has been discovered and conserved by New Zealand’s Antarctic Heritage Trust.

“It’s the first example that I’m aware of, of undeveloped negatives from a century ago from the Antarctic heroic era,” Antarctic Heritage Trust Executive Director Nigel Watson said in a press release. “There’s a paucity of images from that expedition.”

Photo: Antarctic Heritage Trust (NZ)

The team of conservationists discovered the clumped-together negatives preserved in a solid block of ice in Robert Falcon Scott’s hut at Cape Evans on Ross Island. The hut served as one of the many supply depots of Captain Scott’s doomed Terra Nova Expedition to the South Pole (01910-01913). While Scott’s polar party made it to the Pole, they died on the return trip from starvation and extreme conditions. Today, preserved jars of Heinz Tomato Ketchup, John Burgess & Sons French olives and blocks of New Zealand butter can still be found in the hut, as well as a darkroom intact with chemicals and plates.

Two years after Scott’s expedition, the hut was inhabited by the Ross Sea Party of Ernest Shackleton’s Imperial Trans-Antarctic Expedition (01914-01917). Ten marooned men lived there after being stranded on the ice for nearly two years when their ship, the SY Aurora, broke free from her moorings during a blizzard and drifted out to sea. By the time of their rescue, three men had died, including the team’s photographer, Arnold Patrick Spencer-Smith. While the identity of the photographer cannot be proven, someone in the Ross Sea Party left behind the undeveloped images.

Chief Scientist Alexander Stevens looking south on the deck of Aurora. Hut Point Peninsula in the background. Photo: Antarctic Heritage Trust (NZ)

Photo: Antarctic Heritage Trust (NZ)

These never-before-seen images bear testament to the Heroic Age of Antarctic Exploration. Only in a place like Antarctica could they have survived: the photographer used cellulose nitrate film, which according to Kodak is a relatively unstable base. The film breaks down in humidity and higher temperatures, giving off powerful oxidizing agents. If conditions are right, however, the film may last for decades – or, as the Antarctic Heritage Trust discovered, a century.

The photographs found in Captain Scott’s expedition base at Cape Evans, Antarctica required specialist conservation treatment. The Antarctic Heritage Trust (NZ) engaged Photographic Conservator Mark Strange to undertake the painstaking task of separating, cleaning (including removing mould) and consolidating the cellulose nitrate image layers. Twenty-two separate sheets were revealed and sent to New Zealand Micrographic Services for scanning using a Lanovia pre-press scanner. The digital scans were converted to digital positives.

via io9

Reviving and Restoring Lost Sounds

Posted on Thursday, December 26th, 02013 by Catherine Borgeson
Categories: Digital Dark Age, Rosetta

In 02008 Kevin Kelly called for movage (as opposed to storage) as the only way to archive digital information:

“Proper movage means transferring the material to current platforms on a regular basis — that is, before the old platform completely dies, and it becomes hard to do. This movic rhythm of refreshing content should be as smooth as a respiratory cycle — in, out, in, out. Copy, move, copy, move.”

Five years later, Berkeley physicist Carl Haber received the MacArthur “Genius Grant” for doing just this: moving two- or three-dimensional audio recordings off obsolete platforms and decaying storage media and digitally restoring them. These long-lost analog sounds can essentially be played with a virtual needle.

Haber already had the technology in place from his research on imaging radiation. He had cameras precise enough to image and measure the patterns of particles and debris that emerged from subatomic particle collisions. He and his colleagues at the Lawrence Berkeley National Laboratory applied this noninvasive image processing to develop IRENE (Image, Reconstruct, Erase Noise, Etc.). They derived the acronym from the first recording used to demonstrate the concept: The Weavers performing “Goodnight Irene.”

Just like the detailed technique required to measure radiation, mapping the surface of an audio recording requires pictures taken at great magnification. One pixel covers about one micron of the disc or cylinder surface, so to acquire a sufficient digital map the camera scans the object slowly enough to synthesize a gigapixel image:

A disc or cylinder is placed in a precision optical metrology system, where a camera following the path of the grooves on the object takes thousands of images that are then cleaned to compensate for physical damage; the resulting data are mathematically interpolated to determine how a stylus would course through the undulations, and the stylus motion is converted into a standard digital sound file.
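A rough back-of-the-envelope shows why the scans run to gigapixels. The groove length and imaged width below are illustrative order-of-magnitude values, not IRENE’s actual scan parameters; only the one-micron-per-pixel figure comes from the description above:

```python
MICRONS_PER_PIXEL = 1.0  # roughly one micron of surface per pixel

# Hypothetical order-of-magnitude figures for a full LP side:
groove_length_m = 500    # the spiral groove runs to hundreds of meters
groove_width_um = 50     # microns imaged across the groove

length_px = groove_length_m * 1_000_000 / MICRONS_PER_PIXEL  # meters -> microns
width_px = groove_width_um / MICRONS_PER_PIXEL
total_pixels = length_px * width_px
print(f"{total_pixels / 1e9:.0f} gigapixels")  # tens of gigapixels
```

Even with generous rounding, a micron-resolution scan of a whole groove lands in the tens of gigapixels, which is why the camera must crawl along the groove path.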

IRENE, in other words, photographs the recording and mathematically reconstructs the motion a stylus would make in the groove, determining what sound would actually be played. It is all done with an algorithm on a computer, without ever touching the recording.
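The final conversion step can be sketched in miniature. For a lateral-cut groove the playback signal is proportional to the stylus’s lateral velocity, so once a displacement trace has been extracted from the images, differentiating and normalizing it yields audio samples. This toy version assumes the trace is already in hand; the sample rate and scaling are arbitrary:

```python
def trace_to_audio(displacement, sample_rate=44100):
    """Convert a groove-displacement trace into normalized audio samples.

    The playback signal of a lateral-cut groove is proportional to the
    stylus's lateral velocity, so we take a finite difference of the
    displacement trace and normalize it to the [-1, 1] range expected
    by a standard digital sound file.
    """
    # Finite-difference approximation of the stylus velocity.
    velocity = [b - a for a, b in zip(displacement, displacement[1:])]
    peak = max((abs(v) for v in velocity), default=0.0)
    if peak == 0:
        return velocity, sample_rate  # silence stays silence
    return [v / peak for v in velocity], sample_rate
```

The real pipeline also interpolates around damaged regions and resamples to a uniform rate, but the core idea is this displacement-to-velocity step.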

Haber has collaborated with archivists and researchers around the world to test IRENE on a variety of audio recordings. The Smithsonian has about 200 experimental recordings from Volta Laboratory Associates, a collection of some of the earliest audio recordings ever made – “a reflection on the intense competition between (Alexander Graham) Bell, Thomas Edison and Emile Berliner for patents following the invention of the phonograph by Edison in 1877.”

A glass disc recording from Bell’s Volta Lab containing the audio of a male voice repeating “Mary had a little lamb.” Photo: The Smithsonian

Now, in collaboration with the Smithsonian and the Library of Congress, Haber has begun using IRENE to recover these fragile recordings, made of rubber, beeswax, glass, tin foil and brass. The cryptic recordings on such delicate and damaged storage media can now be heard, over a century later:

Early experimental recordings are not the only lost voices being recovered. Anthropologists, linguists and ethnographers were among the first to use recording as a research tool and to document cultural heritage. IRENE has restored some of these field recordings, including wax cylinders from the Alfred L. Kroeber collection at the Phoebe A. Hearst Museum of Anthropology in Berkeley. The collection’s more than 3,000 cylinder recordings document California Native American culture from 01900-01919. Around 300 of the cylinders are 2-3 minute recordings of Ishi, at the time the only surviving member of the Yahi; 53 of them, recorded in 01912, capture Ishi telling the story of the “Wood Duck.”

Earlier this month Smithsonian’s National Museum of Natural History received a $1 million grant from the Arcadia Fund to digitize its endangered-language materials, including the estimated 3,000 hours of sound recordings. The recordings will then be electronically available to the public through the Smithsonian’s catalog system:

Digitization of these materials within the NAA (National Anthropological Archives) will give both scholars and local communities new access to documentation of endangered languages and cultural knowledge about threatened environments around the world, ranging from southern California to small Micronesian atolls.

In September 02010, the Library of Congress released the first comprehensive national study of the preservation of sound recordings in the United States. It found that many historical recordings have already deteriorated or are inaccessible to the public due to their experimental and fragile nature.

“Those audio cassettes are just time bombs,” the study’s co-author Sam Brylawski said. “They’re just not going to be playable.”

But maybe this is not the case after all. Perhaps Haber has taken up an archaeological endeavor that postpones the detonation of these media “time bombs.” It comes back to Kevin Kelly’s idea of movage: Haber has taken big technological steps to digitally play long-lost analog voices; now the key is to keep them moving.

The Cure for Broken Links and Dead Dot-Coms

Posted on Friday, November 1st, 02013 by Catherine Borgeson
Categories: Digital Dark Age

[Image: Wikimedia 404 error page]

“The Internet echoes with the empty spaces where data used to be.”
- Alexis Rossi from the Wayback Machine

The Internet Archive recently unveiled a new plan to fix broken links utilizing the Wayback Machine.

The Wayback Machine provides digital captures of URLs to create stable access to websites that might otherwise vanish. The service launched in 02001 with 10 billion pages. Today it archives 10 billion pages every 10 weeks and currently contains more than 360 billion URL snapshots.

We have been serving archived web pages to the public via the Wayback Machine for twelve years now, and it is gratifying to see how this service has become a medium of record for so many.  Wayback pages are cited in papers, referenced in news articles and submitted as evidence in trials.  Now even the U.S. government relies on this web archive.

Steady improvements to the Wayback Machine have been made over the past year to keep pace with the ever-evolving digital landscape of the Internet. Content went from being a year out of date to appearing in the Wayback Machine an hour after a site is crawled. Anyone can create a permanent URL to cite a page, and the Wayback Machine supports a number of different APIs.

Part of what makes the web so great is its churning and ephemeral nature, but as more and more of our culture and history is built on the dunes of ever-shifting silicates, we stand to stumble forward without a clear sense of where we’ve come from, like Guy Pearce’s amnesiac in Memento. The Wayback Machine improves the web’s memory and, in a way, our own.

In becoming better equipped to keep up with the growing Internet, the Wayback Machine has also become a well-suited solution to the broken-link epidemic. It is working first with individual webmasters and a couple of larger sites, such as WordPress and Wikipedia:

Webmasters can add a short snippet of code to their 404 page that will let users know if the Wayback Machine has a copy of the page in our archive – your web pages don’t have to die!

We started with a big goal — to archive the Internet and preserve it for history.  This year we started looking at the smaller goals — archiving a single page on request, making pages available more quickly, and letting you get information back out of the Wayback in an automated way.  We have spent 17 years building this amazing collection, let’s use it to make the web a better place.
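The 404-page check described above rests on the Wayback Machine’s public availability API (archive.org/wayback/available), which returns JSON describing the closest archived snapshot of a URL. A minimal sketch of the two halves, building the query and parsing a response; the sample response in the usage test is illustrative, not a live capture:

```python
from urllib.parse import urlencode

AVAILABILITY_ENDPOINT = "https://archive.org/wayback/available"

def availability_url(page_url, timestamp=None):
    """Build a Wayback Machine availability-API query for `page_url`.

    An optional `timestamp` (YYYYMMDD...) asks for the snapshot closest
    to that date rather than the most recent one.
    """
    params = {"url": page_url}
    if timestamp:
        params["timestamp"] = timestamp
    return AVAILABILITY_ENDPOINT + "?" + urlencode(params)

def closest_snapshot(response_json):
    """Return the closest archived snapshot's URL from an API response, or None."""
    snapshot = response_json.get("archived_snapshots", {}).get("closest")
    if snapshot and snapshot.get("available"):
        return snapshot["url"]
    return None
```

A 404 handler would fetch `availability_url(request.path)` and, if `closest_snapshot` returns a URL, offer it to the visitor instead of a dead end.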

Toward a Manual for Civilization

Posted on Wednesday, August 14th, 02013 by Austin Brown
Categories: Digital Dark Age, Long Now salon (Interval), Long Term Thinking, Manual for Civilization, The Interval

“We are as gods” because of our ancestors’ diligence. The promise of a technologically advancing future is predicated on millennia of accumulated knowledge. Civilization has taken a lot of work to build and it demands a great deal of know-how to sustain. And as modern life increasingly encourages specialization, familiarity with the breadth of that accumulated knowledge can wane. Our ability to collaborate is a strength, but beyond a point we risk losing comprehension of the infrastructure that supports our modern lives. How can we retain that knowledge?

Long Now Board Member Kevin Kelly has suggested a Library of Utility:

It would be a very selective library. It would not contain the world’s great literature, or varied accounts of history, or deep knowledge of ethnic wonders, or speculations about the future. It has no records of past news, no children’s books, no tomes on philosophy. It contains only seeds. Seeds of utilitarian know-how. How to recreate the infrastructure and technology of civilization so far.

Alexander Rose, our Executive Director, has compiled resources that could become such a Manual for Civilization:

It is an interesting thought exercise to ask yourself what information you might want if you had to truly start over.

And in our forthcoming Salon space at Fort Mason Center, we’ll house approximately 3,500 volumes in a floor-to-ceiling library featuring carefully selected books that could be used to help restart civilization. We are not trying to be apocalyptic or at all predictive, but the conversation that is inspired by this exercise seems to be endless and valuable.

We will collaboratively curate this corpus with Long Now’s members and the public. We understand that by definition we ourselves will have a western-centric viewpoint of what might be collected, but as the project gets going we plan to seek submissions that represent views from as many cultural viewpoints as possible. Several interns have been hired to begin rounding up submissions and our Digital Research Director, Kurt Bollacker, is advising on the information design, indexing architecture, and digital archiving strategy for the collection.

To support its long-term survival and worldwide accessibility, we’ll have a digital version of the collection publicly available on the Internet Archive. And, among its shelves, we’ll have many a great conversation – over tea, coffee, and maybe some whiskey – honoring curiosity, ingenuity and persistence. We hope you’ll join us.

If you share our enthusiasm for this project, please consider supporting the construction of the Salon space in which it will be housed – gifts for supporters include things like a free beverage once the space opens or having a shelf of the Manual’s books dedicated in your honor!

Art & The Art of Archiving at New York’s New Museum

Posted on Monday, August 12th, 02013 by Charlotte Hajer
Categories: Digital Dark Age, Long Term Art


From July 17 to September 8 of this year, the New Museum on Manhattan’s Lower East Side is hosting XFR STN (read ‘transfer station’), an “open-door artist-centered media archiving project.”

A collaborative effort by artists for artists, XFR STN is essentially a preservation and migration service for artwork created with or on audiovisual and digital formats that have since become obsolete. The migrated works will be available publicly through the Internet Archive, and on view at the New Museum’s fifth floor gallery space.

Part public exhibit and part archival laboratory, XFR STN is turning the preservation of art itself into a creative process. It’s an effort at saving art from digital darkness – not only by ensuring its continued accessibility, but by keeping it alive in the public eye.

“Consistent with the dictum ‘distribution is preservation,’ the project argues for circulation as a mode of conservation. ‘XFR STN’ will serve as a collection and dissemination point for artist-produced content, as well as a hub for information about these past projects (including production materials and personal recollections). The project is both a pragmatic public service and an activity as metaphor: an opportunity to present aspects of a mediatic production process in continuous dynamic transformation.”


A New Dimension (or Two?) for Long-Term Data Storage

Posted on Friday, July 26th, 02013 by Charlotte Hajer
Categories: Digital Dark Age, Rosetta


A group of scientists at the University of Southampton is pushing the frontier of long-term data storage technology to a new level. At a recent Conference on Lasers and Electro-Optics in San José, the researchers announced their success at recording data in quartz glass by using a femtosecond laser.

A femtosecond, or ultrafast, laser emits pulses that each last a mere quadrillionth of a second (a decimal point followed by fourteen zeros and a 1). When focused on a piece of quartz glass, these photon bullets rearrange the atoms in the silica, creating what are called nanostructures. The presence of nanostructures changes the way light travels through the quartz, which means they can be ‘read’ by an optical microscope.

Taking advantage of this fact, these Southampton researchers figured out how to use an ultrafast laser to deliberately place nanostructured dots within the quartz glass. A configuration of dots can thereby become five-dimensional code, conveying meaning through its spatial position within the quartz (dimensions one, two, and three), as well as the size and directional orientation of the dot (dimensions four and five). Using this ‘code’, the research team successfully recorded a 300 kb digital text file into a piece of quartz glass, in the form of a holographic ‘image’ of dots that can be read with an optical microscope fitted with a polarizing filter.
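To make the five-dimensional idea concrete, here is a minimal sketch of what such an encoding might look like in software. This is purely illustrative: the names, the bit layout (two size levels squared, two orientation levels squared, so four bits per dot), and the grid geometry are all invented for explanation, not the Southampton team’s actual optical scheme.

```python
# Hypothetical sketch of a 5D dot code: each dot carries information in its
# position (x, y, z) plus its size and angular orientation. Here a byte is
# split into two 4-bit nibbles, each stored as one dot (2 bits in size,
# 2 bits in angle), while position simply indexes the dot in a 3D grid.
from dataclasses import dataclass

@dataclass
class Dot:
    x: int      # dimensions 1-3: position within the quartz grid
    y: int
    z: int
    size: int   # dimension 4: one of 4 dot sizes  -> 2 bits
    angle: int  # dimension 5: one of 4 orientations -> 2 bits

WIDTH = 256  # arbitrary grid width for this illustration

def encode(data: bytes) -> list[Dot]:
    dots = []
    for i, byte in enumerate(data):
        for half, nibble in enumerate(((byte >> 4) & 0xF, byte & 0xF)):
            idx = 2 * i + half
            dots.append(Dot(
                x=idx % WIDTH,
                y=(idx // WIDTH) % WIDTH,
                z=idx // (WIDTH * WIDTH),
                size=(nibble >> 2) & 0b11,
                angle=nibble & 0b11,
            ))
    return dots

def decode(dots: list[Dot]) -> bytes:
    # Recover each 4-bit nibble from size and angle, then pair nibbles
    # back into bytes (dots are assumed to be in write order).
    nibbles = [(d.size << 2) | d.angle for d in dots]
    return bytes((hi << 4) | lo for hi, lo in zip(nibbles[::2], nibbles[1::2]))
```

The point of the sketch is the information budget: because two extra properties of each dot carry data on top of its spatial position, far more can be packed into the same volume than with a simple present/absent mark.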

Silica quartz is attractive as a base for very long-term storage because, like sapphire or nickel, it is strong and can withstand temperatures of up to 1,000° Celsius. The Southampton research team claims that quartz glass could last for a million years:

“It is thrilling to think that we have created the first document which will likely survive the human race,” said Peter Kazansky, professor of physical optoelectronics at the University of Southampton’s Optoelectronics Research Centre. “This technology can secure the last evidence of civilization: all we’ve learnt will not be forgotten.”

Beyond the durability of its material, the potential of this new technology lies in the nano-scale of its encoding: at that order of magnitude (or microtude, if you will), the researchers suggest, a single piece of quartz could hold more than 350 terabytes of data. If the technique can be translated into a real-world product, the researchers claim, this new form of data storage

“… could be highly useful for organizations with big archives. At the moment companies have to back up their archives every five to ten years because hard-drive memory has a relatively short lifespan,” says [principal investigator Jingyu Zhang]. “Museums who want to preserve information or places like the national archives where they have huge numbers of documents, would really benefit.”