“The Near and Far Future of Libraries”, an article in the new publication “Hopes & Fears”, includes an interview with Long Now’s Dr. Laura Welcher on the dangers of the “digital dark age”.
Laura Welcher is Director of the Rosetta Project, The Long Now Foundation’s language-preservation effort that explores storage media that will last thousands of years.
How many autoglossonyms do you know? Presumably, “English”; probably “español”, “français”, and “Deutsch”; perhaps “русский”, “日本語”, “עברית”, or “हिंदी”.
As you may have guessed, an autoglossonym is the name of a language in that language. While most people know a few of them, PanLex, as a Long Now project, aims to discover and document all of them that can be found, all the way into the farthest corners of the world and the remotest eras in time.
PanLex has amassed facts about words in nearly 10,000 language varieties (languages and their dialects). PanLex prefers to use autoglossonyms in naming language varieties; so far we have collected about 9,000, which we believe to be the largest such collection in existence. In some cases we find phrases that mean “language of the X people” or “language of X region” or “our language” used as autoglossonyms. But in about a thousand cases the PanLex team has not yet found autoglossonyms of any kind, and then we substitute exoglossonyms—names used by outsiders.
Finding autoglossonyms is hardest for extinct languages, languages of small groups, and obscure dialects. For example, PanLex has documented eight varieties of Shoshoni, a Uto-Aztecan language of Nevada, Idaho, Wyoming, and Utah, and for three of these we haven’t found autoglossonyms. Our database contains over 2,600 expressions in Big Smokey Valley Shoshoni, but we still don’t know its autoglossonym. It’s possible that speakers of this variety did not have a name for it, or the name has never been recorded. The search continues.
Using exoglossonyms when autoglossonyms are not available can be a delicate issue. As with names for racial and ethnic groups, names that outsiders give to languages are sometimes considered offensive by the people whose languages are being labeled. The words “Lapp” and “Hottentot”, for example, are generally recognized as pejorative terms for the Saami and Nama languages, respectively. But in many cases a non-native speaker would not recognize a language name as pejorative (for example, “Ngiao” for Shan and “Quottu” for Eastern Oromo).
Autoglossonyms can often be found in the documentation produced by other projects, including Ethnologue, Geonames, Lexvo, and Wikipedia. We use data from all these projects, and we make our data available to them in return.
You can see PanLex’s labels for language varieties on the home page of the expert PanLex interface. If you see any autoglossonyms there that you know to be incorrect, or exoglossonyms that you can replace with autoglossonyms, please notify firstname.lastname@example.org.
In November 02011 Brewster Kahle, the founder of the Internet Archive, spoke for Long Now. “We are really striving to build The Library of Alexandria version 2,” says Brewster near the start of his talk, “so that everyone anywhere who is curious to want access can access the world’s knowledge.” He proceeds to assess, one media type at a time, what it will take in effort and disk space to get all the books, recorded music, TV, software, web pages, etc. into an online database. The overall message: “Universal access to all knowledge is within our grasp.”
Long Now members can watch this video here, and can see all Seminar videos in HD. The audio is free for everyone on the Seminar page and via podcast, and video of the 12 most recent Seminars is also free for all to view.
From Stewart Brand’s summary of the talk (in full here):
The Web itself. When the Internet Archive began in 1996, there were just 30 million web pages. Now the Wayback Machine copies every page of every website every two months and makes them time-searchable from its 6-petabyte database of 150 billion pages. It has 500,000 users a day making 6,000 queries a second.
In 02015, less than 4 years later, the Internet Archive’s web archive has grown to over 400 billion pages; and the ever-expanding collections of books, movies, and music have now pushed the total Archive database size over 20 petabytes.
You’ll hear in this talk that Brewster’s and the Archive’s association with The Long Now Foundation goes way back. In fact, the first prototype of the 10,000 Year Clock “bonged” twice to mark the year 02000 in a building shared with the Archive. Long Now continues to partner with the Archive in many ways, including on Rosetta Project activities and the Manual for Civilization. And we intend for our partnership to continue for, at the very least, a few more millennia.
Brewster Kahle is the founder and chairman of the Internet Archive. He earned a B.S. from MIT in 1982, where he studied artificial intelligence with Long Now co-founder Daniel Hillis. Brewster Kahle serves on the boards of the Electronic Frontier Foundation, Public Knowledge, the European Archive, the Television Archive, and the Internet Archive.
Photo by Rudy Rucker
The Seminars About Long-term Thinking series began in 02003 and is presented each month live in San Francisco. It is curated and hosted by Long Now’s President Stewart Brand. Seminar audio is available to all via podcast.
Everyone can watch full video of the 12 most recent Long Now Seminars. Long Now members can watch video of this Seminar or more than ten years of previous Seminars in HD. Membership levels start at $8/month and include lots of benefits.
Tuesday February 17, 02015 – San Francisco
“Temporary, moderate, and responsive” should be the guidelines of responsible geoengineering, in David Keith’s view. For slowing global warming, and giving humanity time to bring greenhouse gas emissions down to zero (and eventually past zero with carbon capture), he favors the form of “solar radiation management” that reflects sunlight the way volcanoes occasionally do—with sulfate particles in the stratosphere.
The common worry about geoengineering is that because it is so cheap ($1 billion a year) and easy, civilization would become “addicted” and have to continue it forever, while giving up on the expensive and difficult process of reducing greenhouse gas emissions, thus making the long-term problem far worse. Keith’s solution is to design the geoengineering program as temporary from start to finish. “Temporary” means shut it down by 02200. (Keith also likes the term “patient” for this approach.)
By “moderate” he means there is no attempt to completely offset the warming caused by us, but just cut the rate of climate change in half. That would give the highest benefit at lowest risk—minimal harmful effect on ozone and rainfall patterns, and the fewest unwelcome surprises, while providing enough time (and plenty of incentive) for societies to manage their carbon dioxide mitigation and orderly adaptation. Geoengineering’s leverage is very high—one gram of particles in the stratosphere prevents the warming caused by a ton of carbon dioxide.
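The leverage figure above invites a quick back-of-the-envelope check. The sketch below is illustrative arithmetic only; the 36-gigaton annual global CO₂ emissions figure is an outside assumption for scale, not a number from the talk:

```python
# Back-of-the-envelope check of the "1 gram of particles offsets
# the warming of 1 ton of CO2" leverage claim.
# Assumption (not from the talk): global CO2 emissions of ~36 Gt/year.

GRAMS_PER_TON = 1_000_000          # 1 metric ton = 1e6 grams
LEVERAGE = 1 * GRAMS_PER_TON       # 1 g per 1 t of CO2 -> a millionfold mass leverage

annual_co2_tons = 36e9             # ~36 gigatons of CO2 per year (assumed)

# Particle mass needed to offset that year's emissions, at 1 g per ton:
particle_grams = annual_co2_tons * 1.0          # grams of particles
particle_tons = particle_grams / GRAMS_PER_TON  # converted back to tons

print(f"Leverage factor: {LEVERAGE:,}x by mass")
print(f"Particles needed: {particle_tons:,.0f} tons/year")
```

At one gram per ton, offsetting a full year of assumed global emissions works out to tens of thousands of tons of particles, the same order of magnitude as the quarter-million tons per year described in the deployment scenario below, which targets accumulated warming rather than a single year’s emissions.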
“Responsive” means careful, gradual, and closely monitored, with the expectation there will be many adjustments along the way, along with the ability to back off entirely if needed. Though climate-change models keep improving, we still do not completely understand how climate works, and that raises the very good question: “How do you engineer a system whose behavior you don’t understand?” Keith’s answer is “feedback. We engineer and control many chaotic systems, such as high-performance aircraft, through precise feedback.” The same goes for governance of geoengineering. It is a complex system that will require sophisticated control by a global set of governing bodies, but we already do that for the far more complex system of global finance.
Keith’s specific program would begin with balloon tests in the lower stratosphere (8 miles up) releasing just 100 grams of sulfuric acid—about the amount of particles in a few minutes of normal jet contrail. “If those studies confirm safety and effectiveness,” Keith said, “then we could begin gradual deployment as early as 02020 with three business jets re-engineered for high altitude. By 02030 you could have about ten aircraft delivering a quarter million tons of sulfur per year at a cost of $700 million.”
The amount of sulfur being released might be up to a million tons by 02070, but that would still be only one-eighth of what went into the stratosphere from the Mt. Pinatubo volcanic eruption in 01991, and one-fiftieth of what enters the lower atmosphere from our current burning of fossil fuels. By then we may have developed more sophisticated particles than sulfate. It could be diamond dust, or alumina, or even something like a nanoscale “photophoretic” particle designed by Keith that would levitate itself above the stratosphere.
This is no quick fix. It is not quick, and it doesn’t try to be a complete fix. It has to be matched with total reduction of greenhouse gas emissions to zero and with effective capture of carbon, because the overload of carbon dioxide already in the atmosphere will stay there for a very long time unless removed. Keith asked, “Is it plausible that we will not figure out how to pull, say, five gigatons of carbon per year out of the air by 02075? I don’t buy it.”
Keith ended by proposing that the goal should not be just 350 parts per million (ppm) of carbon dioxide in the atmosphere. (It’s rising past 400 ppm now.) We can shoot for the pre-industrial level of the 01770s. Take carbon dioxide down to 270 ppm.
Subscribe to our Seminar email list for updates and summaries.
Tickets are now on sale: space is limited and we expect this talk to sell out.
If you are reading this then Jason Scott has probably backed up bits that matter to you–whether you are an ex-SysOp or only use the Web to read this blog. Jason works on no fewer than three important online archives, each of which is invaluable in preserving our digital history. He’s also made two documentaries about pre-Web networked-computer culture: The BBS Documentary and Get Lamp.
Jason created textfiles.com in 01998 to make thousands of files once found on online bulletin board systems (BBSs) available again after they had become scarce in the era of the World Wide Web. He founded Archive Team in 02009 to rescue user data from popular sites that shut down with little notice; this loose collective has launched “Distributed Preservation of Service Attacks” to save content from Friendster, GeoCities, and Google Reader, amongst others. In 02011 Jason began curating the Internet Archive’s software collection: the largest online library of vintage and historical software in the world.
Long Now welcomes Jason Scott to our Conversations at The Interval series:
“The Web in an Eye Blink” on Tuesday, February 24, 02015
The Internet is a network of networks that has the ability to bring the far reaches of the world closer, seemingly in real time. A Big Here in a short Now. But there’s a Long Now of the Internet also in that it connects us through time: a shared memory of readily accessible information. Accessible as long as the resource exists somewhere on the system. So the Internet should give worldwide access to our long memory, all the content we’ve ever put online, right? Unfortunately there are some challenges. But happily we have allies.
The network path to a specific document is a technological chain. And it can be a brittle one. The chain’s components include servers, cables, protocols, programming code, and even human decisions. If one connection in the chain fails–whether it’s the hardware, software, or just a hyperlink–the information becomes inaccessible. And perhaps it’s lost forever. This problem is an aspect of what we call the Digital Dark Age.
The high-technology industry is by its nature driven by innovation and obsolescence, so new models and software updates often undermine that network chain in the name of progress. But the tech industry’s competitive business environment poses another threat to our online memory. Mergers and acquisitions shift product offerings regardless of customer sentiment, let alone the historical importance of a site or service. Users who have invested time and effort in creating content and customizing their accounts often get little notice of a site’s impending demise. And it’s rare for companies to provide tools or guidance to help customers preserve their own data. That’s why Jason started Archive Team: to save digital assets for users not empowered to do so themselves. Initially reactive, the team now keeps a “Death Watch” that flags sites that don’t look long for this world, warning users ahead of time.
There is no FDIC for user data. Jason Scott and the Archive Team are all we’ve got.
Assuming that your photographs, writing, email, or other data is important to you … you should always be looking for an export function or a way to save a local backup. If a company claims it’s too hard, they are lying. If they claim that they have everything under control, ignore that and demand your export and local backup again.
The broken link may be the most pernicious of the many breaks that can occur in the network chain. When a linked file is removed or renamed, even if it’s been available for years, all access is instantly cut off. The Wayback Machine, a database of 452 billion previously downloaded web pages, is a solution to that problem and it’s the main feature of archive.org that most people use today.
But the Internet Archive is much more than a database of web pages. It is a non-profit library whose ever-growing collection of books, movies, software, music, and more is available online for free. The Archive preserves and expands our collective memory. And Jason’s work with archiving vintage software is especially groundbreaking.
And these games, which ran on Apple IIs, TRS-80s, Atari 2600s, and more during the historically important and nostalgia-rich era of the 01980s and 01990s, will now run in many 21st-century web browsers.
Which browsers? Maybe your browser. Can you see a black box below? If so, click the button and you’ll have a chance to play Apple’s 01979 “Apple ][” game Lemonade Stand. Have fun. And be patient. Things were slower then.
You can thank Jason Scott for uploading this and thousands of other fun, useful, or just old pieces of software.
In fact, you can thank him in person when he speaks at The Interval at Long Now this Tuesday, February 24, 02015: “The Web in an Eye Blink”. Jason will talk about his work in the frame of the Long Now.
Get your tickets soon, we expect this talk to sell out.
See Andy Baio’s piece on Medium for more thoughts on Jason’s work and the implications of the Archive’s software collection.
Photos by Jason Scott unless otherwise noted
An important part of long-term thinking is the never-ending search for very long-lived methods of information storage. A perfect, eternal storage medium still eludes us; most of the ones we’ve invented and used over the course of civilization have had their limitations – even stone, nickel, and sapphire have a shelf life.
But new research by a team of physicists now suggests that searching for a storage medium that lives forever may be a waste of energy, because the laws of physics themselves limit the amount of time that any information can be kept.
In a paper recently published in the New Journal of Physics, the researchers review how spacetime dynamics might influence the storage of information by asking how much data we can reliably hold on to from the beginning to the end of time.
In order to answer that question, the team combined Einsteinian cosmology with quantum theories about the nature of matter and reality. They worked with a standard model of the universe, called the Friedmann-Lemaître-Robertson-Walker metric: based on Einstein’s theory of general relativity, it describes a universe that is homogeneous and isotropic, and therefore expands (or contracts) uniformly in all directions.
Working with this metric, the researchers modeled what would happen to stored data over the course of universe expansion. When you encode information into some kind of matter and then track what happens to your storage medium throughout the life course of the universe, you’ll find that the quantum state of its matter (in other words, its properties: its position, momentum, and spin) will eventually and inevitably change. The research team was able to prove that this change in state creates ‘noise’ that dampens the stored information. One of the research physicists explains the process in this video abstract of the paper:
The faster the universe expands, the team argues, the more ‘noise’ interferes with stored data. Looking at the storage of both classical information (anything encoded in bits) and quantum information (anything encoded by the quantum state of a given particle), they conclude that not very much data will last from the beginning to the end of time.
In other words, it seems as though we may be doomed to an eventual quantum dark age. Unless, of course, we always take care to anticipate these state changes, and continuously forward-migrate our data.
In May 02009 author and food activist Michael Pollan spoke for Long Now about Deep Agriculture. At the time, Barack Obama had recently been elected President, and Pollan takes the opportunity to give a “state of the movement” report on efforts to reform the US food system.
Full audio and video of this Seminar is free for everyone to watch on the Long Now Seminar site and via podcast. Long Now members can see all Seminar videos in HD. Video of the 12 most recent Seminars is always free for all to view.
His assessment finds a system built on cheap oil that has negative impacts on our health and jeopardizes our security. In a word, Pollan calls it unsustainable. It takes 10 calories of fossil fuel energy to bring 1 food calorie to the table. It’s important, he says, that people realize “we are eating oil.”
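To give that 10-to-1 ratio some concrete scale, here is a small illustrative calculation. The 2,500 kcal daily diet and the energy content of gasoline are rough assumptions for the sake of the example, not figures from the talk:

```python
# Illustrative scale of the 10:1 fossil-to-food calorie ratio.
# Assumptions (not from the talk): a 2,500 kcal/day diet, and
# gasoline at roughly 8,000 kcal of energy per liter.

FOSSIL_CALORIES_PER_FOOD_CALORIE = 10   # ratio cited in the talk

daily_food_kcal = 2500                  # assumed daily diet
daily_fossil_kcal = daily_food_kcal * FOSSIL_CALORIES_PER_FOOD_CALORIE

kcal_per_liter_gasoline = 8000          # rough energy-density assumption
liters_equivalent = daily_fossil_kcal / kcal_per_liter_gasoline

print(f"Fossil energy behind one day's food: {daily_fossil_kcal:,} kcal")
print(f"Roughly {liters_equivalent:.1f} liters of gasoline per person per day")
```

Under these assumptions, feeding one person for a day takes the energy of roughly three liters of gasoline, which is the sense in which “we are eating oil.”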
From Kevin Kelly’s summary of this Seminar (in full here):
The benefit of a reformed food system, besides better food, better environment and less climate shock, is better health and the savings of trillions of dollars. Four out of five chronic diseases are diet-related. Three quarters of medical spending goes to preventable chronic disease. Pollan says we cannot have a healthy population without a healthy diet. The news is that we are learning that we cannot have a healthy diet without a healthy agriculture. And right now, farming is sick.
Michael Pollan is an award-winning author, a critic of and activist against the industrialized food system, whose books include The Omnivore’s Dilemma, In Defense of Food: An Eater’s Manifesto, and most recently Cooked: A Natural History of Transformation. He is also a former executive editor for Harper’s Magazine.
Stewart Brand and Paul Saffo — photos by Ian Kennedy
Stewart introduced pace layers in The Clock of the Long Now (01999). In the book he credits Freeman Dyson and Brian Eno, amongst others, for influencing his thinking on intra-societal tiers that move at differing speeds. At The Interval he went deeper into the origins of the idea, citing the concept of “Shearing Layers” by architect Frank Duffy as a precursor to pace layers. Stewart featured Duffy’s idea in his 01994 book How Buildings Learn: What Happens After They’re Built.
The beginnings of the concept can be found in this excerpt from How Buildings Learn:
To change is to lose identity; yet to change is to be alive. Buildings partially resolve the paradox by offering the hierarchy of pace – you can fiddle with the Stuff and Space Plan all you want, while the Structure and Site remain solid and reliable.
How Buildings Learn was influential in surprising ways. It was embraced for its applicability to systems thinking and software design amongst other things. And pace layers continues to be utilized as a framework for understanding many different kinds of complex, overlapping systems. Here’s a recent example:
Stewart shared an amazing artifact: a preliminary sketch of the pace layers diagram dating to December 01996. He drew it after discussions with Brian Eno. It includes a couple of the final edits which he mentions in his talk: the top layer was changed to Fashion from “Art”, at Eno’s insistence. And “Government” is here annotated with “Governance” which it would become in the final version.
In The Clock of the Long Now Stewart presents the diagram and lays out the pace layers concept in a 6-page chapter. He draws parallels to natural systems. He cites numerous sources that have informed his thinking. It’s remarkably efficient in presenting this idea, managing to be dense and readable at the same time.
While this Interval talk is a great introduction, if you are intrigued by pace layers then you should read the book. It is 200 pages long and includes many more great ideas. It’s also a great history of The Long Now Foundation. And it features probably the most entertaining correspondence with the Internal Revenue Service that you will ever read. Just saying.
In pace layers the relationship between layers is key to the health of the system. More specifically, as both Stewart and Paul point out, the conflicts caused by layers moving at different speeds actually keep things balanced and resilient. Paul calls this “constructive turbulence.” Stewart’s slide details how fast (upper) and slow (lower) layers interact.
Paul Saffo teaches pace layers in his forecasting classes at Stanford, and he compared the speed of change in Silicon Valley to the slow shifting of the San Andreas Fault below. In fact he sees The Valley as having its own particular ecosystem of pace layers forming a standing vortex (akin to the eye of Jupiter). For those of us here in the Bay Area: “We all live on von Kármán vortex street.”
Paul called on a couple of people in the audience to speak about how they use pace layers. Customer feedback, as it were. First up was Andrea Saveri, who uses pace layers in futures education. For the 14- to 21-year-old students she works with, it provides a concrete way to think about time horizons, abstract thinking, and the long-term future.
Peter Leyden was a colleague of Stewart’s at Global Business Network (GBN) around the time pace layers was published. Today, he noted, Culture and Nature are accelerating (driven by technology and climate change, respectively) but Governance isn’t responding. Even as the layers move below it, it’s not innovating.
Stewart had an answer for that: look to cities and city-states (such as Singapore), which are less ideological and may be the change agents in Governance that are needed.
“This is a data-free document,” Stewart replied to a data-specific audience question. And maybe that lack of hard data has aided its longevity and versatility. It’s a framework, a way to look at issues in a society; or it can be projected onto other systems. The lines are not hard lines; borders between layers are turbulent zones “where the action is”. But it’s not tied to “facts” which may turn out to have expiration dates.
Here’s how Stewart introduced pace layers back in 01999:
I propose six significant levels of pace and size in the working structure of a robust and adaptable civilization. [...] In a healthy society each level is allowed to operate at its own pace, safely sustained by the slower levels below and kept invigorated by the livelier levels above.
He put forward pace layers as an ideal model, knowing that no society made of humans will operate perfectly to plan. After a decade and a half of continued use, the model seems to have garnered overwhelmingly positive feedback. It continues to be a useful bridge to long-term thinking. Its conceptual outline has been applied successfully to many areas of human activity. Maybe we should no longer call it “an idea.” And call it wisdom instead.
We hope you’ll listen to this talk in full and tell us what you think of the Conversations series. We’d like to make more audio and video of these events available in the future. Long Now is looking for grants and sponsorships to underwrite the production and distribution of these talks on a wider scale. Please let us know if you have ideas on that.
Tuesday January 13, 02015 – San Francisco
Over the last 40 years, in nearly every field, human productivity has decoupled from resource use, Ausubel began. Even though our prosperity and population continue to increase, the trends show decreasing use of energy, water, land, material resources, and impact on natural systems (except the ocean). As a result we are seeing the beginnings of a global restoration of nature.
America tends to be the leader in such trends, and the “American use of almost everything except information seems to be peaking, not because the resources are exhausted but because consumers changed consumption and producers changed production.”
Start with agriculture, which “has always been the greatest raper of nature.” Since 01940 yield has decoupled from acreage, and yet the rising yields have not required increasing inputs such as fertilizer, pesticides, or water. The yield from corn has become spectacular, and it is overwhelmingly our leading crop, but most of it is fed to cars and livestock rather than people. Corn acreage the size of Iowa is wasted on biofuels. An even greater proportion goes to cows and pigs for conversion to meat.
The animals vary hugely in their efficiency at producing meat. If they were vehicles, we would say that “a steer gets about 12 miles per gallon, a pig 40, and a chicken 60.” (In that scale a farmed fish gets 80 miles per gallon.) Since 01975 beef and pork consumption have leveled off while chicken consumption has soared. “The USA and the world are at peak farmland,” Ausubel declared, “not because of exhaustion of arable land, but because farmers are wildly successful in producing protein and calories.” Much more can be done. Ausubel pointed out that just reducing the one-third of the world’s food that is wasted, rolling out the highest-yield techniques worldwide, and abandoning biofuels would free up an area the size of India (1.2 million square miles) to return to nature.
As for forests, nation after nation is going through the “forest transition” from decreasing forest area to increasing. France was the first in 01830. Since then their forests have doubled while their population also doubled. The US transitioned around 01950. A great boon is tree plantations, which have a yield five to ten times greater than logging wild forest. “In recent times,” Ausubel said, “about a third of wood production comes from plantations. If that were to increase to 75 percent, the logged area of natural forests could drop in half.” Meanwhile the consumption of all wood has leveled off—for fuel, buildings, and, finally, paper. We are at peak timber.
One byproduct of increased carbon dioxide in the atmosphere and the longer temperate-zone growing seasons accompanying global warming is greater plant growth. “Global Greening,” Ausubel said, “is the most important ecological trend on Earth today. The biosphere on land is getting bigger, year by year, by two billion tons or even more.”
Other trendlines show that world population is at peak children, and in the US we are at peak car travel and may even be at peak car. The most efficient form of travel, which Ausubel promotes, is maglev trains such as the “Hyperloop” proposed by Elon Musk. Statistically, horses, trains, cars, and jets all require about one ton of vehicle per passenger. A maglev system would require only one-third of that.
In the ocean, though, trends remain troubling. Unlike on land, we have not yet replaced hunting wild animals with farming. Once refrigeration came along, “the democratization of sushi changed everything for sea life. Fish biomass in intensively exploited fisheries appears to be about one-tenth the level of the fish in those seas a few decades or hundred years ago.” One fifth of the meat we eat comes from fish, and about 40 percent of that fifth is now grown in fish farms, but too many of the farmed fish are fed with small fish caught at sea. Ausubel recommends vegetarian fish such as tilapia and “persuading salmon and other carnivores to eat tofu,” which has already been done with the Caribbean kingfish. “With smart aquaculture,” Ausubel said, “life in the oceans can rebound while feeding humanity.”
When nature rebounds, the wild animals return. In Europe, wolves, lynx, and brown bears are moving through abandoned farmlands, repopulating lands that haven’t seen them for centuries, and they are being welcomed. Ten thousand foxes roam London. Salmon are back in the Thames, the Seine, and the Rhine. Whales have recovered and returned even to the waters off New York. Ausubel concluded with a photo of a humpback whale breaching, right in line with the Empire State Building in the background.
On Tuesday, February 17, David Keith will present Patient Geoengineering, as part of our monthly Seminars About Long-Term Thinking. Each month the Seminar Primer gives you some background information about the speaker, including links to learn even more.
In 01991, Mount Pinatubo – a largely forgotten and underestimated volcano in the Philippines – erupted in what would turn out to be one of the 20th century’s most significant geological events. It shot about 20 million tons of sulfur dioxide high into the atmosphere, producing a cloud of sulfuric acid aerosols that quickly spread across the planet and managed to lower global temperatures by about 0.5 °C for the next few years.
This one-time event thereby managed to achieve what decades of political discussion about curbing CO₂ emissions has so far been unsuccessful at doing: counteracting the unprecedented global warming of our planet. Could Mount Pinatubo be pointing us to a viable new solution for climate change?
Many people, climate scientists included, are wary of proposals to reverse or reduce global warming by tinkering directly with Earth’s climate and atmosphere. Such efforts at geoengineering, they worry, could have unforeseen and dangerous regional side effects that we may not be able to control or reverse. What if it interferes with local patterns of rainfall – or produces powerful storms?
But after decades of getting nowhere with emissions caps, argues David Keith, we simply can no longer afford not to put these ideas on the table.
Keith is an applied physicist and climate scientist at Harvard, with dual appointments in the university’s schools of engineering and public policy. He splits his time between Cambridge and Calgary, where he runs Carbon Engineering – a company developing technologies for capturing carbon dioxide from the atmosphere and turning it into low-carbon fuel.
Keith dedicates both his academic and entrepreneurial efforts to the exploration of climate engineering. While his company works on methods to directly reduce the amount of CO₂ in the air, his research explores ways to counteract human contributions to rising CO₂ levels by diminishing the amount of solar energy that reaches Earth’s surface. Indeed, one method for this kind of Solar Radiation Management (SRM) takes a cue from Mount Pinatubo, and would involve the release of sulfate particles into the upper atmosphere:
There’s no question [solar radiation management] reduces the global average temperatures; even the people who hate it agree you could reduce average global temperatures. The question is: how does it do on a regional basis? By far the single most important thing to look at on a region-by-region basis is the impact on rainfall and temperature. And the answer is, it works a lot better than I expected. It’s really stunning. A lot of us thought that, in fact, geoengineering would do a lousy job on a regional basis – and there’s lots of talk on the inequalities – but in fact, when you actually look at the climate models, the results show they’re strikingly even.
Nevertheless, Keith by no means suggests that humanity should begin experimenting with these methods immediately, nor that they should be considered a viable and ethical alternative to cutting CO₂ emissions. Above all, he argues for thoughtful discussion, rigorous research, and global consensus about the best way forward. We must be patient and thorough. As he told Time Magazine in 02009, when the weekly named him a Hero of the Environment, “The thing about tools … is not that you have to use them: it’s that you have to understand them.”
Join us next Tuesday, February 17th at SFJAZZ Center to hear David Keith present his case for patient geoengineering.