Published on Tuesday, November 15th, 02011 by Alex Mensing
Published on Sunday, November 13th, 02011 by Alexander Rose - Twitter: @zander
(Astronaut removing the MISSE-7 Experiment with our sample on EVA1 on the STS-134 mission)
Back in 02009, through a partnership with Applied Minds and, in turn, the Air Force Research Lab (who generously invited us to include a sample), we sent one of our Rosetta materials on an experiment called MISSE-7 (pronounced “missey”), which flew on the International Space Station. This experiment is a shorter-term version of the materials research begun in 01984 with the Long Duration Exposure Facility. We sent a sample of commercially pure titanium that was black-oxide coated and laser marked (pictured below left). This is the same material and oxide process that was used to create the front of the original Rosetta Disk. However, we used a much lower-power laser than was used on the Rosetta Disk, so the marking was not very deep. The sample was just returned to us (below right) after its stint outside the ISS and looks none the worse for wear, except for a slight fade in the clarity of the etching.
(Sample before it was sent on left and after returning on right)
This marks our second space-rated Rosetta Disk material; the first was the nickel material currently on ESA's Rosetta mission. Below is all the information I have found about the MISSE-7 mission so far. I am trying to locate the sections of the EVA videos where the experiment is installed and removed. Any help is appreciated.
Published on Friday, November 11th, 02011 by Alexander Rose - Twitter: @zander
Founding Long Now board member Brian Eno was on The Colbert Report last night, where he got a chance to discuss Long Now and the Clock Project. Also not to be missed is the end segment, in which Brian, Stephen, and Michael Stipe sing a cappella.
Published on Monday, November 7th, 02011 by Alex Mensing
Researchers at the McKinsey Global Institute have been studying the process of urbanization – what works and what doesn’t – and argue in this article that the detrimental effects of rapid city growth are not directly the result of insufficient resources. Rather, they stem from management that is neither comprehensive enough nor farsighted enough.
Does this imply that the future will be one of massive megalopolises spread across the globe? Theoretically, the answer is yes—there is no limit to the size of cities. In practice, however, the growth of most urban centers is bound by an inability to manage their size in a way that maximizes scale opportunities and minimizes costs. Large urban centers are highly complex, demanding environments that require a long planning horizon and extraordinary managerial skills. Many city governments are simply not prepared to cope with the speed at which their populations are expanding.
Theoretical physicist Geoffrey West spoke at The Long Now Foundation’s SALT series in July of 02011 and discussed how cities tend to become more efficient and productive as they grow, and that they do so at an exponential rate. The challenge, as he described it, is that cities have to innovate faster and faster in order to keep up with superlinear growth. So how can city governments cope? The authors of the McKinsey Global Institute article, Richard Dobbs and Jaana Remes, outline four principles to guide the leaders of quickly growing metropolises:
First, successful cities need sufficient funding to finance their running costs and new infrastructure. Sources of funding could include monetizing land assets and levying property taxes, sales taxes, or user charges. Second, cities need modern, accountable governance; many large successful cities, including London and New York, have opted for empowered mayors with long tenures and clear accountability. Third, cities need proper planning that spans a 1- to 40-year horizon. Finally, all cities should craft dedicated policies in critical areas such as affordable housing.
Published on Wednesday, November 2nd, 02011 by Austin Brown
There is new media available from our monthly series, the Seminars About Long-term Thinking. Stewart Brand’s summaries and audio downloads or podcasts of the talks are free to the public; Long Now members can view HD video of the Seminars and comment on them.
Published on Tuesday, November 1st, 02011 by Austin Brown
The Rosetta Project at The Long Now Foundation is working to build an open public digital collection of all human language, as well as an analog backup that can last for thousands of years – the Rosetta Disk. In the “long now,” the goal is long-term storage of and access to information – on the scale that both supports and transcends individual human societies and civilizations. In the “here and now,” the project serves to support and amplify the importance of the world’s nearly 7,000 human languages, the vast majority of which are endangered and, if current trends continue, likely to go extinct in the next 100 years. I’ll present our current work on the Rosetta Project Collection and Disk as well as some new initiatives, including the “Language Commons,” where we are working to help build the multilingual Web.
There will be a reception afterwards; come say hello.
Published on Monday, October 31st, 02011 by Austin Brown
As founder and librarian of the storied Internet Archive (deemed impossible by all when he started it in 01996), Brewster Kahle has practical experience behind his universalist vision of access to every bit of knowledge ever created, for all time, ever improving.
He will speak to questions such as these:
Can we make a distributed web of books that supports vending and lending? How can our machines learn by reading these materials? Can we reconfigure the information to make interactive question answering machines? Can we learn from past human translations of documents to seed an automatic version? And, can we learn how to do optical character recognition by having billions of correct examples? What compensation systems will best serve creators and networked users? How do we preserve petabytes of changing data?
Published on Friday, October 28th, 02011 by Alex Mensing
In the effort to understand our environment, scientists generally rely on natural observations to describe the earth’s past. They examine tree rings, oxygen isotopes, sedimentary rock, pollen, and many other physical records from which we can glean information. These methods are quite fruitful, and when combined they offer compelling evidence. But wouldn’t it be nice if, at least for the last few millennia, our ancestors had just recorded all of that information for us?
Occasionally they did, particularly when they encountered conditions or events that they considered extremely important: swarms of locusts that ate all of their food, for example. Conservation Magazine describes a project by a team of scientists in China who have compiled over 8,000 historical documents that chronicle the insect’s effects:
“Outbreak of Oriental migratory locusts (Locusta migratoria manilensis) was, together with drought and flood, considered one of the three most severe natural disasters causing damage to crop production in ancient China,” a team led by Huidong Tian of the Chinese Academy of Sciences in Beijing notes in the Proceedings of the National Academy of Sciences. “The earliest known written record of locusts was found inscribed on an ox bone in Oracle Script (Jiaguwen, the earliest Chinese script) 3,500 years ago, asking: ‘Will locusts appear in the field; will it not rain?’” Ever since, local histories and government documents have been littered with detailed records of locust outbreaks.
The study has shown a link between dry conditions and locust outbreaks, providing a rare biological source of evidence for climate variations. Regardless of whether or not the authors of these documents intended for them to be useful to future generations, their efforts to describe and catalog their environment in an enduring medium have proven very valuable to us, thousands of years later.
Published on Thursday, October 27th, 02011 by Alex Mensing
About ten years ago, The Long Now Foundation initiated an effort to document every living organism on the planet within 25 years. The project was called All Species, and while it did not survive the dot-com bust, its work was carried on by initiatives such as the Encyclopedia of Life and the Census of Marine Life. Because our knowledge of the planet's biological diversity is incomplete, scientists have always been uncertain just how many species are left to identify. Recently, however, a paper was published in the open-access biology journal of the Public Library of Science that approaches the question in a novel statistical way. The results are striking: they indicate that the 1.2 million or so species scientists have described to date comprise a mere 14% of the total number inhabiting our planet.
Our current estimate of ~8.7 million species narrows the range of 3 to 100 million species suggested by taxonomic experts, and it suggests that after 250 years of taxonomic classification only a small fraction of species on Earth (~14%) and in the ocean (~9%) have been indexed in a central database (Table 2). Closing this knowledge gap may still take a lot longer. Considering current rates of description of eukaryote species in the last 20 years (i.e., 6,200 species per year; ±811 SD; Figure 3F–3J), the average number of new species described per taxonomist’s career (i.e., 24.8 species) and the estimated average cost to describe animal species (i.e., US$48,500 per species) and assuming that these values remain constant and are general among taxonomic groups, describing Earth’s remaining species may take as long as 1,200 years and would require 303,000 taxonomists at an approximated cost of US$364 billion. With extinction rates now exceeding natural background rates by a factor of 100 to 1,000, our results also suggest that this slow advance in the description of species will lead to species becoming extinct before we know they even existed. High rates of biodiversity loss provide an urgent incentive to increase our knowledge of Earth’s remaining species.
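The figures quoted above are a straightforward back-of-envelope calculation from the paper's own inputs (~8.7 million species total, 1.2 million described, 6,200 described per year, 24.8 per taxonomist's career, US$48,500 per species). A minimal sketch of that arithmetic, using only the numbers in the quote:

```python
# Back-of-envelope check of the estimates quoted from the PLoS paper.
total_species = 8.7e6        # estimated species on Earth
described = 1.2e6            # species described to date
rate_per_year = 6200         # eukaryote species described per year
species_per_career = 24.8    # species described per taxonomist's career
cost_per_species = 48500     # US$ to describe one animal species

remaining = total_species - described               # ~7.5 million undescribed
years = remaining / rate_per_year                   # ~1,200 years at current pace
taxonomists = remaining / species_per_career        # ~303,000 careers' worth of work
cost_billions = remaining * cost_per_species / 1e9  # ~US$364 billion

print(f"~{years:.0f} years, ~{taxonomists:,.0f} taxonomists, ~${cost_billions:.0f} billion")
```

Rounding the results recovers the paper's headline figures of 1,200 years, 303,000 taxonomists, and US$364 billion.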
On the bright side, there are some encouraging technological advances in social media and genetic identification that are increasing the efficiency of documenting new organisms. The internet facilitates the development of grassroots or amateur scientific projects, and it more widely distributes the daunting task of identifying another seven and a half million species (a task which would otherwise be all the more daunting in light of the dwindling number of professional taxonomists). One such project, featured previously on this blog, is known as 10000 birds and aims to photograph every bird in the world, providing a public database of avian images. For the important task of genetic documentation, DNA barcoding offers an efficient way of analyzing the genetic makeup of new specimens.
With these technologies and the development of others, it may indeed be possible to achieve a comprehensive description of life on earth in a time span closer to Long Now’s 25-year goal for the All Species project than to the twelve centuries cited by the study above. And why develop such a catalog? Robert May of Oxford University’s Zoology department wrote a compelling companion piece to the study in the Public Library of Science’s journal.
[...] we increasingly recognise that such knowledge is important for full understanding of the ecological and evolutionary processes which created, and which are struggling to maintain, the diverse biological riches we are heir to. Such biodiversity is much more than beauty and wonder, important though that is. It also underpins ecosystem services that—although not counted in conventional GDP—humanity is dependent upon. [...] The essential fact is that, if we are to meet the challenges facing tomorrow’s world, we need a clearer understanding of how many species there are—both on land and in the even less well-studied oceans—underpinning the structure and functioning of ecosystems.
Published on Wednesday, October 26th, 02011 by Alex Mensing
Long Now Board Member David Eagleman will be speaking as part of the Bay Area Science Festival presentation “Will We Ever Understand the Brain?” on Wednesday, November 2, 02011. Eagleman will discuss with Henry Markram, coordinator of the Human Brain Project, whether the myriad functions of the brain will someday be clear to us, or whether they will always remain something of a mystery.
Eagleman is a neuroscientist at the Baylor College of Medicine as well as an author whose works include the fictional Sum: Forty Tales from the Afterlives and most recently, Incognito: The Secret Lives of the Brain.