How long is humanity's future?


Much like the Centre for the Study of Existential Risk at Cambridge, the Future of Humanity Institute at Oxford spends significant effort grappling with scenarios that could lead to the human species’ demise.

The Institute is headed by Nick Bostrom, a scholar of philosophy, physics, computational neuroscience, and mathematical logic. Aeon Magazine’s Ross Andersen recently spoke with Bostrom and several other researchers at the Institute to ask what kinds of risks we should really be taking seriously:

The risks that keep Bostrom up at night are those for which there are no geological case studies, and no human track record of survival. These risks arise from human technology, a force capable of introducing entirely new phenomena into the world.

Studying risk of any kind leads inevitably to questions of statistics and probability – things human intuition is notoriously bad at comprehending. Fortunately, what nature did not give us, we can still nurture in ourselves. Bostrom is relentless in his mathematical and logical approach to the probability of different outcomes and the utility they afford the human race. Illustrating this utilitarian approach, Andersen paraphrases Bostrom’s explanation of why studying existential risk is so valuable:

We might be 7 billion strong, but we are also a fire hose of future lives, that extinction would choke off forever. The casualties of human extinction would include not only the corpses of the final generation, but also all of our potential descendants, a number that could reach into the trillions.
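To see how the arithmetic can reach into the trillions, consider a rough illustration (the figures here are assumptions for the sake of example, not Bostrom’s or Andersen’s): if Earth carries on the order of 10 billion people per century, then over the next 10,000 years alone,

$$10^{10} \ \text{lives per century} \times 100 \ \text{centuries} = 10^{12} \ \text{lives},$$

a trillion people – and that is before counting any future beyond that horizon.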

Read: Omens by Ross Andersen

