03 January 2010

Existential Risks ~ Avoiding Catastrophic Ends...

Extraordinary longevity may indeed be in our future, but it will be all for naught if we fail to deal with Existential Risks. Such risks are both global, affecting all of humanity, and terminal, destroying or irreversibly crippling us. Transhumanist philosopher Nick Bostrom spells it out...
"Where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential"
...and offers us this Scope vs Intensity Grid as a way of thinking about the possibility-space. Bostrom has perhaps done the most to bring this subject to widespread attention via his 2002 paper in the Journal of Evolution and Technology, Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards...
"Existential risks are distinct from global endurable risks. Examples of the latter kind include: threats to the biodiversity of Earth’s ecosphere, moderate global warming, global economic recessions (even major ones), and possibly stifling cultural or religious eras such as the “dark ages”, even if they encompass the whole global community, provided they are transitory (though see the section on “Shrieks” below). To say that a particular global risk is endurable is evidently not to say that it is acceptable or not very serious. A world war fought with conventional weapons or a Nazi-style Reich lasting for a decade would be extremely horrible events even though they would fall under the rubric of endurable global risks since humanity could eventually recover. (On the other hand, they could be a local terminal risk for many individuals and for persecuted ethnic groups.)"
The bottom line is that a variety of risk-reduction and risk-survival activities are worth undertaking ASAP, including a concerted effort to accelerate humanity's movement Beyond Our Cradle!

1 comment:

Richard B said...

Joost-

Thanks for the very clear explanation. I agree with you that the historical perspective is so important. Of special interest to me is Bostrom's, and Rees', comment that X risks can be interdependent and that "totalitarianism is itself an existential risk" (Bostrom). My (amateur) questions are at www.sustainablerights.blogspot.com and I would sincerely appreciate your thoughts on improving, substantiating or clarifying those ideas.