{{Short description|Hypothetical global-scale disaster risk}}
{{Redirect-multi|2|Existential threat|Doomsday scenario|other uses|Doomsday (disambiguation)}}
{{distinguish|Global Catastrophic Risks (book)}}
{{broader|Human extinction}}
{{Use American English|date=August 2021}}
{{Use mdy dates|date=October 2021}}
]. An asteroid caused the ].<ref name="Schulte10">{{Cite journal |last1=Schulte |first1=P. |last2=Alegret |first2=L. |last3=Arenillas |first3=I. |last4=Arz |first4=J. A. |last5=Barton |first5=P. J. |last6=Bown |first6=P. R. |last7=Bralower |first7=T. J. |last8=Christeson |first8=G. L. |last9=Claeys |first9=P. |last10=Cockell |first10=C. S. |last11=Collins |first11=G. S. |display-authors=1 |date=March 5, 2010 |title=The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary |url=http://doc.rero.ch/record/210367/files/PAL_E4389.pdf |journal=] |volume=327 |issue=5970 |pages=1214–1218 |bibcode=2010Sci...327.1214S |doi=10.1126/science.1177265 |pmid=20203042 |first30=E. |last30=Pierazzo |first29=R. D. |last29=Norris |first28=D. J. |last28=Nichols |first27=C. R. |last27=Neal |first26=J. V. |last26=Morgan |first25=A. |last25=Montanari |first24=J. |last24=Melosh |first23=T. |last23=Matsui |first22=K. G. |last22=MacLeod |first21=D. A. |last21=Kring |first20=C. |last20=Koeberl |first19=W. |last19=Kiessling |first18=K. R. |last18=Johnson |first17=S. P. S. |last17=Gulick |first16=R. A. F. |last16=Grieve |first15=J. M. |last15=Grajales-Nishimura |first14=K. |last14=Goto |first13=T. J. |last13=Goldin |first12=A. |last12=Deutsch |s2cid=2659741}}</ref>]] | |||
{{Futures studies}} | |||
A '''global catastrophic risk''' or a '''doomsday scenario''' is a hypothetical event that could damage human well-being on a global scale,<ref>{{Cite book |last=Bostrom |first=Nick |url=http://www.global-catastrophic-risks.com/docs/Chap01.pdf |title=Global Catastrophic Risks |date=2008 |publisher=Oxford University Press |page=1 |author-link=Nick Bostrom}}</ref> even endangering or destroying ].<ref name="world">{{Cite journal |vauthors=Ripple WJ, Wolf C, Newsome TM, Galetti M, Alamgir M, Crist E, Mahmoud MI, Laurance WF |date=November 13, 2017 |title=World Scientists' Warning to Humanity: A Second Notice |journal=BioScience |volume=67 |issue=12 |pages=1026–1028 |doi=10.1093/biosci/bix125 |doi-access=free|hdl=11336/71342 |hdl-access=free }}</ref> An event that could cause ] or permanently and drastically curtail humanity's existence or potential is known as an "'''existential risk'''".<ref name="types">{{Cite journal |last=Bostrom |first=Nick |author-link=Nick Bostrom |date=March 2002 |title=Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards |url=http://www.nickbostrom.com/existential/risks.html |journal=] |volume=9}}</ref> | |||
In the 21st century, a number of academic and non-profit organizations have been established to research global catastrophic and existential risks, formulate potential mitigation measures and either advocate for or implement these measures.<ref name=":3">{{Cite web |title=About FHI |url=http://www.fhi.ox.ac.uk/ |access-date=August 12, 2021 |website=]}}</ref><ref name=":4">{{Cite web |title=About us |url=https://www.cser.ac.uk/about-us/ |access-date=August 12, 2021 |website=]}}</ref><ref name=":5" /><ref name=":6" /> | |||
==Definition and classification==
=== Defining global catastrophic risks === | |||
The term global catastrophic risk "lacks a sharp definition", and generally refers (loosely) to a risk that could inflict "serious damage to human well-being on a global scale".<ref name="Bostrom & Cirkovic 2008">{{Cite book |last1=Bostrom |first1=Nick |title=Global Catastrophic Risks |last2=Cirkovic |first2=Milan |date=2008 |publisher=Oxford University Press |isbn=978-0-19-857050-9 |location=Oxford |page=1}}</ref>

Various risks exist for ], but not all risks are equal. Risks can be roughly categorized into six types based on the scope of the risk (Personal, Regional, Global) and the intensity of the risk (Endurable or Terminal). The following table provides some examples.

{| border="1" cellspacing="0" cellpadding="2" class="wikitable"
!colspan="3" align="center"|'''Typology of risk''' <ref name="types" /> | |||
|- | |||
! | |||
!Endurable | |||
!Terminal | |||
|- | |||
|'''Global''' | |||
|Thinning of the ozone layer | |||
|Global nuclear war | |||
|- | |||
|'''Regional''' | |||
|Economic recession | |||
|Genocide | |||
|- | |||
|'''Personal''' | |||
|Theft of car | |||
|Terminal illness | |||
|} | |||
Humanity has suffered large catastrophes before. Some of these have caused serious damage but were only local in scope—e.g. the ] may have resulted in the deaths of a third of Europe's population,<ref>{{Cite book |last=Ziegler |first=Philip |title=The Black Death |date=2012 |publisher=Faber and Faber |isbn=9780571287116 |page=397}}</ref> 10% of the global population at the time.<ref>{{Cite web |last=Muehlhauser |first=Luke |date=March 15, 2017 |title=How big a deal was the Industrial Revolution? |url=https://lukemuehlhauser.com/industrial-revolution/ |access-date=August 3, 2020 |website=lukemuelhauser.com}}</ref> Some were global, but were not as severe—e.g. the ] killed an estimated 3–6% of the world's population.<ref>{{Cite journal |last1=Taubenberger |first1=Jeffery |last2=Morens |first2=David |date=2006 |title=1918 Influenza: the Mother of All Pandemics |journal=Emerging Infectious Diseases |volume=12 |issue=1 |pages=15–22 |doi= 10.3201/eid1201.050979|pmc=3291398 |pmid=16494711}}</ref> Most global catastrophic risks would not be so intense as to kill the majority of life on earth, but even if one did, the ecosystem and humanity would eventually recover (in contrast to ''existential risks''). | |||
Risks in the Global-Terminal category are those where an adverse outcome would either annihilate intelligent life or permanently and drastically curtail its potential. Few hazards reach this threshold, in part because of the sheer size and dispersion of humanity: even in the event of a catastrophic nuclear war, a total collapse of the ice sheets or ocean currents, or a devastating epidemic, it is extremely likely that some people would survive, though conventional civilization might collapse.
Similarly, in '']'', ] singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional" scale. Posner highlights such events as worthy of special attention on ] grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.<ref>{{Cite book |last=Posner |first=Richard A. |title=Catastrophe: Risk and Response |date=2006 |publisher=Oxford University Press |isbn=978-0195306477 |location=Oxford}} Introduction, "What is Catastrophe?"</ref> | |||
=== Defining existential risks === | |||
Existential risks are defined as "risks that threaten the destruction of humanity's long-term potential."<ref name="Ord 2020">{{Cite book |last=Ord |first=Toby |title=] |date=2020 |publisher=Hachette |isbn=9780316484916 |location=New York |quote=This is an equivalent, though crisper statement of ]'s definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy. 4:15-31.}}</ref> The instantiation of an existential risk (an ''existential catastrophe<ref name="Cotton-Barratt 2015">{{Citation |last1=Cotton-Barratt |first1=Owen |title=Existential risk and existential hope: Definitions |url=http://www.fhi.ox.ac.uk/Existential-risk-and-existential-hope.pdf |pages=1–4 |year=2015 |series=Future of Humanity Institute – Technical Report #2015-1 |last2=Ord |first2=Toby}}</ref>'') would either cause outright human extinction or irreversibly lock in a drastically inferior state of affairs.<ref name="priority" /><ref name="waste">{{Cite journal |last=Bostrom |first=Nick |year=2009 |title=Astronomical Waste: The opportunity cost of delayed technological development |url=http://www.nickbostrom.com/astronomical/waste.html |journal=Utilitas |volume=15 |issue=3 |pages=308–314 |citeseerx=10.1.1.429.2849 |doi=10.1017/s0953820800004076 |s2cid=15860897}}</ref> Existential risks are a sub-class of global catastrophic risks, where the damage is not only ''global'' but also ''terminal'' and ''permanent,'' preventing recovery and thereby affecting both current and all future generations.<ref name="priority" /> | |||
==== Non-extinction risks ==== | |||
While extinction is the most obvious way in which humanity's long-term potential could be destroyed, there are others, including ''unrecoverable'' ''collapse'' and ''unrecoverable'' ''dystopia''.<ref name="Ord 20202">{{Cite book |last=Ord |first=Toby |title=The Precipice: Existential Risk and the Future of Humanity |date=2020 |publisher=Hachette |isbn=9780316484916 |location=New York}}</ref> A disaster severe enough to cause the permanent, irreversible collapse of human civilisation would constitute an existential catastrophe, even if it fell short of extinction.<ref name="Ord 20202" /> Similarly, if humanity fell under a totalitarian regime, and there were no chance of recovery then such a dystopia would also be an existential catastrophe.<ref name=":7">Bryan Caplan (2008). "". ''Global Catastrophic Risks'', eds. Bostrom & Cirkovic (Oxford University Press): 504–519. {{ISBN|9780198570509}}</ref> ] writes that "perhaps an eternity of totalitarianism would be worse than extinction".<ref name=":7" /> (]'s novel '']'' suggests<ref name="DG17">{{cite news |last=Glover |first=Dennis |date=2017-06-01 |title=Did George Orwell secretly rewrite the end of Nineteen Eighty-Four as he lay dying? |quote= Winston's creator, George Orwell, believed that freedom would eventually defeat the truth-twisting totalitarianism portrayed in Nineteen Eighty-Four. |url= https://www.smh.com.au/entertainment/books/did-george-orwell-secretly-rewrite-the-end-of-nineteen-eightyfour-as-he-lay-dying-20170613-gwqbom.html |work= ] |access-date= 2021-11-21}}</ref> an example.<ref>{{Cite book |last=Orwell |first=George |url=http://catalogue.bl.uk/ |title=Nineteen Eighty-Four. A novel |publisher=Secker & Warburg |year=1949 |location=London |access-date=August 12, 2021 |archive-date=May 4, 2012 |archive-url=https://web.archive.org/web/20120504165426/http://catalogue.bl.uk/ |url-status=dead }}</ref>) A dystopian scenario shares the key features of extinction and unrecoverable collapse of civilization: before the catastrophe humanity faced a vast range of bright futures to choose from; after the catastrophe, humanity is locked forever in a terrible state.<ref name="Ord 20202" /> | |||
Psychologist ] has called existential risk a "useless category" that can distract from threats he considers real and solvable, such as climate change and nuclear war.<ref name=":0" /> | |||
== Potential sources of risk == | |||
{{main|Global catastrophe scenarios}} | |||
Potential global catastrophic risks are conventionally classified as anthropogenic or non-anthropogenic hazards. Examples of non-anthropogenic risks are an asteroid or comet ], a ] ], a natural ], a ], a ] from a ] destroying electronic equipment, natural long-term ], hostile ], or the ] transforming into a ] and engulfing the Earth ].<ref>{{cite journal |last1=Baum |first1=Seth D. |title=Assessing natural global catastrophic risks |journal=Natural Hazards |date=2023 |volume=115 |issue=3 |pages=2699–2719 |doi=10.1007/s11069-022-05660-w |pmid=36245947 |pmc=9553633 |doi-access=free|bibcode=2023NatHa.115.2699B }}</ref> | |||
Anthropogenic risks are those caused by humans and include those related to technology, governance, and climate change. Technological risks include the creation of ] with human goals, ], and ]. Insufficient or malign ] creates risks in the social and political domain, such as ] and ],<ref>{{cite journal |last1=Scouras |first1=James |title=Nuclear War as a Global Catastrophic Risk |journal=Journal of Benefit-Cost Analysis |date=2019 |volume=10 |issue=2 |pages=274–295 |doi=10.1017/bca.2019.16 |doi-access=free}}</ref> ] and ] using ]s, ] and ] destroying ] like the ], or ] using weapons such as large ]s. Other global catastrophic risks include climate change, ], ], ] as a result of ] resource distribution, ] or ], ], and non-].

Many scenarios that could happen in the future have been suggested. Some are virtually certain to occur and would end humanity, but only on an extremely long timescale; others are more likely to occur on a shorter timescale but would probably not destroy civilization completely; still others are extremely unlikely, and may even be impossible. For example, Nick Bostrom writes:<ref name="unlikely">Nick Bostrom, section 4.7.</ref>

:''Some foreseen hazards (hence not members of the current category) which have been excluded from the list of bangs on grounds that they seem too unlikely to cause a global terminal disaster are: solar flares, supernovae, black hole explosions or mergers, gamma-ray bursts, galactic center outbursts, supervolcanoes, buildup of air pollution, gradual loss of human fertility, and various religious doomsday scenarios.''

=== Space ===

Some events in space are certain to bring ] to an end, but only on an extremely long timescale measured in billions of years. Projections indicate that the ] is on a collision course with the Milky Way, approaching at an average speed of about 140 kilometres (87 miles) per second, with the collision predicted in about 3 billion years; the two galaxies will probably merge to form a giant elliptical. This merger could eject the Solar System into a more eccentric orbit{{Fact|date=February 2007}} or an unfavorable position in the merged galaxy, causing our planet to become uninhabitable (an actual collision is unnecessary). In about 5 billion years, ] predicts our ] will exhaust its core hydrogen and become a ], becoming thousands of times more luminous in the process.<ref></ref> Even in its current phase of stellar evolution, the Sun's luminosity is slowly increasing, and many scientists predict that in less than one billion years the runaway ] will make Earth unsuitable for life.

On an even longer timescale, the ] will occur. The ] is currently estimated to be 13.8 billion years old. There are several competing theories as to the nature of our ] and how it will end, but in all cases no life will be possible. These scenarios take place on a considerably longer timescale than the expansion of the Sun.

==== Meteorite impact ====
In the ], it is widely accepted that several large ]s have hit ] in the past. The ], for example, is theorized to have caused the extinction of the ]s. If such an object struck the Earth, it could have a serious impact on civilization. It is even possible that humanity would be destroyed entirely: for that, the asteroid would need to be at least 1 km (0.6 miles) in diameter, and more likely 3–10 km (2–6 miles).<ref name="meteor">Nick Bostrom, section 4.10</ref> Asteroids about 1 km in diameter strike the Earth every 0.5 million years on average,<ref name="meteor" /> and larger asteroids are less common. The last large (>10 km) impact happened ]. So-called ]s are regularly observed.

Some scientists believe there are patterns in the rate of meteorite impacts on Earth. One proposed explanation of such a pattern involves the hypothetical star ]: the hypothesis holds that this star periodically passes through a denser part of the ], causing showers of impactors to strike the Earth. However, the very existence of this pattern is not widely accepted, and the existence of the Nemesis star itself is highly contested.

One stellar passage that could increase the number of impacts is the approach of the star ]. This star is probably moving on a collision course with the ] and will likely pass within about 1.1 ]s of the Sun in roughly 1.4 million years. Some models predict that this passage will send large numbers of comets from the ] toward the Earth,<ref>http://www.exitmundi.nl/Gliese710.htm</ref> while other models, such as that of García-Sánchez, predict an increase of only 5%.

==== Less likely cosmic threats ====

A number of other scenarios have been suggested. Massive objects such as a ], a large ], or a ] could be catastrophic if a close encounter occurred in the Solar System. Another threat might come from ]; some scientists believe this may have caused a mass extinction 450 million years ago.<ref name="gammaray">, ].</ref> Both are very unlikely.<ref name="unlikely" /> Still others see ] as a possible threat to mankind;<ref name="aliens">, Discover Magazine</ref> although alien life has never been found, scientists such as ] have postulated that the existence of extraterrestrial life is very likely. In 1969, the "]" was added to the Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the US Apollo Space Program; it was removed in 1991.<ref></ref> Scientists consider such a scenario technically possible, but unlikely.<ref name="aliensunlikely">Nick Bostrom, section 7.2.</ref>
== Methodological challenges == | |||
Research into the nature and mitigation of global catastrophic risks and existential risks is subject to a unique set of challenges and, as a result, is not easily subjected to the usual standards of scientific rigour.<ref name="Ord 20202" /> For instance, it is neither feasible nor ethical to study these risks experimentally. ] expressed this with regards to nuclear war: "Understanding the long-term consequences of nuclear war is not a problem amenable to experimental verification".<ref name="Sagan 1983">{{Cite magazine |last=Sagan |first=Carl |date=Winter 1983 |title=Nuclear War and Climatic Catastrophe: Some Policy Implications |url=https://www.foreignaffairs.com/articles/1983-12-01/nuclear-war-and-climatic-catastrophe-some-policy-implications |magazine=Foreign Affairs |publisher=Council on Foreign Relations |doi=10.2307/20041818 |jstor=20041818 |access-date=August 4, 2020}}</ref> Moreover, many catastrophic risks change rapidly as technology advances and background conditions, such as geopolitical conditions, change. Another challenge is the general difficulty of accurately predicting the future over long timescales, especially for anthropogenic risks which depend on complex human political, economic and social systems.<ref name="Ord 20202" /> In addition to known and tangible risks, unforeseeable ] extinction events may occur, presenting an additional methodological problem.<ref name="Ord 20202" /><ref>{{Cite journal |last=Jebari |first=Karim |date=2014 |title=Existential Risks: Exploring a Robust Risk Reduction Strategy |url=https://philpapers.org/archive/JEBERE.pdf |journal=Science and Engineering Ethics |volume=21 |issue=3 |pages=541–54 |doi=10.1007/s11948-014-9559-3 |pmid=24891130 |access-date=August 26, 2018 |s2cid=30387504}}</ref> | |||
=== Lack of historical precedent === | |||
Humanity has never suffered an existential catastrophe and if one were to occur, it would necessarily be unprecedented.<ref name="Ord 20202" /> Therefore, existential risks pose unique challenges to prediction, even more than other long-term events, because of ].<ref name=":8">{{Cite journal |last1=Cirkovic |first1=Milan M. |author-link=Milan M. Ćirković |last2=Bostrom |first2=Nick |author-link2=Nick Bostrom |last3=Sandberg |first3=Anders |author-link3=Anders Sandberg |date=2010 |title=Anthropic Shadow: Observation Selection Effects and Human Extinction Risks |url=https://www.nickbostrom.com/papers/anthropicshadow.pdf |journal=Risk Analysis |volume=30 |issue=10 |pages=1495–1506 |doi=10.1111/j.1539-6924.2010.01460.x |pmid=20626690|bibcode=2010RiskA..30.1495C |s2cid=6485564 }}</ref> Unlike with most events, the failure of a complete ] to occur in the past is not evidence against their likelihood in the future, because every world that has experienced such an extinction event has gone unobserved by humanity. Regardless of civilization collapsing events' frequency, no civilization observes existential risks in its history.<ref name=":8" /> These ] issues may partly be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or directly evaluating the likely impact of new technology.<ref name="priority" /> | |||
To understand the dynamics of an unprecedented, unrecoverable global civilizational collapse (a type of existential risk), it may be instructive to study the various local ] that have occurred throughout human history.<ref>{{Cite web |last=Kemp |first=Luke |date=February 2019 |title=Are we on the road to civilization collapse? |url=https://www.bbc.com/future/article/20190218-are-we-on-the-road-to-civilisation-collapse |access-date=August 12, 2021 |website=]}}</ref> For instance, civilizations such as the ] have ended in a loss of centralized governance and a major civilization-wide loss of infrastructure and advanced technology. However, these examples demonstrate that societies appear to be fairly resilient to catastrophe; for example, Medieval Europe survived the ] without suffering anything resembling a ] despite losing 25 to 50 percent of its population.<ref>{{Cite book |last=Ord |first=Toby |title=The Precipice: Existential Risk and the Future of Humanity |date=2020 |publisher=Hachette Books |isbn=9780316484893 |quote=Europe survived losing 25 to 50 percent of its population in the Black Death, while keeping civilization firmly intact |author-link=Toby Ord}}</ref> | |||
=== Incentives and coordination === | |||
There are economic reasons that can explain why so little effort is going into global catastrophic risk reduction. First, such catastrophes are speculative and may never happen, so many people focus on other, more pressing issues. Risk reduction is also a ], so we should expect it to be undersupplied by markets.<ref name="priority" /> Even if a large nation invested in risk mitigation measures, that nation would enjoy only a small fraction of the benefit of doing so. Furthermore, global catastrophic risk reduction can be thought of as an ''intergenerational'' global public good: most of its hypothetical benefits would be enjoyed by future generations, and though these future people would perhaps be willing to pay substantial sums for risk reduction, no mechanism for such a transaction exists.<ref name="priority" />
=== Cognitive biases === | ||
Numerous ] can influence people's judgment of the importance of existential risks, including ], ], ], the ], the ], and the ].<ref name="Yudkowsky 2008">{{Cite journal |last=Yudkowsky |first=Eliezer |date=2008 |title=Cognitive Biases Potentially Affecting Judgment of Global Risks |url=https://intelligence.org/files/CognitiveBiases.pdf |journal=Global Catastrophic Risks |pages=91–119 |bibcode=2008gcr..book...86Y}}</ref> | |||
Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as willing to prevent the deaths of 200,000 or 2,000 birds.<ref>Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. 1993, Measuring natural resource damages with contingent valuation: tests of validity and reliability. In Hausman, J.A. (ed), ''Contingent Valuation:A Critical Assessment'', pp. 91–159 (Amsterdam: North Holland).</ref> Similarly, people are often more concerned about threats to individuals than to larger groups.<ref name="Yudkowsky 2008" /> | |||
Many ]s have occurred in the history of the Earth, and more ice ages will almost certainly come, at intervals of roughly 40,000–100,000 years. An ice age would have a serious impact on civilization as we know it today, because vast areas of land (mainly in ], ], and ]) could become uninhabitable. Living in the tropical regions would still be possible, but with a likely loss of humidity and fresh water. Currently, humanity exists in a warm interglacial period (the last ice age ended c. 10,000 years ago), and all civilization, save a few hunter-gatherer populations, has come into existence within that time.
] theorizes that ] plays a role in public perception of existential risks:{{sfn|Bostrom|2013}}<ref>Yudkowsky, Eliezer. "". Global catastrophic risks 1 (2008): 86. p.114</ref> | |||
A less predictable scenario is a global ]. For example, if ] mutated and became as transmissible as the ], the consequences would be disastrous, but probably not fatal to the human species,<ref name="pandemic">Nick Bostrom, section 4.9.</ref> as some people are immune to HIV.<ref>http://www.pbs.org/wgbh/evolution/library/10/4/l_104_06.html</ref> This particular scenario would also contradict the observable tendency for pathogens to become less lethal over time as a function of ]: a pathogen that quickly kills its hosts is unlikely to have enough time to spread to new ones, while one that kills its hosts more slowly or not at all allows carriers more time to spread the infection, and is thus likely to outcompete a more lethal species or strain. A real-life example of this process can be found in the historical evolution of ]; however, this idea is debated (see ]). A pandemic resulting in human extinction also need not arise naturally; the possibility of one caused by a deliberately engineered pathogen cannot be ruled out.
<blockquote>Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".</blockquote> | |||
Another possibility is the ]. A megatsunami could, for example, destroy the entire east coast of the ] (see ]). The coastal areas of the entire world could be flooded in case of the collapse of the ].<ref name="wais"></ref> While none of these scenarios could possibly destroy humanity completely, they could regionally threaten civilization as we know it. | |||
All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. ] argues that the absence of human extinction in the past is weak evidence that there will be no human extinction in the future, due to ] and other ].<ref>{{Cite web |date=March 6, 2012 |title=We're Underestimating the Risk of Human Extinction |url=https://www.theatlantic.com/technology/archive/2012/03/were-underestimating-the-risk-of-human-extinction/253821/ |access-date=July 1, 2016 |publisher=The Atlantic}}</ref> | |||
However, a more likely scenario is an ], such as worldwide crop failure and a collapse of ] induced by the present trends of ] and non-]. Most of these scenarios involve one or more of the following: ], ] that could leave approximately half of the Earth's population without safe ], ], ], massive ], ] or massive ] episode. One suggested threat in this direction is ], a phenomenon that some feared might foreshadow the extinction of the ]. As the bee plays a vital role in ], its extinction would severely disrupt the ].
] ] argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the ] Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."<ref>Is Humanity Suicidal? ''The New York Times Magazine'' May 30, 1993)</ref> | |||
The 20th century saw a rapid increase in ] due to ] and a massive increase in agricultural productivity<ref></ref> brought about by the ].<ref></ref> Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%, helping food production keep pace with worldwide ]. The energy for the Green Revolution was provided by fossil fuels in the form of ] (natural gas), ] (oil), and ]-fueled ].<ref></ref> In their study ''Food, Land, Population and the U.S. Economy'', David Pimentel, professor of ecology and ] at ], and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place the maximum ] for a ] at 200 million. The study argues that to achieve a sustainable economy and avert ], the ] must reduce its population by at least one-third, and ] will have to be reduced by two-thirds.<ref></ref>

The authors of this study believe that the agricultural crisis in question will begin to be felt only after 2020 and will not become critical until 2050. The approaching peak of global ] production (and the subsequent decline in production), along with the peak of North American ] production, could precipitate such an agricultural crisis much sooner than expected. Geologist ] claims that the coming decades could see spiraling ] prices without relief and massive ] on a global scale such as has never been experienced before.<ref></ref><ref></ref>

An abrupt ] could cause a new ].<ref>{{cite web | url=http://www.nytimes.com/2006/10/12/science/earth/12extinct.html?ex=1176609600&en=9fc4a5a53674ca70&ei=5087&excamp=mkt_at8 | title=Study Links Extinction Cycles to Changes in Earth’s Orbit and Tilt | publication=] | date=2006-10-12 | author=Wilford, John Noble}}</ref>

When the ] at ] last erupted, 600,000 years ago, the ash fall covered much of North America west of the Mississippi River. Another such eruption could threaten civilization: it could release large amounts of gases that alter the balance of the planet's carbon dioxide and cause a runaway greenhouse effect, or throw enough pyroclastic debris and other material into the atmosphere to partially block out the sun and cause a natural ], similar to 1816, the ].
==Proposed mitigation== | |||
===Multi-layer defense=== | |||
] is a useful framework for categorizing risk mitigation measures into three layers of defense:<ref name=":2">{{Cite journal |last1=Cotton-Barratt |first1=Owen |last2=Daniel |first2=Max |last3=Sandberg |first3=Anders |date=2020 |title=Defence in Depth Against Human Extinction: Prevention, Response, Resilience, and Why They All Matter |journal=Global Policy |volume=11 |issue=3 |pages=271–282 |doi=10.1111/1758-5899.12786 |issn=1758-5899 |pmc=7228299 |pmid=32427180}}</ref> | |||
# ''Prevention'': Reducing the probability of a catastrophe occurring in the first place. Example: Measures to prevent outbreaks of new highly infectious diseases. | |||
# ''Response'': Preventing the scaling of a catastrophe to the global level. Example: Measures to prevent escalation of a small-scale nuclear exchange into an all-out nuclear war. | |||
# ''Resilience'': Increasing humanity's resilience (against extinction) when faced with global catastrophes. Example: Measures to increase food security during a nuclear winter. | |||
Human extinction is most likely when all three defenses are weak, that is, "by risks we are unlikely to prevent, unlikely to successfully respond to, and unlikely to be resilient against".<ref name=":2" /> | |||
The unprecedented nature of existential risks poses a special challenge in designing risk mitigation measures since humanity will not be able to learn from a track record of previous events.<ref name="Ord 20202" />

===Humanity===
Some threats to humanity come from humanity itself. The scenario that has been explored most is a ] or another ] with similar destructive potential. It is difficult to predict whether such a war would exterminate humanity, but it would almost certainly alter civilization as we know it, in particular if there were a ] event.<ref name="nuclearwinter">Nick Bostrom, section 4.2.</ref> Another category of disasters is ]s.

It has been suggested that ] could take unforeseen actions or that ] could out-compete humanity.<ref name="billjoy">], . In:] magazine. See also ].</ref> ] could lead to the creation of a ], while ] could lead to ]; in both cases, either deliberately or by accident.<ref name="drexler">], ], ISBN 0-385-19973-2, </ref>

It has also been suggested that physical scientists might accidentally create a device that could destroy the Earth and the Solar System.<ref name="physicsaccident">Nick Bostrom, section 4.8</ref> In ], there are some unknown variables; if these turn out to have unfortunate values, the universe may not be stable and could change completely, destroying everything in it,<ref name="quantumtunneling">Malcolm Perry, ''Quantum Tunneling towards an exploding Universe?'' in: ], ] ]. .</ref> either at random or through an accidental experiment. This is called ] by some.<ref name="qvc"></ref> Another kind of accident is the '']'', in which our planet, including everything on it, becomes a strange matter planet in a chain reaction. Some do not view this as a credible scenario.<ref name="ice9">Frank Wilczek, in an e-mail, .</ref>
===Funding=== | |||
Some researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on '']'', ], or ] than on existential risks. Bostrom's comparisons have been criticized as "high-handed".<ref name=":0">{{Cite news |last=Kupferschmidt |first=Kai |date=January 11, 2018 |title=Could science destroy the world? These scholars want to save us from a modern-day Frankenstein |work=Science |publisher=AAAS |url=https://www.science.org/content/article/could-science-destroy-world-these-scholars-want-save-us-modern-day-frankenstein |access-date=April 20, 2020}}</ref><ref>{{Cite news |date=2013 |title=Oxford Institute Forecasts The Possible Doom Of Humanity |work=Popular Science |url=https://www.popsci.com/science/article/2013-04/what-greatest-threat-our-species-continued-existence/ |access-date=April 20, 2020}}</ref> As of 2020, the ] organization had an annual budget of US$1.4 million.<ref>{{Cite book |last=Toby Ord |title=The precipice: Existential risk and the future of humanity |date=2020 |publisher=Hachette Books |isbn=9780316484893 |quote=The international body responsible for the continued prohibition of bioweapons (the Biological Weapons Convention) has an annual budget of $1.4 million - less than the average McDonald's restaurant |author-link=Toby Ord}}</ref> | |||
===Survival planning=== | |||
It has been suggested that runaway ] might cause the climate on Earth to become like ], which would make it uninhabitable. In less extreme scenarios it could cause the end of civilization as we know it.<ref name="runaway">Isaac M. Held, Brian J. Soden, ''Water Vapor Feedback and Global Warming'', In: Annu. Rev. Energy Environ 2000. . Page 449.</ref> | |||
Some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving a global disaster.<ref name="matheny"/><ref name="wells1"/><ref name="wells2"/> Economist ] argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.<ref name="matheny"/><ref>Hanson, Robin. "". Global catastrophic risks 1 (2008): 357.</ref> | |||
], creator of the ], suggested in his book '']'' (2006) that the elimination of ]s and the falling planetary ] are removing the ] ] mechanisms that maintain climate stability by reducing the effects of ] emissions (particularly ]). With the heating of the oceans, the extension of the ] layer into ] and ] waters is preventing the overturning and nutrient enrichment necessary for ] of ], on which the ]s of these areas depend. With the loss of phytoplankton and tropical rain forests, two of the main ]s for reducing ], he suggests that a runaway ] effect could cause tropical ]s to cover most of the world's tropical regions and the polar ice caps to disappear, posing a serious challenge to global civilization.
] has been proposed globally, but the monetary cost would be high. Furthermore, it would likely contribute to the current millions of deaths per year due to ].<ref>{{Cite book |last=Smil |first=Vaclav |url=https://books.google.com/books?id=8ntHWPMUgpMC&pg=PA25 |title=The Earth's Biosphere: Evolution, Dynamics, and Change |publisher=] |year=2003 |isbn=978-0-262-69298-4 |page=25 |author-link=Vaclav Smil}}</ref> In 2022, a team led by David Denkenberger modeled the cost-effectiveness of resilient foods to ] and found "~98-99% confidence" for a higher marginal impact of work on resilient foods.<ref>{{Cite journal |last1=Denkenberger |first1=David C. |last2=Sandberg |first2=Anders |last3=Tieman |first3=Ross John |last4=Pearce |first4=Joshua M. |date=2022 |title=Long term cost-effectiveness of resilient foods for global catastrophes compared to artificial general intelligence safety |url=https://www.sciencedirect.com/science/article/pii/S2212420922000176 |journal=International Journal of Disaster Risk Reduction |volume=73 |pages=102798 |doi=10.1016/j.ijdrr.2022.102798 |bibcode=2022IJDRR..7302798D }}</ref> Some ]s stock ] with multiple-year food supplies. | |||
Using ], the ] (GSG), a coalition of international scientists convened by ], developed a series of possible futures for the world as it enters a ]. One scenario involves the complete breakdown of civilization as the effects of ] become more pronounced, competition for scarce resources increases, and the rift between the poor and the wealthy widens. The GSG’s other scenarios, such as ], ], and ] avoid this societal collapse and eventually result in environmental and social ]. They claim the outcome is dependent on ]<ref>. ]. 2006. Boston: </ref> and the possible formation of a ] which could influence the trajectory of global development.<ref> Orion Kriegman. 2006. Boston:</ref> | |||
The ] is buried {{convert|400|ft|m}} inside a mountain on an island in the ]. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world's crops. The surrounding rock is {{convert|−6|°C|°F}} (as of 2015) but the vault is kept at {{convert|−18|°C|°F}} by refrigerators powered by locally sourced coal.<ref>{{Cite news |last=Lewis Smith |date=February 27, 2008 |title=Doomsday vault for world's seeds is opened under Arctic mountain |work=The Times Online |location=London |url=http://www.timesonline.co.uk/tol/news/environment/article3441435.ece |url-status=dead |archive-url=https://web.archive.org/web/20080512083814/http://www.timesonline.co.uk/tol/news/environment/article3441435.ece |archive-date=May 12, 2008}}</ref><ref>{{Cite web |last=Suzanne Goldenberg |date=May 20, 2015 |title=The doomsday vault: the seeds that could save a post-apocalyptic world |url=https://www.theguardian.com/science/2015/may/20/the-doomsday-vault-seeds-save-post-apocalyptic-world |access-date=June 30, 2017 |website=]}}</ref> | |||
More speculatively, if society continues to function and if the ] remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.<ref>{{Cite news |date=July 8, 2016 |title=Here's how the world could end—and what we can do about it |work=Science |publisher=AAAS |url=https://www.science.org/content/article/here-s-how-world-could-end-and-what-we-can-do-about-it-rev2 |access-date=March 23, 2018}}</ref><ref>{{Cite journal |last1=Denkenberger |first1=David C. |last2=Pearce |first2=Joshua M. |date=September 2015 |title=Feeding everyone: Solving the food crisis in event of global catastrophes that kill crops or obscure the sun |url=https://hal.archives-ouvertes.fr/hal-02113583/file/Feeding_Everyone_Solving_the_Food_Crisis.pdf |journal=Futures |volume=72 |pages=57–68 |doi=10.1016/j.futures.2014.11.008|s2cid=153917693 }}</ref>

Other scenarios that have been named are:
; ] : Natural selection would create super bacteria that are resistant to antibiotics, devastating the world population and causing a global collapse of civilization.{{Fact|date=February 2007}} | |||
; ] : Demographic trends create a "baby bust" that threatens the order of civilization.<ref name="babybust">] in ] magazine.</ref> | |||
; ] : A full scale ] could kill billions, and the resulting ] would effectively crush any form of civilization. | |||
; ] : The theory of ] suggests that the average individual in a civilization may eventually become weaker, because the most intelligent reproduce least leaving the population less able to perform higher functions. | |||
; ] : Markets fail worldwide, resulting in economic collapse: mass unemployment, rioting, famine, death, and cannibalism.{{Fact|date=February 2007}}
; ] : World population may increase to such an extent in the future that it would lead to lack of space for habitation, except on the Moon and other planets.{{Fact|date=February 2007}} | |||
; ] : Oil becomes scarce before an economically viable replacement is devised, leading to global chaos and discomfort.<ref name="peakoil">] , in ]</ref> | |||
; ] : In the search for new quantum particles, scientists accidentally destroy the universe (or at least the Earth). This, however, is highly unlikely, as far more powerful events occur in nature.{{Fact|date=February 2007}}
; ] : Some researchers theorize a tiny loss of telomere length from one generation to the next, mirroring the process of aging in individuals. Over thousands of generations the telomere erodes down to its critical level. Once at the critical level we would expect to see outbreaks of age-related diseases occurring earlier in life and finally a population crash;<ref name="telomere">"What a way to go", ''The Guardian'' (April 14, 2005). See External links.</ref> however, this possibility may not result in extinction due to the self-reinforcing effects of ]. | |||
===Global catastrophic risks and global governance=== | |||
Insufficient ] creates risks in the social and political domain, but the governance mechanisms develop more slowly than technological and social change. There are concerns from governments, the private sector, as well as the general public about the lack of governance mechanisms to efficiently deal with risks, negotiate and adjudicate between diverse and conflicting interests. This is further underlined by an understanding of the interconnectedness of global systemic risks.<ref>{{Cite web |title=Global Challenges Foundation {{!}} Understanding Global Systemic Risk |url=https://globalchallenges.org/en/our-work/quarterly-reports/resetting-the-frame/understanding-global-systemic-risk |url-status=dead |archive-url=https://web.archive.org/web/20170816005747/https://globalchallenges.org/en/our-work/quarterly-reports/resetting-the-frame/understanding-global-systemic-risk |archive-date=August 16, 2017 |access-date=August 15, 2017 |website=globalchallenges.org}}</ref> In absence or anticipation of global governance, national governments can act individually to better understand, mitigate and prepare for global catastrophes.<ref>{{Cite web |title=Global Catastrophic Risk Policy |url=https://www.gcrpolicy.com/ |access-date=August 11, 2019 |website=gcrpolicy.com |archive-date=August 11, 2019 |archive-url=https://web.archive.org/web/20190811233004/https://www.gcrpolicy.com/ |url-status=dead }}</ref> | |||
===Climate emergency plans=== | |||
In 2018, the ] called for greater climate change action and published its Climate Emergency Plan, which proposes ten action points to limit global average temperature increase to 1.5 degrees Celsius.<ref>{{Cite web |last=] |year=2018 |title=The Climate Emergency Plan |url=https://clubofrome.org/publication/the-climate-emergency-plan/ |access-date=August 17, 2020}}</ref> Further, in 2019, the Club published the more comprehensive Planetary Emergency Plan.<ref>{{Cite web |last=] |year=2019 |title=The Planetary Emergency Plan |url=https://clubofrome.org/publication/the-planetary-emergency-plan/ |access-date=August 17, 2020}}</ref> | |||
There is evidence to suggest that collectively engaging with the emotional experiences that emerge while contemplating the vulnerability of the human species in the context of climate change allows these experiences to be adaptive. When collective engagement with and processing of emotional experiences is supportive, this can lead to growth in resilience, psychological flexibility, tolerance of emotional experiences, and community engagement.<ref>{{Cite journal |last1=Kieft |first1=J. |last2=Bendell |first2=J |year=2021 |title=The responsibility of communicating difficult truths about climate influenced societal disruption and collapse: an introduction to psychological research |url=https://insight.cumbria.ac.uk/id/eprint/5950 |journal=Institute for Leadership and Sustainability (IFLAS) Occasional Papers |volume=7 |pages=1–39}}</ref>
===Space colonization=== | |||
{{main|Space and survival}} | |||
] is a proposed alternative to improve the odds of surviving an extinction scenario.<ref name="physorg20100809">{{Citation |title=Mankind must abandon earth or face extinction: Hawking |date=August 9, 2010 |url=http://www.physorg.com/news200591777.html |work=physorg.com |access-date=January 23, 2012}}</ref> Solutions of this scope may require ]. | |||
Astrophysicist ] advocated colonizing other planets within the Solar System once technology progresses sufficiently, in order to improve the ] from planet-wide events such as global thermonuclear war.<ref>{{Cite web |last=Malik |first=Tariq |date=April 13, 2013 |title=Stephen Hawking: Humanity Must Colonize Space to Survive |website=] |url=http://www.space.com/20657-stephen-hawking-humanity-survival-space.html |access-date=July 1, 2016}}</ref><ref>{{Cite news |last=Shukman |first=David |date=January 19, 2016 |title=Hawking: Humans at risk of lethal 'own goal' |work=BBC News |url=https://www.bbc.com/news/science-environment-35344664 |access-date=July 1, 2016}}</ref> | |||
Billionaire ] writes that humanity must become a multiplanetary species in order to avoid extinction.<ref>{{Cite web|url=https://www.cnbc.com/2017/06/16/elon-musk-colonize-mars-before-extinction-event-on-earth.html|title=Elon Musk thinks life on earth will go extinct, and is putting most of his fortune toward colonizing Mars|first=Leah|last=Ginsberg|website=CNBC|date=June 16, 2017 }}</ref> His company ] is developing technology he projects will be used to colonize ]. | |||
==Historical futurist scenarios==

Every generation has faced its own fears of an unknown future, and the historical record of predicted end-of-civilization scenarios is plentiful. Some of these include the following.

Many ]al (and ]al) stories from the era of the ] were based on the belief that a ] was inevitable, and that this would result in the destruction of all life on the planet Earth (see ] for a list).

] wrote a prediction that a great catastrophe would occur in the seventh month (July, or, some argue, September, the seventh month of the pre-modern ]) of the year ]. Many followers of his writings took this to mean that the end of the world would occur then. When the chosen date came and went without incident, translators of his works began revising them with new interpretations of what the prediction actually meant; many have since claimed that the prediction referred to September 11, 2001. Some also believe, based on Nostradamus's writings, that the world will end in the year ]; one leading Nostradamus scholar believes that is the year the Sun will explode as a red giant, possibly because of extraterrestrial intervention.{{Fact|date=February 2007}}

The ] was supposed to wreak havoc on ]s; see also ].

] (1642–1727), who was involved in alchemy and other pursuits in addition to science and mathematics, studied old texts and surmised that the end of the world would come in 2060, although he was reluctant to put an exact date on it.<ref>, by Stephen D. Snobelen, University of King’s College, Halifax</ref>

Many mistakenly believe that the ]'s ] ends abruptly on ] (or ]) ]. This misconception is due to the Maya practice of abbreviating their dates to five decimal places; on monuments where the full date is shown, the end of the last creation is said to happen much farther in the future. The Maya did, however, recognize a baktun ending in 2012: a baktun marks the end of a 400-year period and was a significant event on the Maya calendar. In the ], 2012 also marks the end of a 26,000-year planetary cycle, known as the ], which most likely refers to the ].

===Fictional===
:''See main article ].''
==Organizations== | |||
The ] (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the ] established in 1947. The ] (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by ] who postulated "]".<ref>{{Cite web |last=Fred Hapgood |date=November 1986 |title=Nanotechnology: Molecular Machines that Mimic Life |url=http://metamodern.com/b/wp-content/uploads/docs/OMNI_TINYTECH.pdf |access-date=June 5, 2015 |website=] |archive-date=July 27, 2013 |archive-url=https://web.archive.org/web/20130727021409/http://metamodern.com/b/wp-content/uploads/docs/OMNI_TINYTECH.pdf |url-status=dead }}</ref><ref>{{Cite journal |last=Giles |first=Jim |year=2004 |title=Nanotech takes small step towards burying 'grey goo' |journal=Nature |volume=429 |issue=6992 |pages=591 |bibcode=2004Natur.429..591G |doi=10.1038/429591b |pmid=15190320 |doi-access=free}}</ref> | |||
A global catastrophic risk or a doomsday scenario is a hypothetical event that could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's existence or potential is known as an "existential risk".
In the 21st century, a number of academic and non-profit organizations have been established to research global catastrophic and existential risks, formulate potential mitigation measures and either advocate for or implement these measures.
Definition and classification
Defining global catastrophic risks
The term global catastrophic risk "lacks a sharp definition", and generally refers (loosely) to a risk that could inflict "serious damage to human well-being on a global scale".
Humanity has suffered large catastrophes before. Some of these caused serious damage but were only local in scope: the Black Death, for example, may have killed a third of Europe's population, roughly 10% of the global population at the time. Others were global but not as severe: the 1918 influenza pandemic killed an estimated 3–6% of the world's population. Most global catastrophic risks would not be intense enough to kill the majority of life on Earth, and even if one were, the ecosystem and humanity would eventually recover (in contrast to existential risks).
Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a "local or regional" scale. Posner highlights such events as worthy of special attention on cost–benefit grounds because they could directly or indirectly jeopardize the survival of the human race as a whole.
Defining existential risks
Existential risks are defined as "risks that threaten the destruction of humanity's long-term potential." The instantiation of an existential risk (an existential catastrophe) would either cause outright human extinction or irreversibly lock in a drastically inferior state of affairs. Existential risks are a sub-class of global catastrophic risks, where the damage is not only global but also terminal and permanent, preventing recovery and thereby affecting both current and all future generations.
Non-extinction risks
While extinction is the most obvious way in which humanity's long-term potential could be destroyed, there are others, including unrecoverable collapse and unrecoverable dystopia. A disaster severe enough to cause the permanent, irreversible collapse of human civilization would constitute an existential catastrophe, even if it fell short of extinction. Similarly, if humanity fell under a totalitarian regime with no chance of recovery, such a dystopia would also be an existential catastrophe. Bryan Caplan writes that "perhaps an eternity of totalitarianism would be worse than extinction". (George Orwell's novel Nineteen Eighty-Four suggests an example.) A dystopian scenario shares the key features of extinction and unrecoverable collapse of civilization: before the catastrophe, humanity faced a vast range of bright futures to choose from; after it, humanity is locked forever in a terrible state.
Psychologist Steven Pinker has called existential risk a "useless category" that can distract from threats he considers real and solvable, such as climate change and nuclear war.
Potential sources of risk
Main article: Global catastrophe scenarios
Potential global catastrophic risks are conventionally classified as anthropogenic or non-anthropogenic hazards. Examples of non-anthropogenic risks are an asteroid or comet impact event, a supervolcanic eruption, a natural pandemic, a lethal gamma-ray burst, a geomagnetic storm from a coronal mass ejection destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the Sun transforming into a red giant star and engulfing the Earth billions of years in the future.
Anthropogenic risks are those caused by humans and include those related to technology, governance, and climate change. Technological risks include the creation of artificial intelligence misaligned with human goals, biotechnology, and nanotechnology. Insufficient or malign global governance creates risks in the social and political domain, such as global war and nuclear holocaust, biological warfare and bioterrorism using genetically modified organisms, cyberwarfare and cyberterrorism destroying critical infrastructure like the electrical grid, or radiological warfare using weapons such as large cobalt bombs. Other global catastrophic risks include climate change, environmental degradation, extinction of species, famine as a result of non-equitable resource distribution, human overpopulation or underpopulation, crop failures, and non-sustainable agriculture.
Methodological challenges
Research into the nature and mitigation of global catastrophic risks and existential risks is subject to a unique set of challenges and, as a result, is not easily held to the usual standards of scientific rigor. For instance, it is neither feasible nor ethical to study these risks experimentally. Carl Sagan expressed this with regard to nuclear war: "Understanding the long-term consequences of nuclear war is not a problem amenable to experimental verification". Moreover, many catastrophic risks change rapidly as technology advances and as background conditions, such as the geopolitical situation, change. Another challenge is the general difficulty of accurately predicting the future over long timescales, especially for anthropogenic risks, which depend on complex human political, economic and social systems. In addition to known and tangible risks, unforeseeable black swan extinction events may occur, presenting an additional methodological problem.
Lack of historical precedent
Humanity has never suffered an existential catastrophe, and if one were to occur, it would necessarily be unprecedented. Existential risks therefore pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the failure of a complete extinction event to occur in the past is not evidence against its likelihood in the future, because every world that has experienced such an event no longer has human observers to record it. Regardless of how frequently civilization-ending events occur, no civilization ever observes an existential catastrophe in its own history. These anthropic issues may partly be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.
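The observation selection effect can be illustrated with a minimal Monte Carlo sketch. The per-century extinction probability, number of centuries and number of simulated worlds below are arbitrary assumptions chosen for illustration, not figures from the cited sources; the point is only that surviving observers see zero extinction events in their past regardless of the true underlying risk.

```python
import random

# Illustrative assumptions (not estimates from the literature).
P_EXTINCTION_PER_CENTURY = 0.01   # true risk of an extinction event per century
CENTURIES = 100                   # length of each simulated history
WORLDS = 100_000                  # number of simulated worlds

surviving_worlds = 0
for _ in range(WORLDS):
    survived = all(random.random() > P_EXTINCTION_PER_CENTURY
                   for _ in range(CENTURIES))
    if survived:
        surviving_worlds += 1

# Observers exist only in the surviving worlds, and every one of those
# worlds has a historical record containing zero extinction events,
# so the true 1% per-century risk cannot be read off the past alone.
print(f"True risk per century: {P_EXTINCTION_PER_CENTURY:.0%}")
print(f"Worlds with observers remaining: {surviving_worlds} / {WORLDS}")
print("Extinction events observed by any surviving observer: 0")
```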
To understand the dynamics of an unprecedented, unrecoverable global civilizational collapse (a type of existential risk), it may be instructive to study the various local civilizational collapses that have occurred throughout human history. For instance, civilizations such as the Roman Empire have ended in a loss of centralized governance and a major civilization-wide loss of infrastructure and advanced technology. However, these examples demonstrate that societies appear to be fairly resilient to catastrophe; for example, Medieval Europe survived the Black Death without suffering anything resembling a civilization collapse despite losing 25 to 50 percent of its population.
Incentives and coordination
Economic factors help explain why so little effort goes into global catastrophic risk reduction. First, the catastrophes themselves are speculative and may never happen, so many people focus on more pressing issues. Second, risk reduction is a global public good, so we should expect markets to undersupply it: even if a large nation invested in mitigation measures, it would enjoy only a small fraction of the benefit of doing so. Furthermore, global catastrophic risk reduction is an intergenerational global public good. Most of its hypothetical benefits would be enjoyed by future generations, and though these future people might be willing to pay substantial sums for risk reduction, no mechanism for such a transaction exists.
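The free-rider structure of this argument can be made concrete with a toy calculation. All figures below are invented for illustration (an arbitrary global benefit, cost and national population share); they are not drawn from the cited sources.

```python
# Toy illustration of the public-good problem: hypothetical numbers only.
global_benefit = 1_000.0   # benefit to the whole world, arbitrary units
cost = 100.0               # cost of the mitigation measure, same units
national_share = 0.04      # funding nation's share of world population

benefit_to_funder = global_benefit * national_share

print(f"Net benefit to the world:  {global_benefit - cost:+.0f}")   # +900
print(f"Net benefit to the funder: {benefit_to_funder - cost:+.0f}")  # -60
# The measure is strongly positive for humanity as a whole but negative
# for the single nation paying for it, so individual states and markets
# predictably underinvest in it.
```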
Cognitive biases
Numerous cognitive biases can influence people's judgment of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect.
Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as willing to prevent the deaths of 200,000 or 2,000 birds. Similarly, people are often more concerned about threats to individuals than to larger groups.
Eliezer Yudkowsky theorizes that scope neglect plays a role in public perception of existential risks:
Substantially larger numbers, such as 500 million deaths, and especially qualitatively different scenarios such as the extinction of the entire human species, seem to trigger a different mode of thinking... People who would never dream of hurting a child hear of existential risk, and say, "Well, maybe the human species doesn't really deserve to survive".
All past predictions of human extinction have proven to be false. To some, this makes future warnings seem less credible. Nick Bostrom argues that the absence of human extinction in the past is weak evidence that there will be no human extinction in the future, due to survivor bias and other anthropic effects.
Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth."
Proposed mitigation
Multi-layer defense
Defense in depth is a useful framework for categorizing risk mitigation measures into three layers of defense:
- Prevention: Reducing the probability of a catastrophe occurring in the first place. Example: Measures to prevent outbreaks of new highly infectious diseases.
- Response: Preventing the scaling of a catastrophe to the global level. Example: Measures to prevent escalation of a small-scale nuclear exchange into an all-out nuclear war.
- Resilience: Increasing humanity's resilience (against extinction) when faced with global catastrophes. Example: Measures to increase food security during a nuclear winter.
Human extinction is most likely when all three defenses are weak, that is, "by risks we are unlikely to prevent, unlikely to successfully respond to, and unlikely to be resilient against".
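The quoted framing implies a simple multiplicative structure: extinction requires all three layers to fail. The sketch below uses purely illustrative failure probabilities and assumes the layers fail independently, which is a simplification; it shows only that strengthening any single layer reduces the overall risk proportionally.

```python
# Illustrative sketch of the three-layer "defense in depth" framing.
# The probabilities are arbitrary placeholders, not estimates.
p_fail_prevention = 0.10   # the catastrophe is not prevented
p_fail_response   = 0.20   # it is not stopped from scaling globally
p_fail_resilience = 0.05   # humanity does not survive it

def extinction_probability(p_prev: float, p_resp: float, p_resil: float) -> float:
    """Assume independence between layers for simplicity."""
    return p_prev * p_resp * p_resil

baseline = extinction_probability(p_fail_prevention, p_fail_response, p_fail_resilience)
improved = extinction_probability(p_fail_prevention, p_fail_response / 2, p_fail_resilience)

print(f"Baseline risk:                {baseline:.4%}")   # 0.1000%
print(f"With stronger response layer: {improved:.4%}")   # 0.0500%
```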
The unprecedented nature of existential risks poses a special challenge in designing risk mitigation measures since humanity will not be able to learn from a track record of previous events.
Funding
Some researchers argue that both research and other initiatives relating to existential risk are underfunded. Nick Bostrom states that more research has been done on Star Trek, snowboarding, or dung beetles than on existential risks. Bostrom's comparisons have been criticized as "high-handed". As of 2020, the Biological Weapons Convention organization had an annual budget of US$1.4 million.
Survival planning
Some scholars propose the establishment on Earth of one or more self-sufficient, remote, permanently occupied settlements specifically created for the purpose of surviving a global disaster. Economist Robin Hanson argues that a refuge permanently housing as few as 100 people would significantly improve the chances of human survival during a range of global catastrophes.
Food storage has been proposed globally, but the monetary cost would be high. Furthermore, it would likely add to the millions of deaths per year that malnutrition already causes. In 2022, a team led by David Denkenberger modeled the cost-effectiveness of resilient foods compared to artificial general intelligence (AGI) safety and found "~98-99% confidence" for a higher marginal impact of work on resilient foods. Some survivalists stock survival retreats with multiple-year food supplies.
The Svalbard Global Seed Vault is buried 400 feet (120 m) inside a mountain on an island in the Arctic. It is designed to hold 2.5 billion seeds from more than 100 countries as a precaution to preserve the world's crops. The surrounding rock is −6 °C (21 °F) (as of 2015) but the vault is kept at −18 °C (0 °F) by refrigerators powered by locally sourced coal.
More speculatively, if society continues to function and if the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning. Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.
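The scale of the task can be sketched with rough arithmetic. The population and per-person requirement below are round-number assumptions, used only to show the order of magnitude any sunlight-independent food system would need to supply.

```python
# Back-of-the-envelope calorie demand during a prolonged loss of sunlight.
# Round-number assumptions, for scale only.
population = 8_000_000_000        # people (assumed)
kcal_per_person_per_day = 2_100   # rough adult requirement (assumed)

daily_kcal = population * kcal_per_person_per_day
yearly_kcal = daily_kcal * 365

print(f"Global demand: {daily_kcal:.2e} kcal per day")    # ~1.7e13
print(f"               {yearly_kcal:.2e} kcal per year")  # ~6.1e15
```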
Global catastrophic risks and global governance
Insufficient global governance creates risks in the social and political domain, but governance mechanisms develop more slowly than technological and social change. Governments, the private sector and the general public have expressed concern about the lack of governance mechanisms for efficiently dealing with risks and for negotiating and adjudicating between diverse and conflicting interests. This concern is reinforced by a growing understanding of the interconnectedness of global systemic risks. In the absence, or in anticipation, of global governance, national governments can act individually to better understand, mitigate and prepare for global catastrophes.
Climate emergency plans
In 2018, the Club of Rome called for greater climate change action and published its Climate Emergency Plan, which proposes ten action points to limit global average temperature increase to 1.5 degrees Celsius. Further, in 2019, the Club published the more comprehensive Planetary Emergency Plan.
There is evidence that collectively engaging with the emotional experiences that emerge when contemplating the vulnerability of the human species in the context of climate change can allow those experiences to become adaptive. When collective engagement with and processing of emotional experiences is supportive, it can lead to growth in resilience, psychological flexibility, tolerance of emotional experiences, and community engagement.
Space colonization
Main article: Space and survival
Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario. Solutions of this scope may require megascale engineering.
Astrophysicist Stephen Hawking advocated colonizing other planets within the Solar System once technology progresses sufficiently, in order to improve the chance of human survival from planet-wide events such as global thermonuclear war.
Billionaire Elon Musk writes that humanity must become a multiplanetary species in order to avoid extinction. His company SpaceX is developing technology he projects will be used to colonize Mars.
Organizations
The Bulletin of the Atomic Scientists (est. 1945) is one of the oldest global risk organizations, founded after the public became alarmed by the potential of atomic warfare in the aftermath of WWII. It studies risks associated with nuclear war and energy and famously maintains the Doomsday Clock established in 1947. The Foresight Institute (est. 1986) examines the risks of nanotechnology and its benefits. It was one of the earliest organizations to study the unintended consequences of otherwise harmless technology gone haywire at a global scale. It was founded by K. Eric Drexler who postulated "grey goo".
Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia.
Independent non-governmental organizations (NGOs) include the Machine Intelligence Research Institute (est. 2000), which aims to reduce the risk of a catastrophe caused by artificial intelligence, with donors including Peter Thiel and Jed McCaleb. The Nuclear Threat Initiative (est. 2001) seeks to reduce global threats from nuclear, biological and chemical threats, and containment of damage after an event. It maintains a nuclear material security index. The Lifeboat Foundation (est. 2009) funds research into preventing a technological catastrophe. Most of the research money funds projects at universities. The Global Catastrophic Risk Institute (est. 2011) is a US-based non-profit, non-partisan think tank founded by Seth Baum and Tony Barrett. GCRI does research and policy work across various risks, including artificial intelligence, nuclear war, climate change, and asteroid impacts. The Global Challenges Foundation (est. 2012), based in Stockholm and founded by Laszlo Szombatfalvy, releases a yearly report on the state of global risks. The Future of Life Institute (est. 2014) works to reduce extreme, large-scale risks from transformative technologies, as well as steer the development and use of these technologies to benefit all life, through grantmaking, policy advocacy in the United States, European Union and United Nations, and educational outreach. Elon Musk, Vitalik Buterin and Jaan Tallinn are some of its biggest donors.
University-based organizations included the Future of Humanity Institute (est. 2005) which researched the questions of humanity's long-term future, particularly existential risk. It was founded by Nick Bostrom and was based at Oxford University. The Centre for the Study of Existential Risk (est. 2012) is a Cambridge University-based organization which studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare. All are man-made risks, as Huw Price explained to the AFP news agency, "It seems a reasonable prediction that some time in this or the next century intelligence will escape from the constraints of biology". He added that when this happens "we're no longer the smartest things around," and will risk being at the mercy of "machines that are not malicious, but machines whose interests don't include us." Stephen Hawking was an acting adviser. The Millennium Alliance for Humanity and the Biosphere is a Stanford University-based organization focusing on many issues related to global catastrophe by bringing together members of academia in the humanities. It was founded by Paul Ehrlich, among others. Stanford University also has the Center for International Security and Cooperation focusing on political cooperation to reduce global catastrophic risk. The Center for Security and Emerging Technology was established in January 2019 at Georgetown's Walsh School of Foreign Service and will focus on policy research of emerging technologies with an initial emphasis on artificial intelligence. They received a grant of 55M USD from Good Ventures as suggested by Open Philanthropy.
Other risk assessment groups are based in or are part of governmental organizations. The World Health Organization (WHO) includes a division called Global Alert and Response (GAR), which monitors and responds to global epidemic crises. GAR helps member states with training and coordination of responses to epidemics. The United States Agency for International Development (USAID) has its Emerging Pandemic Threats Program, which aims to prevent and contain naturally generated pandemics at their source. The Lawrence Livermore National Laboratory has a division called the Global Security Principal Directorate, which researches issues such as bio-security and counter-terrorism on behalf of the government.
See also
- Artificial intelligence arms race – Arms race for the most advanced AI-related technologies
- Community resilience – Concept in crisis management
- Extreme risk – Low-probability risk of very bad outcomes
- Fermi paradox – Problem of the lack of evidence for alien life despite its apparent likelihood
- Foresight (psychology) – Behavior-based backcasting & forecasting factors
- Future of Earth – Long-term extrapolated geological and biological changes of planet Earth
- Future of the Solar System
- Climate engineering – Deliberate and large-scale intervention in Earth's climate system
- Global Risks Report
- Great Filter – Hypothesis of barriers to forming interstellar civilizations
- Holocene extinction – Ongoing extinction event caused by human activity
- Impact event – Collision of two astronomical objects
- List of global issues – List of environmental and other issues affecting life on Earth
- Nuclear proliferation – Spread of nuclear weapons
- Outside Context Problem – 1996 book by Iain M. Banks
- Planetary boundaries – Limits not to be exceeded if humanity wants to survive in a safe ecosystem
- Rare events – Events that occur with low frequency, often with widespread effects that might destabilize systems
- The Sixth Extinction: An Unnatural History – 2014 nonfiction book by Elizabeth Kolbert
- Societal collapse – Fall of a complex human society
- Speculative evolution – Science fiction genre exploring hypothetical scenarios in the evolution of life
- Suffering risks – Risks of astronomical suffering
- Survivalism – Movement of individuals or households preparing for emergencies and natural disasters
- Tail risk – Risk of rare events
- The Precipice: Existential Risk and the Future of Humanity – 2020 book about existential risks by Toby Ord
- Timeline of the far future – Scientific projections regarding the far future
- Triple planetary crisis – Three intersecting global environmental crises
- World Scientists' Warning to Humanity – 1992 document about human carbon footprint
References
- Schulte, P.; et al. (March 5, 2010). "The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary" (PDF). Science. 327 (5970): 1214–1218. Bibcode:2010Sci...327.1214S. doi:10.1126/science.1177265. PMID 20203042. S2CID 2659741.
- Bostrom, Nick (2008). Global Catastrophic Risks (PDF). Oxford University Press. p. 1.
- Ripple WJ, Wolf C, Newsome TM, Galetti M, Alamgir M, Crist E, Mahmoud MI, Laurance WF (November 13, 2017). "World Scientists' Warning to Humanity: A Second Notice". BioScience. 67 (12): 1026–1028. doi:10.1093/biosci/bix125. hdl:11336/71342.
- Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology. 9.
- ^ "About FHI". Future of Humanity Institute. Retrieved August 12, 2021.
- ^ "About us". Centre for the Study of Existential Risk. Retrieved August 12, 2021.
- ^ "The Future of Life Institute". Future of Life Institute. Retrieved May 5, 2014.
- ^ "Nuclear Threat Initiative". Nuclear Threat Initiative. Retrieved June 5, 2015.
- ^ Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002 – via Existential Risk.
- Bostrom, Nick; Cirkovic, Milan (2008). Global Catastrophic Risks. Oxford: Oxford University Press. p. 1. ISBN 978-0-19-857050-9.
- Ziegler, Philip (2012). The Black Death. Faber and Faber. p. 397. ISBN 9780571287116.
- Muehlhauser, Luke (March 15, 2017). "How big a deal was the Industrial Revolution?". lukemuelhauser.com. Retrieved August 3, 2020.
- Taubenberger, Jeffery; Morens, David (2006). "1918 Influenza: the Mother of All Pandemics". Emerging Infectious Diseases. 12 (1): 15–22. doi:10.3201/eid1201.050979. PMC 3291398. PMID 16494711.
- Posner, Richard A. (2006). Catastrophe: Risk and Response. Oxford: Oxford University Press. ISBN 978-0195306477. Introduction, "What is Catastrophe?"
- Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916.
This is an equivalent, though crisper statement of Nick Bostrom's definition: "An existential risk is one that threatens the premature extinction of Earth-originating intelligent life or the permanent and drastic destruction of its potential for desirable future development." Source: Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority". Global Policy. 4:15-31.
- Cotton-Barratt, Owen; Ord, Toby (2015), Existential risk and existential hope: Definitions (PDF), Future of Humanity Institute – Technical Report #2015-1, pp. 1–4
- Bostrom, Nick (2009). "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas. 15 (3): 308–314. CiteSeerX 10.1.1.429.2849. doi:10.1017/s0953820800004076. S2CID 15860897.
- ^ Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. New York: Hachette. ISBN 9780316484916.
- ^ Bryan Caplan (2008). "The totalitarian threat". Global Catastrophic Risks, eds. Bostrom & Cirkovic (Oxford University Press): 504–519. ISBN 9780198570509
- Glover, Dennis (June 1, 2017). "Did George Orwell secretly rewrite the end of Nineteen Eighty-Four as he lay dying?". The Sydney Morning Herald. Retrieved November 21, 2021.
Winston's creator, George Orwell, believed that freedom would eventually defeat the truth-twisting totalitarianism portrayed in Nineteen Eighty-Four.
- Orwell, George (1949). Nineteen Eighty-Four. A novel. London: Secker & Warburg. Archived from the original on May 4, 2012. Retrieved August 12, 2021.
- ^ Kupferschmidt, Kai (January 11, 2018). "Could science destroy the world? These scholars want to save us from a modern-day Frankenstein". Science. AAAS. Retrieved April 20, 2020.
- Baum, Seth D. (2023). "Assessing natural global catastrophic risks". Natural Hazards. 115 (3): 2699–2719. Bibcode:2023NatHa.115.2699B. doi:10.1007/s11069-022-05660-w. PMC 9553633. PMID 36245947.
- Scouras, James (2019). "Nuclear War as a Global Catastrophic Risk". Journal of Benefit-Cost Analysis. 10 (2): 274–295. doi:10.1017/bca.2019.16.
- Sagan, Carl (Winter 1983). "Nuclear War and Climatic Catastrophe: Some Policy Implications". Foreign Affairs. Council on Foreign Relations. doi:10.2307/20041818. JSTOR 20041818. Retrieved August 4, 2020.
- Jebari, Karim (2014). "Existential Risks: Exploring a Robust Risk Reduction Strategy" (PDF). Science and Engineering Ethics. 21 (3): 541–54. doi:10.1007/s11948-014-9559-3. PMID 24891130. S2CID 30387504. Retrieved August 26, 2018.
- ^ Cirkovic, Milan M.; Bostrom, Nick; Sandberg, Anders (2010). "Anthropic Shadow: Observation Selection Effects and Human Extinction Risks" (PDF). Risk Analysis. 30 (10): 1495–1506. Bibcode:2010RiskA..30.1495C. doi:10.1111/j.1539-6924.2010.01460.x. PMID 20626690. S2CID 6485564.
- Kemp, Luke (February 2019). "Are we on the road to civilization collapse?". BBC. Retrieved August 12, 2021.
- Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. Hachette Books. ISBN 9780316484893.
Europe survived losing 25 to 50 percent of its population in the Black Death, while keeping civilization firmly intact
- ^ Yudkowsky, Eliezer (2008). "Cognitive Biases Potentially Affecting Judgment of Global Risks" (PDF). Global Catastrophic Risks: 91–119. Bibcode:2008gcr..book...86Y.
- Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. 1993, Measuring natural resource damages with contingent valuation: tests of validity and reliability. In Hausman, J.A. (ed), Contingent Valuation: A Critical Assessment, pp. 91–159 (Amsterdam: North Holland).
- Bostrom 2013.
- Yudkowsky, Eliezer. "Cognitive biases potentially affecting judgment of global risks". Global catastrophic risks 1 (2008): 86. p.114
- "We're Underestimating the Risk of Human Extinction". The Atlantic. March 6, 2012. Retrieved July 1, 2016.
- Is Humanity Suicidal? The New York Times Magazine (May 30, 1993)
- ^ Cotton-Barratt, Owen; Daniel, Max; Sandberg, Anders (2020). "Defence in Depth Against Human Extinction: Prevention, Response, Resilience, and Why They All Matter". Global Policy. 11 (3): 271–282. doi:10.1111/1758-5899.12786. ISSN 1758-5899. PMC 7228299. PMID 32427180.
- "Oxford Institute Forecasts The Possible Doom Of Humanity". Popular Science. 2013. Retrieved April 20, 2020.
- Toby Ord (2020). The precipice: Existential risk and the future of humanity. Hachette Books. ISBN 9780316484893.
The international body responsible for the continued prohibition of bioweapons (the Biological Weapons Convention) has an annual budget of $1.4 million - less than the average McDonald's restaurant
- ^ Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction" (PDF). Risk Analysis. 27 (5): 1335–1344. Bibcode:2007RiskA..27.1335M. doi:10.1111/j.1539-6924.2007.00960.x. PMID 18076500. S2CID 14265396. Archived from the original (PDF) on August 27, 2014. Retrieved May 16, 2015.
- Wells, Willard. (2009). Apocalypse when?. Praxis. ISBN 978-0387098364.
- Wells, Willard. (2017). Prospects for Human Survival. Lifeboat Foundation. ISBN 978-0998413105.
- Hanson, Robin. "Catastrophe, social collapse, and human extinction". Global catastrophic risks 1 (2008): 357.
- Smil, Vaclav (2003). The Earth's Biosphere: Evolution, Dynamics, and Change. MIT Press. p. 25. ISBN 978-0-262-69298-4.
- Denkenberger, David C.; Sandberg, Anders; Tieman, Ross John; Pearce, Joshua M. (2022). "Long term cost-effectiveness of resilient foods for global catastrophes compared to artificial general intelligence safety". International Journal of Disaster Risk Reduction. 73: 102798. Bibcode:2022IJDRR..7302798D. doi:10.1016/j.ijdrr.2022.102798.
- Lewis Smith (February 27, 2008). "Doomsday vault for world's seeds is opened under Arctic mountain". The Times Online. London. Archived from the original on May 12, 2008.
- Suzanne Goldenberg (May 20, 2015). "The doomsday vault: the seeds that could save a post-apocalyptic world". The Guardian. Retrieved June 30, 2017.
- "Here's how the world could end—and what we can do about it". Science. AAAS. July 8, 2016. Retrieved March 23, 2018.
- Denkenberger, David C.; Pearce, Joshua M. (September 2015). "Feeding everyone: Solving the food crisis in event of global catastrophes that kill crops or obscure the sun" (PDF). Futures. 72: 57–68. doi:10.1016/j.futures.2014.11.008. S2CID 153917693.
- "Global Challenges Foundation | Understanding Global Systemic Risk". globalchallenges.org. Archived from the original on August 16, 2017. Retrieved August 15, 2017.
- "Global Catastrophic Risk Policy". gcrpolicy.com. Archived from the original on August 11, 2019. Retrieved August 11, 2019.
- Club of Rome (2018). "The Climate Emergency Plan". Retrieved August 17, 2020.
- Club of Rome (2019). "The Planetary Emergency Plan". Retrieved August 17, 2020.
- Kieft, J.; Bendell, J (2021). "The responsibility of communicating difficult truths about climate influenced societal disruption and collapse: an introduction to psychological research". Institute for Leadership and Sustainability (IFLAS) Occasional Papers. 7: 1–39.
- "Mankind must abandon earth or face extinction: Hawking", physorg.com, August 9, 2010, retrieved January 23, 2012
- Malik, Tariq (April 13, 2013). "Stephen Hawking: Humanity Must Colonize Space to Survive". Space.com. Retrieved July 1, 2016.
- Shukman, David (January 19, 2016). "Hawking: Humans at risk of lethal 'own goal'". BBC News. Retrieved July 1, 2016.
- Ginsberg, Leah (June 16, 2017). "Elon Musk thinks life on earth will go extinct, and is putting most of his fortune toward colonizing Mars". CNBC.
- Fred Hapgood (November 1986). "Nanotechnology: Molecular Machines that Mimic Life" (PDF). Omni. Archived from the original (PDF) on July 27, 2013. Retrieved June 5, 2015.
- Giles, Jim (2004). "Nanotech takes small step towards burying 'grey goo'". Nature. 429 (6992): 591. Bibcode:2004Natur.429..591G. doi:10.1038/429591b. PMID 15190320.
- Sophie McBain (September 25, 2014). "Apocalypse soon: the scientists preparing for the end times". New Statesman. Retrieved June 5, 2015.
- "Reducing Long-Term Catastrophic Risks from Artificial Intelligence". Machine Intelligence Research Institute. Retrieved June 5, 2015.
The Machine Intelligence Research Institute aims to reduce the risk of a catastrophe, should such an event eventually occur.
- Angela Chen (September 11, 2014). "Is Artificial Intelligence a Threat?". The Chronicle of Higher Education. Retrieved June 5, 2015.
- Alexander Sehmar (May 31, 2015). "Isis could obtain nuclear weapon from Pakistan, warns India". The Independent. Archived from the original on June 2, 2015. Retrieved June 5, 2015.
- "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved April 26, 2013.
- Ashlee, Vance (July 20, 2010). "The Lifeboat Foundation: Battling Asteroids, Nanobots and A.I." New York Times. Retrieved June 5, 2015.
- "Global Catastrophic Risk Institute". gcrinstitute.org. Retrieved March 22, 2022.
- Meyer, Robinson (April 29, 2016). "Human Extinction Isn't That Unlikely". The Atlantic. Boston, Massachusetts: Emerson Collective. Retrieved April 30, 2016.
- "Global Challenges Foundation website". globalchallenges.org. Retrieved April 30, 2016.
- Nick Bilton (May 28, 2015). "Ava of 'Ex Machina' Is Just Sci-Fi (for Now)". New York Times. Retrieved June 5, 2015.
- Hui, Sylvia (November 25, 2012). "Cambridge to study technology's risks to humans". Associated Press. Archived from the original on December 1, 2012. Retrieved January 30, 2012.
- Scott Barrett (2014). Environment and Development Economics: Essays in Honour of Sir Partha Dasgupta. Oxford University Press. p. 112. ISBN 9780199677856. Retrieved June 5, 2015.
- "Millennium Alliance for Humanity & The Biosphere". Millennium Alliance for Humanity & The Biosphere. Retrieved June 5, 2015.
- Guruprasad Madhavan (2012). Practicing Sustainability. Springer Science & Business Media. p. 43. ISBN 9781461443483. Retrieved June 5, 2015.
- "Center for International Security and Cooperation". Center for International Security and Cooperation. Retrieved June 5, 2015.
- ^ Anderson, Nick (February 28, 2019). "Georgetown launches think tank on security and emerging technology". Washington Post. Retrieved March 12, 2019.
- "Global Alert and Response (GAR)". World Health Organization. Archived from the original on February 16, 2003. Retrieved June 5, 2015.
- Kelley Lee (2013). Historical Dictionary of the World Health Organization. Rowman & Littlefield. p. 92. ISBN 9780810878587. Retrieved June 5, 2015.
- "USAID Emerging Pandemic Threats Program". USAID. Archived from the original on October 22, 2014. Retrieved June 5, 2015.
- "Global Security". Lawrence Livermore National Laboratory. Archived from the original on December 27, 2007. Retrieved June 5, 2015.
Further reading
- Avin, Shahar; Wintle, Bonnie C.; Weitzdörfer, Julius; ó Héigeartaigh, Seán S.; Sutherland, William J.; Rees, Martin J. (2018). "Classifying global catastrophic risks". Futures. 102: 20–26. doi:10.1016/j.futures.2018.02.001.
- Corey S. Powell (2000) "Twenty ways the world could end suddenly" Discover Magazine
- Currie, Adrian; Ó hÉigeartaigh, Seán (2018). "Working together to face humanity's greatest threats: Introduction to the Future of Research on Catastrophic and Existential Risk". Futures. 102: 1–5. doi:10.1016/j.futures.2018.07.003. hdl:10871/35764.
- Derrick Jensen (2006) Endgame ISBN 1-58322-730-X.
- Donella Meadows (1972) The Limits to Growth ISBN 0-87663-165-0.
- Edward O. Wilson (2003) The Future of Life ISBN 0-679-76811-4
- Holt, Jim (February 25, 2021). "The Power of Catastrophic Thinking". The New York Review of Books. Vol. LXVIII, no. 3. pp. 26–29. p. 28:
Whether you are searching for a cure for cancer, or pursuing a scholarly or artistic career, or engaged in establishing more just institutions, a threat to the future of humanity is also a threat to the significance of what you do.
- Huesemann, Michael H., and Joyce A. Huesemann (2011) Technofix: Why Technology Won't Save Us or the Environment, Chapter 6, "Sustainability or Collapse", New Society Publishers, Gabriola Island, British Columbia, Canada, 464 pages ISBN 0865717044.
- Jared Diamond (2005 and 2011) Collapse: How Societies Choose to Fail or Succeed Penguin Books ISBN 9780241958681.
- Jean-Francois Rischard (2003) High Noon 20 Global Problems, 20 Years to Solve Them ISBN 0-465-07010-8
- Joel Garreau (2005) Radical Evolution ISBN 978-0385509657.
- John A. Leslie (1996) The End of the World ISBN 0-415-14043-9.
- Joseph Tainter (1990) The Collapse of Complex Societies, Cambridge University Press, Cambridge, UK ISBN 9780521386739.
- Marshall Brain (2020) The Doomsday Book: The Science Behind Humanity's Greatest Threats Union Square ISBN 9781454939962
- Martin Rees (2004) Our Final Hour: A Scientist's warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century—On Earth and Beyond ISBN 0-465-06863-4
- Rhodes, Catherine (2024). Managing Extreme Technological Risk. World Scientific. doi:10.1142/q0438. ISBN 978-1-80061-481-9.
- Roger-Maurice Bonnet and Lodewijk Woltjer (2008) Surviving 1,000 Centuries Can We Do It? Springer-Praxis Books.
- Taggart, Gabel (2023). "Taking stock of systems for organizing existential and global catastrophic risks: Implications for policy". Global Policy. 14 (3): 489–499. doi:10.1111/1758-5899.13230.
- Toby Ord (2020) The Precipice - Existential Risk and the Future of Humanity Bloomsbury Publishing ISBN 9781526600219
- Turchin, Alexey; Denkenberger, David (2018). "Global catastrophic and existential risks communication scale". Futures. 102: 27–38. doi:10.1016/j.futures.2018.01.003.
- Walsh, Bryan (2019). End Times: A Brief Guide to the End of the World. Hachette Books. ISBN 978-0275948023.
External links
- "Are we on the road to civilisation collapse?". BBC. February 19, 2019.
- MacAskill, William (August 5, 2022). "The Case for Longtermism". The New York Times.
- "What a way to go" from The Guardian. Ten scientists name the biggest dangers to Earth and assess the chances they will happen. April 14, 2005.
- Humanity under threat from perfect storm of crises – study. The Guardian. February 6, 2020.
- Annual Reports on Global Risk by the Global Challenges Foundation
- Center on Long-Term Risk
- Global Catastrophic Risk Policy Archived August 11, 2019, at the Wayback Machine
- Stephen Petranek: 10 ways the world could end, a TED talk