Revision as of 19:43, 13 November 2006
Human extinction would be the extinction of the human species, Homo sapiens.
Attitudes to human extinction
Attitudes to human extinction vary widely depending on beliefs concerning spiritual survival (souls, heaven, reincarnation, and so forth), the value of the human race, whether the human race evolves individually or collectively, and many other factors. Many religions prophesy an end time to the universe, so eventual human extinction is necessarily a part of the faith of many humans, to the extent that the end time means the absolute end of their physical humanity (although perhaps not of an internal soul; see eschatology).
Many people consider that the extinction of the entire species would be a much worse fate than the death of an individual. Although the mortality of the individual can be accepted as an inevitable part of the human condition, humans can nevertheless expect to attain some measurement of immortality through their progeny, or through contributions or advancement in culture or science. However, the extent to which this "immortality" can be achieved is subject to the continuation of the species as a whole, and human extinction would represent the termination of such expectations.
Fear of human extinction is said to be one of the motivating factors of the environmentalist movements of the 20th and 21st centuries.
Minority views in favour of human extinction take two forms:
- Deep ecologists like VHEMT say that humanity is inherently destructive to the global ecosystem, the needs of which should outweigh humanity's desire for "immortality".
- Some pessimistic observers (such as Schopenhauer) have written that destroying the entire biosphere is a price worth paying to erase human evil.
Perception of human extinction risk
The general level of fear about human extinction (in the near term) is very low. It is not an outcome considered by many as a credible risk (excluding religious extinction). Suggested reasons for the low public visibility of extinction risk include:
- There have been countless prophecies of extinction throughout history; in most cases the predicted date of doom has passed without much notice, making future warnings less frightening. However, survivorship bias undercuts this reasoning: John von Neumann was probably wrong to hold "a certainty" that nuclear war would occur, but our survival is not proof that the chance of a fatal nuclear exchange was low.
- To prevent public panics, official reports containing high casualty estimates are sometimes suppressed or changed (such as Admiral Rickover's critical report on nuclear industry safety).
- Extinction scenarios (see below) are speculative, and hard to quantify. A frequentist approach to probability cannot be used to assess the danger of an event that has never been observed by humans.
- Nick Bostrom suggests that extinction-analysis may be an overlooked field simply because it is too depressing a subject area to attract researchers.
- There are thousands of public safety jobs dedicated to analyzing and reducing the risks of individual death. There are no full-time existential safety commissioners partly because there is no way to tell if they are doing a good job, and no way to punish them for failure. The inability to judge performance might also explain the comparative governmental apathy on preventing human extinction (as compared to panda extinction, say).
- Some anthropologists believe that risk perception is biased by social structure; in the "Cultural Theory of risk" typology, "individualist" societies predispose members to the belief that nature operates as a self-correcting system, which will return to its stable state after a disturbance. People in such cultures feel comfortable with a "trial-and-error" approach to risk, even for dangers far too rare for trial and error to apply (such as extinction events).
- It is possible to do something about dietary or motor-vehicle health threats. Since it is much harder to know how existential threats should be minimized, they tend to be ignored. High technology societies tend to become "hierarchist" or "fatalist" in their attitudes to the ever-multiplying risks threatening them. In either case, the average member of society adopts a passive attitude to risk minimization, culturally, and psychologically.
- The bias in popular culture is to relate extinction scenario stories with non-extinction outcomes. (None of the 16 'most notable' WW3 scenarios in film are resolved by human extinction, for example.)
- The threat of nuclear annihilation actually was a daily concern in the lives of many people in the 1960s and 1970s. Since then the principal fear has been of localized terrorist attack, rather than a global war of extinction; contemplating human extinction may be out of fashion.
- Some people have philosophical reasons for doubting the possibility of human extinction, for instance the final anthropic principle, plenitude principle or intrinsic finality.
- Tversky and Kahneman have produced evidence that humans suffer cognitive biases which would tend to minimize the perception of this unprecedented event:
- Denial is a negative "availability heuristic" shown to occur when an outcome is so upsetting that the very act of thinking about it leads to an increased refusal to believe it might occur. In this case, imagining human extinction probably makes it seem less likely.
- In cultures where human extinction is not expected the proposition must overcome the "disconfirmation bias" against heterodox theories.
- Another reliable psychological effect relevant here is the "positive outcome bias".
- Behavioural finance has strong evidence that recent evidence is given undue significance in risk analysis. Roughly speaking, "100 year storms" tend to occur every twenty years in the stock market as traders become convinced that the current good times will last forever. Doomsayers who hypothesize rare crisis-scenarios are dismissed even when they have statistical evidence behind them. An extreme form of this bias can diminish the subjective probability of the unprecedented.
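The difficulty, noted in the list above, of assigning a frequentist probability to an event no human has ever observed can be made concrete with Laplace's rule of succession. The sketch below is illustrative only; the 2,000-century figure is an arbitrary assumption, not a claim made by this article:

```python
# A minimal sketch contrasting a frequentist estimate with Laplace's rule of
# succession for an event that has never been observed.

def frequentist_estimate(events: int, trials: int) -> float:
    """Relative frequency: zero observed extinctions gives probability 0."""
    return events / trials

def laplace_estimate(events: int, trials: int) -> float:
    """Rule of succession: posterior mean under a uniform prior."""
    return (events + 1) / (trials + 2)

# Suppose (hypothetically) 2000 centuries of human existence, no extinction seen.
centuries = 2000
print(frequentist_estimate(0, centuries))  # 0.0 -- the event looks impossible
print(laplace_estimate(0, centuries))      # ~0.0005 -- small but non-zero
```

The point is only that a pure relative-frequency estimate is forced to zero by definition, while a Bayesian estimate remains small but positive; neither number is a serious risk assessment.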
In general, humanity's sense of self-preservation and intelligence are considered to offer safeguards against extinction. It is felt that people will find creative ways to overcome potential threats, and will apply the precautionary principle when attempting dangerous innovations. The arguments against this are, firstly, that the management of destructive technology is becoming more difficult, and secondly, that the precautionary principle is often abandoned whenever the reward appears to outweigh the risk. Two examples where the principle has been overruled are:
- Some Anti-GM food campaigners are very concerned by "Frankenstein genes", which cross the species barrier and raise the spectre of a 'superbug' doomsday. They invoke the precautionary principle against the use of this technology, but its benefits are considered to be so significant that trials and distribution are permitted in many parts of the world.
- Before the Trinity nuclear test, one of the project's scientists (Teller) speculated that the fission explosion might destroy New Mexico and possibly the world, by causing a reaction in the nitrogen of the atmosphere. A calculation from another scientist on the project proved such a possibility theoretically impossible, but the fear of the possibility remained among some until the test took place. (See Ignition of the atmosphere with nuclear bombs, LA-602, online and Manhattan Project).
Observations about human extinction
The fact that the majority of species that have existed on Earth have become extinct has led to the suggestion that all species have a finite lifespan, and thus that human extinction would be inevitable. David Raup and Jack Sepkoski found, for example, a twenty-six-million-year periodicity in elevated extinction rates, caused by factors unknown (see David M. Raup, Extinction: Bad Genes or Bad Luck? (1992, Norton)). Based upon evidence of past extinction rates, Raup and others have suggested that the average longevity of an invertebrate species is between 4 and 6 million years, while that of vertebrates seems to be 2 to 4 million years. The shorter period of survival for mammals is attributed to their position further up the food chain than many invertebrates, and therefore a greater liability to suffer the effects of environmental change. A counter-argument is that humans are unique in their adaptive and technological capabilities, so it is not possible to draw reliable inferences about the probability of human extinction from the past extinctions of other species. Certainly, the evidence collected by Raup and others suggests that generalist, geographically dispersed species, like humans, generally have a lower rate of extinction than species narrowly adapted to a single environment. It is also widely believed that the human species is the only species with conscious foreknowledge, well in advance, of its own demise.
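Purely as a back-of-envelope illustration of the longevity figures above, and subject to the same caveat that humans may be atypical, one can treat species lifetimes as exponentially distributed with the reported mean longevities. The exponential model is an assumption for illustration, not something Raup's data establishes:

```python
import math

# Back-of-envelope: constant hazard rates implied by an exponential lifetime
# model with the mean species longevities reported by Raup and others.

def annual_hazard(mean_lifespan_years: float) -> float:
    """Constant per-year extinction hazard implied by the exponential model."""
    return 1.0 / mean_lifespan_years

def survival_probability(mean_lifespan_years: float, horizon_years: float) -> float:
    """P(species survives the horizon) under the same model."""
    return math.exp(-horizon_years / mean_lifespan_years)

vertebrate_mean = 3e6   # midpoint of the 2-4 million year vertebrate range
print(annual_hazard(vertebrate_mean))              # ~3.3e-7 per year
print(survival_probability(vertebrate_mean, 1e6))  # ~0.72 over the next Myr
```

On this crude reading, a typical vertebrate species faces an extinction chance of roughly one in three million per year, which is exactly the kind of baseline the counter-argument above says should not be applied to humans.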
Another characteristic of the human animal believed to be unique is its religious belief. Some commentators (such as John F. Schumaker) claim that paranormal beliefs are the "excess evolutionary baggage" underlying the "seemingly suicidal qualities that are features of the human animal". Other socioecological observers maintain that hunter-gatherer evolution has simply produced a mind biased against considering the common good of more than a hundred people; this was Albert Einstein's belief, and he concluded:
- "We shall require a substantially new manner of thinking if mankind is to survive."
Humans are very similar to other primates in their genetic propensity towards intra-species violence; Jared Diamond's The Rise and Fall of the Third Chimpanzee estimates that 64% of hunter-gatherer societies engage in warfare every two years. Although it has been argued (e.g. in the UNESCO Seville Statement) that warfare is a cultural artifact, many anthropologists dispute this, noting that small human tribes exhibit similar patterns of violence to chimpanzee groups, the most murderous of the primates and our nearest genetic relatives. The 'higher' functions of reason and speech may be more evolved in the brain of Homo sapiens than in its cousins, but the relative size of the limbic system is a constant in apes, monkeys and humans; as human rational faculties have expanded, so has the wetware of emotion. The combination of inventiveness and the urge to violence in the human animal has been cited as evidence against its long-term survival.
Human extinction scenarios
- See also End of civilization
Various scenarios for the extinction of the human species have originated from science, popular culture, science fiction, and religion (see apocalypse and eschatology). The expression existential risk has been coined to refer to risks of total and irreversible destruction of human life, or of some lesser, but universal and permanent detriment to it.
The following are among the extinction scenarios that have been envisaged by various authors:
- Severe forms of known or recorded disasters
- Warfare, whether nuclear or biological; see World War III.
- Universal pandemic involving a genetic disease, virus, prion, or antibiotic-resistant bacterium.
- Famine resulting from overpopulation (see Malthusian catastrophe)
- Environmental collapses
- Catastrophic climate change as a result of global warming or the effects of extensive deforestation or pollution.
- Loss of a breathable atmosphere or destruction of the ozone layer.
- Occurrence of a supervolcano.
- Extreme ice age leading to a Snowball Earth
- A geomagnetic pole reversal could collapse the Earth's magnetic shielding against solar radiation, delivering an extreme dose of radiation to anyone who ventured outside unprotected. Evidence of past reversals is preserved in the magnetization of ancient clay pots and stones; the process is cyclic, and some argue the Earth is due for a change.
- Long term habitat threats
- In 1.4 million years Gliese 710 will be only 1.1 light-years from Earth, and might catastrophically perturb the Oort cloud.
- In about 3 billion years, our Milky Way galaxy is expected to pass through the Andromeda galaxy, which may or may not result in a collision
- In 5 billion years the Sun's stellar evolution will reach the red giant stage, in which it will expand to engulf the Earth. Before this date, its radiated spectrum may alter in ways Earth-bound humans could not survive.
- In the far future the main risks to human survival could be heat death and cooling with the expansion of the universe.
- Evolution of humanity into a posthuman life-form or existence by means of technology, leaving no trace of original humans
- Commentators such as Hans Moravec argue that humanity will eventually be supplanted and replaced by artificial intelligence or other forms of artificial life; while others have argued that humanity will inevitably experience a technological singularity, and furthermore that this outcome is desirable (see singularitarianism).
- Transhumanist genetic engineering could lead to a species unable to inter-procreate, accidentally resulting in actual (rather than pseudo-) extinction.
- Humans will continue to evolve via traditional natural selection over a period of millions of years, and Homo sapiens will gradually transition into one or more new species.
- Extinction in a whimper
- Preference for fewer children; if developed world demographics are extrapolated they mathematically lead to 'soft' extinction before 3000 AD. (John Leslie estimates that if the reproduction rate drops to the German level the extinction date will be 2400).
- Political intervention in reproduction has failed to raise the birth rate above the replacement level in the rich world, but has dramatically succeeded in lowering it below the replacement level in China (see One child policy). A World government with a eugenic or small population policy could send humanity into 'voluntary' extinction.
- Infertility: Caused by hormonal disruption from the chemical/pharmaceutical industries, or biological changes, such as the (controversial) findings of falling sperm cell count in human males.
- A disruption, chemical, biological, or otherwise, in humans' ability to reproduce properly or at all
- Voluntary extinction
- Scientific accidents
- In his book Our Final Hour, Sir Martin Rees claims that without the appropriate regulation, scientific advancement increases the risk of human extinction as a result of the effects or use of new technology. Some examples are provided below.
- Uncontrolled nanotechnology (grey goo) incidents resulting in the destruction of the Earth's ecosystem (ecophagy).
- Creation of a naked singularity (such as a "micro black hole") on Earth during the course of a scientific experiment, or other foreseeable scientific accidents in high-energy physics research, such as vacuum phase transition or strangelet incidents.
- Biotech disaster (E.g. the warnings of Jeremy Rifkin)
- Scenarios of extra-terrestrial origin
- Major impact events.
- Gamma-ray burst in our part of the Milky Way. (Bursts observable in other galaxies are calculated to act as a "sterilizer", and have been used by some astronomers to explain the Fermi paradox.) The lack of interruptions in the fossil record, and the relative distance of the nearest hypernova candidate, make this a long-term (rather than imminent) threat.
- A black hole may suck the Earth in.
- Invasion by militarily superior aliens (see alien invasion) — although often considered a scenario purely from the realms of science fiction, professional SETI researchers have given serious consideration to this possibility, but conclude that it is unlikely.
- Gerard O'Neill has cautioned that first contact with alien intelligence may follow the precedent set by historical examples of contact between human civilizations, where the less technologically-advanced civilization has inevitably succumbed to the other civilization, regardless of its intentions.
- Solar flares may suddenly heat the Earth, or the light from the Sun may be blocked by dust, slowly freezing it (e.g. the dust and vapour may come from a Kuiper belt disturbance).
- Scenarios of extra-universal origin
- For example, if the space of our universe, the Big Bang, and all its consequences were events taking place within a computing or other device on another cosmological plane, then the end of that process would make everything within the universe summarily vanish.
- Philosophical scenarios
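The 'soft extinction' demographic extrapolation listed above can be sketched with a toy constant-fertility model. All parameters here (fertility 1.4, replacement level 2.1, 30-year generations, a 2006 population of 6.5 billion) are illustrative assumptions; the model ignores age structure and mortality, so its date differs from the estimates quoted in that scenario:

```python
# Toy model (assumed parameters, not Leslie's actual calculation): sustained
# sub-replacement fertility shrinks each generation by roughly
# fertility_rate / replacement, where 2.1 is the replacement level.

def generations_to_extinction(population: float, fertility_rate: float,
                              replacement: float = 2.1) -> int:
    """Count generations until the population falls below 2 people."""
    generations = 0
    while population >= 2:
        population *= fertility_rate / replacement
        generations += 1
    return generations

gens = generations_to_extinction(6.5e9, 1.4)  # ~1.4 was Germany's rate c. 2006
print(gens)            # number of ~30-year generations until 'soft' extinction
print(2006 + 30 * gens)  # crude extinction year under these assumptions
```

The point of the sketch is only that any fertility rate held permanently below replacement leads to extinction in finite time; the specific date depends entirely on the assumed parameters.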
See also
- Disaster
- Extinction
- Extinction event
- Law of Limited Competition (If violated, Daniel Quinn predicts coextinction for humanity, in the book Ishmael.)
- List of doomsday scenarios
- Novelty Theory (Mathematically derived eschatology, with arbitrary extinction mechanism.)
Further reading
- Cawthorne, N. (2004). Doomsday. Arcturus Publishing Limited. ISBN 1-84193-238-8
- Leslie, J. (1996). The End of the World: The Science and Ethics of Human Extinction. Routledge. ISBN 0-415-18447-9
External links
Groups for and against
- "Living on a lifeboat" - Garrett Hardin's 1974 grim review of the "Spaceship Earth" metaphor, arguing that "No generation has viewed the problem of the survival of the human species as seriously as we have.", and that survival may require cutting back international aid and barring emigration.
- The ethics of peace: why sustainability must be the basis, a Nature & Society Forum paper by John Ward where the author considers amongst other things whether there is a real risk of human extinction.
Human extinction scenario listings
- Armageddon Online posts at least one doomsday-related news item on its main page every day.
- Forty-five extinction scenarios from exitmundi.nl with light-hearted pictures and pithy names.
- Doomsday.org guide to extinction scenarios, according to religious prophecy and resulting from scientific advances.
- "Twenty ways the world could end suddenly". From Discover Magazine, Oct 2000.
- Existential risks analysed by Nick Bostrom. (Published in the Journal of Evolution and Technology, March 2002.) His definition of Existential risk: "– One where an adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In his typology of existential risks only the first ("Bangs") represents true human extinction, but this is a rare serious attempt to make a risk assessment. On the probability of extinction through existential risk he says "My subjective opinion is that setting this probability lower than 25% would be misguided, and the best estimate may be considerably higher. But even if the probability were much smaller (say, ~1%) the subject matter would still merit very serious attention because of how much is at stake."
Other
- BBC Discussion of how cavemen avoided extinction. David Goldstein's molecular-biology work is credited with uncovering the population bottleneck, said to be "just before 100,000 years ago." Freezing temperatures or water shortages are given as possible reasons for the catastrophe presumed to have caused the bottleneck.
Notes
Von Neumann said it was "absolutely certain (1) that there would be a nuclear war; and (2) that everyone would die in it" (underline added to quote from: The Nature of the Physical Universe – 1979, John Wiley & Sons, ISBN 0-471-03190-9, in H. Putnam’s essay The place of facts in a world of values - page 113). This example illustrates why respectable scientists are very reluctant to go on record with extinction predictions: they can never be proven right. (The quotation is repeated by Leslie (1996) on page 26, on the subject of nuclear war annihilation, which he still considered a significant risk – in the mid 1990s.)
Although existential risks are less manageable by individuals than health risks, according to Ken Olum, Joshua Knobe, and Alexander Vilenkin the possibility of human extinction does have practical implications. For instance, if the “universal” Doomsday argument is accepted it changes the most likely source of disasters, and hence the most efficient means of preventing them. They write: "...you should be more concerned that a large number of asteroids have not yet been detected than about the particular orbit of each one. You should not worry especially about the chance that some specific nearby star will become a supernova, but more about the chance that supernovas are more deadly to nearby life then we believe." Source: “Practical application” page 39 of the Princeton University paper: Philosophical Implications of Inflationary Cosmology
The 2000 review Armageddon at the Millennial Dawn from The Journal of Religion and Film finds that "While end of the world threats perhaps are not avoidable, the cinematic formulation of millennial doom promotes the notion that the end can be averted through employing human ingenuity, scientific advance, and heroism." Since this review was conducted, there has been a Hollywood production which postulates a (far-future) outcome where humans are extinct (at least in the wild): A.I.
For research on this, see Psychological Science volume 15 (2004): Decisions From Experience and the Effect of Rare Events in Risky Choice. The under-perception of rare events mentioned above is actually the opposite of the phenomenon originally described by Kahneman in "prospect theory" (in the original experiments the likelihood of rare events was over-estimated). However, further analysis of the bias has shown that both forms occur: when judging from description, people tend to over-estimate the stated probability, so this effect taken alone would suggest that reading the extinction scenarios described here should make the reader over-estimate any probabilities given. However, the effect that is more relevant to common consideration of human extinction is the bias that occurs with estimates from experience, and these run in the opposite direction: when judging from personal experience, people who have never heard of or experienced their species becoming extinct would be expected to dramatically under-estimate its likelihood. Sociobiologist E. O. Wilson argued that: "The reason for this myopic fog, evolutionary biologists contend, is that it was actually advantageous during all but the last few millennia of the two million years of existence of the genus Homo... A premium was placed on close attention to the near future and early reproduction, and little else. Disasters of a magnitude that occur only once every few centuries were forgotten or transmuted into myth." (Is Humanity Suicidal? New York Times Magazine, May 30, 1993).
Abrupt.org 1996 editorial lists (and condemns) the arguments for humanity's tendency toward self-destruction. In this view, the history of humanity suggests that humans will be the cause of their own extinction. However, others have reached the opposite conclusion with the same data on violence, and hypothesize that as societies develop armies and weapons with greater destructive power, they tend to use them less often. It is claimed that this implies a more secure future, despite the development of WMD technology. As such this argument may constitute a form of deterrence theory. Counter-arguments against such views include the following: (1) all weapons ever designed have ultimately been used, and states with strong military forces tend to engage in military aggression; (2) although modern states have so far generally shown restraint in unleashing their most potent weapons, whatever rational control was guaranteed by government monopoly over such weapons becomes increasingly irrelevant in a world where individuals have access to the technology of mass destruction (as proposed in Our Final Hour, for example).
ReligiousTolerance.org says that Aum Supreme Truth is the only religion known to have planned Armageddon for non-believers. Their intention to unleash deadly viruses is covered in Our Final Hour, and by Aum watcher, Akihiko Misawa. The Gaia Liberation Front advocates (but is not known to have active plans for) total human genocide, see: GLF, A Modest Proposal. Leslie, 1996 says that Aum’s collection of nuclear physicists presented a doomsday threat from nuclear destruction as well, especially as the cult included a rocket scientist.
Leslie (1996) discusses the survivorship bias (which he calls an "observational selection" effect on page 139). He says that the a priori certainty of observing an "undisasterous past" could make it difficult to argue that we must be safe because nothing terrible has yet occurred. He quotes Holger Bech Nielsen's formulation: "We do not even know if there should exist some extremely dangerous decay of say the proton which caused eradication of the earth, because if it happens we would no longer be there to observe it and if it does not happen there is nothing to observe." (From: Random dynamics and relations between the number of fermion generations and the fine structure constants, Acta Physica Polonica B, May 1989).
For example, in the essay Why the future doesn't need us, computer scientist Bill Joy argued that human beings are likely to guarantee their own extinction through transhumanism. See: Wired archive, Why the future doesn't need us.
For the "West Germany" extrapolation see: Leslie, 1996 (The End of the World) in the "War, Pollution, and disease" chapter (page 74). In this section the author also mentions the success (in lowering the birth rate) of programs such as the sterilization-for-rupees programs in India, and surveys other infertility or falling-birth-rate extinction scenarios. He says that voluntary small-family behaviour may be counter-evolutionary, but that the meme for small, rich families appears to be spreading rapidly throughout the world. The world population is expected to start falling around 2150.
See estimate of contact's probability at galactic-guide. Former NASA consultant David Brin's lengthy rebuttal to SETI enthusiasts' optimism about alien intentions concludes: "The worst mistake of first contact, made throughout history by individuals on both sides of every new encounter, has been the unfortunate habit of making assumptions. It often proved fatal." (See full text at SETIleague.org.)