
Typology of human risks

Possible timeline and main adverse factor of the catastrophe:

• 100,000 BC — Natural risks
• 20th century — Risks caused by human activity
• Beginning of the 21st century — Risks based on known technologies
• 2030 — Risks based on new technologies
• 2050 — Superintelligence as the final technology
• 3000 — Remote and hypothetical risks

Giant energy release (mechanism: Energy)

Natural risks — Explosions:
• Big impact
• Gamma-ray burst

Risks caused by human activity — Nuclear weapons:
• Nuclear war
• Artificial nuclear winter caused by explosions in forests

Known technologies:
• Simple way of creating nuclear weapons
• Accident on a collider: micro black holes, false vacuum decay
• Runaway global warming (Venusian scenario)
• Disruption of the electric grid because of a solar flare

New technologies:
• Accidental start of a natural catastrophe
• Creation of a superbomb
• Artificial explosion of …
• New … (disturbance of the stability of the atmosphere)

Superintelligence — War with AI:
• Several AIs fight each other for world dominance
• Different types of AI Friendliness
• Nuclear war against an AI, or against attempts to create it

Remote and hypothetical — Artificial explosions in space:
• Thermonuclear detonation of large gas planets
• Catastrophic magma release due to crust penetration (Stevenson probe)
• Artificial black holes
• Artificial diverting of asteroids to Earth using precise calculations of celestial mechanics ("space billiards")

Failure of control system (mechanism: Intelligence)

Natural risks — Competitors:
• Another type of humans (as Homo sapiens were for Neanderthals)
• Another type of intelligent species

Risks caused by human activity — Disjunction and impairment of intelligence:
• Dangerous memes (fascism, fundamentalism)
• Cognitive biases
• Risky decisions (nuclear war)
• Ignorance and neglect of existential risks
• Tragedy of the commons (arms race)
• Accidental nuclear war

Known technologies:
• Computer virus with narrow AI uses home robots and bioprinters
• Dangerous meme (like ISIS)

New technologies:
• Super-addictive drug (brain stimulation)
• Malfunction in a robotic army causes mass destruction
• Human enhancement leads to a competing species
• GMO parasite that causes mass destructive behaviour

Superintelligence — Unfriendly AI destroys humanity:
• Humanity as a risk to the AI
• To get resources for its goals (paperclip maximizer)
• Realizes incorrectly formulated Friendliness (smile maximizer)
• Fatal error in a late-stage AI (AI halting)

Remote and hypothetical — Encounter with alien intelligence:
• Downloading alien AI via SETI
• ET arrive on Earth
• METI attracts dangerous ET
• Beings from other …
• We live in a simulation, and it is switched off or is built to model catastrophes

Replication

Natural risks — Organisms:
• Super pest
• Dangerous predator
• Dangerous carrier (mosquitoes)
• Antibiotic-resistant bacteria
• Cancer cell line adapted to live in a population (HeLa)

Risks caused by human activity — Genetic degradation:
• Reduction of the medium level of intelligence in the population
• Decline in fertility
• The accumulation of deleterious mutations

Known technologies — Biological weapons:
• Dangerous pandemic (e.g. smallpox)
• Mistake of biodefense (contaminated vaccination, SV40)
• Home bioprinters and biohackers
• Toxin-producing GMO organisms (dioxin, VX)

New technologies — Biotech:
• Multipandemic based on genetic engineering (10–100 lethal viruses)
• Artificial carriers, like flies
• Green goo (biosphere is eaten by artificial bioreplicators and viruses)
• Fast-replicating killers (poisonous locusts)
• Artificial pest
• Omnivorous bacteria and fungi

New technologies — Nanotech:
• Grey goo (unlimited replication)
• Nanotech weapons (small robots)
• Universal nanoprinter (creates all other weapons)

Superintelligence — Catastrophe on the level of mind and values:
• Humans are uploaded but become philosophical zombies without qualia
• Mass unemployment caused by automation leads to ennui and suicide
• Computer virus used for a mass attack on widely implanted brain-interface chips

Remote and hypothetical — Extraterrestrial robots:
• ET nanorobot replicators
• ET robot berserkers, killing civilization after a certain threshold

Global contamination (mechanism: Poisoning)

Natural risks — Atmosphere:
• Atmospheric composition change (disappearance of oxygen)
• Dangerous natural phenomena (volcanism, methane hydrates, dissolved gases in the ocean)

Risks caused by human activity — Ecology:
• The accumulation of new toxins in the biosphere, and its collapse
• Release of toxic gases

Known technologies:
• Deliberate chemical contamination
• Destruction of the electric grid by nuclear explosions in the stratosphere

New technologies — Doomsday machine:
• Deliberate destruction of all nuclear reactors by nuclear weapons
• Cobalt superbomb with global contamination
• Autocatalytic reaction, like Ice-9 or artificial prions
• Several Doomsday machines with contradictory conditions

Superintelligence — Whimpers:
• The value system of posthumanity moves away from ours
• The AI's goal is to maximize the suffering of as many humans as possible (worse than "I Have No Mouth, and I Must Scream")

Remote and hypothetical — Unknown physical laws:
• Weapons based on them
• Unexpected results of experiments
• Unknown unknowns
• Vacuum state transition (end of the false vacuum)

Combined risks

Classical species extinction:
• The confluence of a set of many adverse circumstances
• Changing environment and competitors

Chain of natural disasters:
• Epidemic followed by degradation and extinction
• Degeneration of the ability of the biosphere to regenerate, then loss of the technology, and starvation
• Death of crops from a superpest, then hunger and war

War as a trigger:
• Nuclear war leads to the use or release of biological weapons
• World war leads to an arms race and the creation of Doomsday machines
• Scientists studying deadly viruses accidentally release them

Control systems failure:
• As a result of a civilizational crisis, a geoengineering system of thermoregulation fails, and sharp global warming results
• Failure of a Bioshield or Nanoshield, where it malfunctions and threatens humanity or the biosphere

Catalysts of bad scenarios:
• The Cold War leads to the creation of Unfriendly AI
• An asteroid impact causes accidental nuclear war (Seth Baum)

Complexity crisis:
• Unpredictability, chaos, and black swans lead to catastrophe scenarios
• System crisis caused by many factors

Probability:
• Natural risks: from 1 in 1,000,000 to 1 in 100,000 a year, based on historic risks and the survival of other species
• Known technologies: 0.1–1 per cent a year
• New technologies: 10–30 per cent total
• Superintelligence: 50 per cent total
• Remote and hypothetical risks: small, as such risks are long-term, but more dangerous risks of new technologies could happen much earlier

The roadmap is based on my book "Risks of human extinction in the 21st century" (with Michael Anissimov). The most important risks are in bold; the most hypothetical are in italics. This roadmap is accompanied by another one about the ways of preventing existential risks. © Alexey Turchin, 2014