Anders Sandberg recently published his “Top 5” list of existential risks that humanity should be working to reduce:
- Nuclear War
- Bioengineered pandemic
- Superintelligence
- Nanotechnology
- Unknown unknowns
Notable exclusions include climate change and asteroid impacts. I agree with excluding both from a top-5 list.
My only critique is that nuclear war doesn’t seem to qualify as an x-risk, since:
- All the nuclear weapons in the world aren’t actually enough to kill everyone — either directly or with radiation
- Even the worst possible nuclear winters wouldn’t wipe out humanity
29 Responses to “The 5 Biggest Risks To Human Existence”
June 2
Artem Lamnin: Nuclear war is an existential risk to Western civilization, not necessarily to humanity as a whole. I’m sure some random villagers in northern India won’t even notice that anything happened.
June 2
Michael Anissimov: See page 6 for the temperature drop from a 150 Tg smoke injection scenario: http://climate.envsci.rutgers.edu/pdf/RobockNW2006JD008235.pdf It is possible to imagine 2-3x more smoke than this, but not much more. Temperature drops in the southern hemisphere would be serious but survivable. Much of the northern hemisphere could be wiped out, though.
June 2
Maciamo Hay: The two big risks are superintelligence and nanotechnology (grey goo). I don’t see how a pandemic could wipe out humanity. Nuclear war is also very unlikely to result in total annihilation. If a third existential risk were to be added, it would be an alien invasion of Earth triggered by our accidentally sending out information about ourselves. But that possibility also seems more remote.
June 2
Seth Baum: My guess is that even the worst nuclear winter would not kill everyone, but I’d still attach some probability to it. I’d also give some probability to advanced civilization not coming back afterwards. I’d like to think it would, but it’s not at all a certainty. Meanwhile, nuclear war is also the one I think would be easiest to cross off the list by making sure it never happens: diplomacy, disarmament, etc. So in that sense I think it’s worth working on.
June 2
Artem Lamnin: Alien invasion should probably be on this list, however low the probability. An encounter with an alien race capable of crossing light years would likely not turn out well for us. I don’t know that you can quantify that as a probability, necessarily.
June 2
Artem Lamnin: Creation of biological weapons is universally proscribed by international law. Existing stocks of smallpox, Ebola, etc. are under heavy guard in the major countries known to have ever manufactured bioweapons, and they are not universally lethal anyway. There has never been a disease with a 100% mortality rate (except maybe certain cancers). A smallpox epidemic might inflict casualties on the order of 40% and cause significant economic damage, but it would likely have less of an impact than even a limited nuclear exchange.
June 2
Michael Anissimov: If there were a smoke injection of 500+ Tg, nuclear winter could become quite cold. No one has modeled this scenario yet.
June 2
Douglas Scheinberg: Rabies might not have a 100% mortality rate, but it’s damn close…
June 2
Artem Lamnin: Rabies has also never been weaponized, as far as I know, and it only spreads through infected saliva. Any serious biological weapon would be aerosolized, because while you can avoid getting bitten by an animal in a number of ways, it’s a bit harder to avoid infected air.
June 2
Maciamo Hay: A virus that is extremely lethal may not spread very far if it kills its victims too quickly. If the incubation period is long and death is slow (as with AIDS), that leaves us time to find a cure.
June 2
Artem Lamnin: Indeed, that is why viruses like Ebola have had only a very limited impact. They kill so quickly and are so easily identifiable that the epidemic burns itself out. Compare that with the impact of the Spanish flu, which had a much more ideal incubation/spread time.
June 2
Douglas Scheinberg: Ebola also isn’t an especially communicable virus; like HIV, it mostly spreads through contact with bodily fluids. Unlike HIV, the bodily fluid that transmits Ebola is usually blood, because one of the symptoms of Ebola is massive bleeding.
June 2
Jay Gerig: Dibs on Unknown Unknowns as a band name.
June 2
Robby Bensinger: My list would be something like:
1. Superintelligence
2. Nanotechnology
3. Pandemics
4. Unknown unknowns
5. Stable regressive global institutions
Nuclear war and climate change are in the same category — there are a few low-probability ways they could kill us all, but current evidence is that the probability is negligible.
Currently existing pandemics might not look very scary, but evolution isn’t optimizing for human extinction; we can probably do much better. Currently existing strains are well-contained, but technological advances that might make it possible to build a bacterium in your basement are less well-contained.
Asteroid impact is only a problem on very long timescales. By that point, we’ll probably either be dead or have off-Earth colonies. The Fermi paradox suggests aliens are super-rare; they might be an existential risk if our expansion bubble runs into a very distant expansion bubble (i.e., we’re as likely to be the ‘invaders’ as they are). I don’t think simulation shutdown is a big worry for the likeliest simulation scenarios; maybe it’s comparable to the risk from aliens, though.
I’m not sure why ‘stable regressive global institutions’ isn’t on the list. Anything that keeps us from colonizing the observable universe counts as an x-risk, so surely the category ‘humans decide as a species not to colonize the observable universe’ should be assigned a lot of probability. FHI sometimes talks about this under the name ‘totalitarianism’, but there are other social structures that might have a similar effect.
June 2
Maciamo Hay: Robby, I doubt that global institutions will arise before humans start colonising other planets (especially since the Mars Mission is only ten years away), let alone stable regressive global institutions. Judging from how much difficulty the EU already has accommodating fairly similar cultures, I wouldn’t foresee the rise of global institutions with actual power to restrain all of humanity for many centuries. In contrast, the potential risk that an ASI destroys all humans or that grey goo occurs by accident is only a few decades away.
June 2
Jonas Eicher: I don’t think a bioengineered pandemic would mean total extinction. The same arguments as for nuclear war apply. With 6.5 billion people, there would be resistances, unaffected areas, etc.
June 2
Robby Bensinger: Maciamo: I agree this isn’t a top-tier risk. I just think it’s orders of magnitude larger than the other risks. Really, it surprises me how much of the danger appears to be concentrated in only 1-3 scenarios. Possibly we should spend less time trying to find risks comparable to nano and AI, and instead start making finer-grained lists that consider more specific ‘superintelligent AI’ or ‘nanotech’ or ‘pandemic’ scenarios.
Jonas: Roughly speaking, nuclear weapons are a single well-understood technology, whereas ‘engineered pandemics’ is a very large class of mostly unknown technologies. I don’t think we know how difficult it is to synthesize an organism that can reproduce itself and spread by air at rates needed to infect all humans; and I don’t think we know how difficult it is to synthesize an organism that remains dormant for a week and then has a 100% fatality rate. Some poisons have a 100% fatality rate for humans, so if it’s possible to build organisms that function as miniature factories for the worst poisons (e.g., botulinum at much higher doses), it should be possible to build organisms that no human has an adequate resistance to.
June 3
Jonas Eicher: Any party creating such a bioweapon would also have good knowledge of how to counteract it, though. I am not saying there is a cure for every disease, but there are ways to survive it.
June 3
Jonas Eicher: Your post makes me think of little nanobots injecting poison into every being with blood flow, or weaponized mosquitoes. Autonomous weapons like that are probably covered under superintelligence. That field actually covers a lot. I think it should be split into different scenarios.
June 3
Michael Witbrock: Under the criteria that Jonas Eicher is using (including the uncertainty one), only Superintelligence, Nanotechnology, and Asteroid impact would stay on the list (and asteroid impact is a temporary risk given our present industrial capacity, not a long-term one). But I think those arguments are wrong, and that a bioengineered pandemic may actually be our worst near-term existential risk (likelihood of happening combined with proportion of bad outcomes).
June 3
Jonas Eicher: Well, I am still missing what I think is the most likely scenario: extinction through evolution. With advances in biotechnology, especially genetic engineering, the human species won’t be the same 100 years from now as it is today. There will probably be human-like species, but to them we won’t be more than a common ancestor.
June 3
Luke Cockerham: Why aren’t people worrying more about highly addictive virtual reality? It seems like the tech in 20+ years will be fairly convincing.
June 3
Jonas Eicher: There is already a movie about that. It has a happy ending. We are fine.
June 3
Matthew Scott: Only Superintelligence is guaranteed to wipe us out, or at least transform us into something superhuman. Hopefully a pandemic will not reach people isolated in the hills and forests, although we can prepare for one to wipe out 25-50% of us, probably in the near future. Nanotech that can replicate and kill us will probably come after superintelligence. The only way we could survive that is the bubble cities we all saw in the sci-fi books of old, and compartmentalized airtight or negative-pressure buildings, like we have in the Navy. There is also the reasonable early-warning system, with all citizens equipped with comms and protective gear, and/or shelters, air-tight vehicles, etc. This is all so obvious and entirely predictable, but no one does the obvious, like EMP-hardened transformers. A couple hundred million to save us trillions.
June 3
Matt Frank: Climate change is already in your top-5 as a likely provocation for nuclear war, and as a likely precondition for incubating a pandemic. When China’s drought leads it to dam the headwaters of the Brahmaputra, and India retaliates… Or when bad weather overwhelms the sewer system of some coastal city, and an otherwise minor new disease develops…
June 3
Michael Witbrock: Superintelligence is far from guaranteed to kill us. It is, however, guaranteed to be able to kill us. Perhaps that’s what Matthew Scott means.
June 3
Michael Witbrock: Natural pandemics are no threat, but it would be a foolish species that didn’t worry about desktop biosynthesis (kind of imminent) and about selecting sufficiently many attack mechanisms from genome libraries to lower the joint probability of resistance to all attacks below 10^-10. My view is that we need nano to have a plausible defence against this sort of engineered bio-nano-weapon.
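[Editor’s note: for readers wondering where a figure like 10^-10 comes from, if each attack mechanism can be resisted independently with some small probability, the joint probability of resisting all of them is just the product of the individual probabilities. A minimal sketch of that arithmetic, assuming a purely illustrative 10% per-mechanism resistance rate and full independence (neither number is given in the comment):]

```python
# Back-of-the-envelope arithmetic behind a joint-resistance target like 1e-10.
# The 10% per-mechanism resistance rate and the independence assumption are
# illustrative, not taken from the comment.

def joint_resistance(per_mechanism_resistance: float, n_mechanisms: int) -> float:
    """Probability that one individual resists all n independent attack mechanisms."""
    return per_mechanism_resistance ** n_mechanisms

# Ten independent mechanisms, each resisted with probability 0.1,
# push joint resistance down to about 1e-10.
print(joint_resistance(0.1, 10))  # ~1e-10
```

[In practice the mechanisms would not be fully independent, so more mechanisms would be needed in reality; the point is only that the product shrinks quickly as mechanisms are added.]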
June 4
Niel Bowerman: Speaking with Anders about this yesterday, I learned that he was talking about current x-risks, i.e. if we were wiped out tomorrow, what would the cause have been? This is a qualitatively different question from what is likely to cause existential risk this century, and so risks such as nuclear war come out higher (though I still agree that I wouldn’t put nuclear war as number one).
As a climate modeller, I wouldn’t put a lot of weight on the simulations of large dust clouds and nuclear winters. The models haven’t been tested against observations in that regime. They are going to be a lot better than pure guesswork, but I would assign substantial probability to their being qualitatively wrong in multiple ways in that sort of regime.
June 16
desbest: I can add two more to the list.
1. Antibiotic resistance (the biggest threat to public health)
2. Natural disasters making lead barrels of nuclear waste leak and harm people (this happened in Somalia)