Anders Sandberg recently published his “Top 5” list of existential risks that humanity should be working to reduce:
- Nuclear war
- Bioengineered pandemic
- Superintelligence
- Nanotechnology
- Unknown unknowns
Notable exclusions include climate change and asteroid impacts, and I agree that neither belongs on a top-5 list.
My only critique is that nuclear war doesn't seem to qualify as an existential risk, since:
- All the nuclear weapons in the world aren't actually enough to kill everyone, whether directly or through radiation
- Even the worst plausible nuclear winter wouldn't wipe out humanity