Kindle Notes & Highlights
by Toby Ord
Read between January 30 - March 20, 2022
SECURITY AMONG THE STARS?
RISKS WITHOUT PRECEDENT
As Carl Sagan memorably put it: ‘Theories that involve the end of the world are not amenable to experimental verification—or at least, not more than once.’26
INTERNATIONAL COORDINATION
TECHNOLOGICAL PROGRESS
STATE RISKS & TRANSITION RISKS
RESEARCH ON EXISTENTIAL RISK
And alongside these many strands of research on concrete topics, we also need research on more abstract matters. We need to better understand longtermism, humanity’s potential and existential risk: to refine the ideas, developing the strongest versions of each; to understand what ethical foundations they depend upon, and what ethical commitments they imply; and to better understand the major strategic questions facing humanity.
WHAT NOT TO DO
WHAT YOU CAN DO
8 OUR POTENTIAL
It is possible to believe that all the past is but the beginning of a beginning, and that all that is and has been is but the twilight of the dawn. It is possible to believe that all that the human mind has ever accomplished is but the dream before the awakening. — H. G. Wells
DURATION
FIGURE 8.1 A timeline showing the scale of the past and future. The top row shows the prior century (on the first image) and the coming century (on the second), with our own moment in the middle. Then each successive row zooms out, showing 100 times the duration, until we can see the whole history of the universe.
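A quick check of the zoom arithmetic implied by this caption (my calculation, not the book's): each row multiplies the visible span by 100, so starting from roughly a century the rows cover

\[
10^{2}\,\text{yr} \;\to\; 10^{4}\,\text{yr} \;\to\; 10^{6}\,\text{yr} \;\to\; 10^{8}\,\text{yr} \;\to\; 10^{10}\,\text{yr},
\]

and since the universe is about \(1.4 \times 10^{10}\) years old, five rows are enough to take in its whole history.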
SCALE
QUALITY
I love humanity, not because we are Homo sapiens, but because of our capacity to flourish, and to bring greater flourishing to the world around us. And in this most important respect, our descendants, however different, could reach heights that would be forever beyond our present grasp.
CHOICES
ACKNOWLEDGEMENTS
APPENDICES
A. Discounting the Longterm Future
B. Population Ethics and Existential Risk
C. Nuclear Weapons Accidents
D. Surprising Effects when Combining Risks
E. The Value of Protecting Humanity
F. Policy and Research Recommendations
G. Extending the Kardashev Scale
APPENDIX A DISCOUNTING THE LONGTERM FUTURE
APPENDIX B POPULATION ETHICS AND EXISTENTIAL RISK
APPENDIX C NUCLEAR WEAPONS ACCIDENTS
The Accidental Launch Order
APPENDIX D SURPRISING EFFECTS WHEN COMBINING RISKS
FIGURE D.1 Independent 10% and 90% risks give a total risk of 91%. Removing the 10% risk would lower the total risk (the total shaded area) by just a single percentage point to 90%, while removing the 90% risk would lower it by 81 percentage points to 10%.
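The caption's numbers follow from the standard rule for combining independent risks, total \(= 1 - \prod_i (1 - p_i)\) (a quick verification of the arithmetic, not reproduced from the book):

\[
1 - (1 - 0.10)(1 - 0.90) \;=\; 1 - 0.09 \;=\; 0.91,
\]
\[
1 - (1 - 0.90) \;=\; 0.90 \quad \text{(removing the 10\% risk: down 1 point)},
\]
\[
1 - (1 - 0.10) \;=\; 0.10 \quad \text{(removing the 90\% risk: down 81 points)}.
\]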
APPENDIX E THE VALUE OF PROTECTING HUMANITY
Just how valuable is it to protect humanity? While we can’t answer with precision, there is a way of approaching this question that I’ve found helpful for my thinking.
APPENDIX F POLICY AND RESEARCH RECOMMENDATIONS
For ease of reference, I've gathered together my recommendations for policy and research on existential risk.
Asteroids & Comets
Supervolcanic Eruptions
Stellar Explosions
Nuclear Weapons
Climate
Environmental Damage
Engineered Pandemics
Unaligned Artificial Intelligence
General
APPENDIX G EXTENDING THE KARDASHEV SCALE
FURTHER READING