The Precipice: Existential Risk and the Future of Humanity
Kindle Notes & Highlights
SECURITY AMONG THE STARS?
RISKS WITHOUT PRECEDENT
As Carl Sagan memorably put it: ‘Theories that involve the end of the world are not amenable to experimental verification—or at least, not more than once.’26
INTERNATIONAL COORDINATION
TECHNOLOGICAL PROGRESS
STATE RISKS & TRANSITION RISKS
RESEARCH ON EXISTENTIAL RISK
And alongside these many strands of research on concrete topics, we also need research on more abstract matters. We need to better understand longtermism, humanity’s potential and existential risk: to refine the ideas, developing the strongest versions of each; to understand what ethical foundations they depend upon, and what ethical commitments they imply; and to better understand the major strategic questions facing humanity.
WHAT NOT TO DO
WHAT YOU CAN DO
8 OUR POTENTIAL

It is possible to believe that all the past is but the beginning of a beginning, and that all that is and has been is but the twilight of the dawn. It is possible to believe that all that the human mind has ever accomplished is but the dream before the awakening. — H. G. Wells
DURATION
FIGURE 8.1 A timeline showing the scale of the past and future. The top row shows the prior century (on the first image) and the coming century (on the second), with our own moment in the middle. Then each successive row zooms out, showing 100 times the duration, until we can see the whole history of the universe.
SCALE
QUALITY
I love humanity, not because we are Homo sapiens, but because of our capacity to flourish, and to bring greater flourishing to the world around us. And in this most important respect, our descendants, however different, could reach heights that would be forever beyond our present grasp.
CHOICES
ACKNOWLEDGEMENTS
APPENDICES
A. Discounting the Longterm Future
B. Population Ethics and Existential Risk
C. Nuclear Weapons Accidents
D. Surprising Effects when Combining Risks
E. The Value of Protecting Humanity
F. Policy and Research Recommendations
G. Extending the Kardashev Scale
APPENDIX A DISCOUNTING THE LONGTERM FUTURE
APPENDIX B POPULATION ETHICS AND EXISTENTIAL RISK
APPENDIX C NUCLEAR WEAPONS ACCIDENTS
The Accidental Launch Order
APPENDIX D SURPRISING EFFECTS WHEN COMBINING RISKS
FIGURE D.1 Independent 10% and 90% risks give a total risk of 91%. Removing the 10% risk would lower the total risk (the total shaded area) by just a single percentage point to 90%, while removing the 90% risk would lower it by 81 percentage points to 10%.
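A quick check of the arithmetic behind this caption, assuming (as the caption does) that the two risks are independent, so the chances of avoiding each multiply; the worked equations below are a sketch of mine, not from the book:

\[
\begin{aligned}
P(\text{total}) &= 1 - (1 - 0.10)(1 - 0.90) = 1 - 0.09 = 0.91 \\
P(\text{without the 10\% risk}) &= 1 - (1 - 0.90) = 0.90 \\
P(\text{without the 90\% risk}) &= 1 - (1 - 0.10) = 0.10
\end{aligned}
\]

So eliminating the small risk buys only one percentage point of safety, while eliminating the large risk buys 81, matching the figure.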
APPENDIX E THE VALUE OF PROTECTING HUMANITY
Just how valuable is it to protect humanity? While we can’t answer with precision, there is a way of approaching this question that I’ve found helpful for my thinking.
APPENDIX F POLICY AND RESEARCH RECOMMENDATIONS

For ease of reference, I’ve gathered together my recommendations for policy and research on existential risk.

Asteroids & Comets
Supervolcanic Eruptions
Stellar Explosions
Nuclear Weapons
Climate
Environmental Damage
Engineered Pandemics
Unaligned Artificial Intelligence
General
APPENDIX G EXTENDING THE KARDASHEV SCALE
FURTHER READING