Mathieu Bouillon

Location: 44%
THE BOTTOM LINE:
• If we one day succeed in building human-level AGI, this may trigger an intelligence explosion, leaving us far behind.
• If a group of humans manage to control an intelligence explosion, they may be able to take over the world in a matter of years.
• If humans fail to control an intelligence explosion, the AI itself may take over the world even faster.
• Whereas a rapid intelligence explosion is likely to lead to a single world power, a slow one dragging on for years or decades may be more likely to lead to a multipolar scenario with a balance of power between a large number ...more
Life 3.0: Being Human in the Age of Artificial Intelligence