If Anyone Builds It, Everyone Dies Quotes

If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All by Eliezer Yudkowsky
6,964 ratings, 3.92 average rating, 1,163 reviews
Showing 1-21 of 21
“If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“The most fundamental fact about current AIs is that they are grown”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“We make a mistake the first time”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“Sometimes the engineer learns better at the cost of only time and money. Sometimes the engineer kills only themselves or only consenting volunteers; and Science writes down what happened”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“The inner workings of batteries and rocket engines are well understood, governed by known physics recorded in careful textbooks. AIs, on the other hand, are grown, and no one understands their inner workings. There are fewer equations to constrain one's thinking... and so, many opportunities to think about high-minded ideals like truth-seeking instead.

If you know the history of science, this kind of talk is recognizable as the stage of folk theory, the stage where lots of different people are inventing lots of different theories that appeal to them personally, the way people talk before science has really gotten started on something. They're the words of an alchemist who's decided that some complicated philosophical scheme will let them transmute lead into gold.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“datacenters can kill more people than nuclear weapons.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“All over the Earth, it must become illegal for AI companies to charge ahead in developing artificial intelligence as they’ve been doing.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“If anyone anywhere builds superintelligence, everyone everywhere dies.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“Attempting to solve a problem like that, with the lives of everyone on Earth at stake, would be an insane and stupid gamble that NOBODY SHOULD BE ALLOWED TO TRY.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“Engineers failed at crafting AI, but eventually succeeded in growing it.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“They did all that because world leaders knew that, in the event of a nuclear war, both they and the people of their countries would have a bad day.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: The Case Against Superintelligent AI
“An artificial superintelligence is like a nuclear reactor”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“There are even deeper reasons to expect advanced AIs to behave like they have wants.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“MIRI’s research, and shifted the institute’s focus to conveying one single point, the warning at the core of this book: If any company or group, anywhere on the planet, builds an artificial superintelligence using anything remotely like current techniques, based on anything remotely like the present understanding of AI, then everyone, everywhere on Earth, will die.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“wanting is an effective strategy for doing.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“We ultimately predict AIs that will not hate us”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“If you launch a rocket and load the whole human species on board”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“The critical number governing a nuclear reactor is the neutron multiplication factor”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
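[Editor's note: the quote above names the neutron multiplication factor without defining it. The standard physics definition (not a passage from the book) is the ratio of neutrons in one fission generation to the previous one:

k = N_{i+1} / N_i, \quad k < 1 \text{ subcritical}, \quad k = 1 \text{ critical}, \quad k > 1 \text{ supercritical}

A reactor held at k = 1 runs steadily; past 1, the chain reaction grows on its own.]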
“AIs grown in this way do things that their growers did not intend. In 2023”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“The survivors of the blind cheerful optimists turn into cynical pessimistic veterans;”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All
“With this book, we hope to inspire individuals and countries to rise to the occasion.”
Eliezer Yudkowsky, If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All