Kindle Notes & Highlights
by Parmy Olson
Read between October 15 - November 1, 2024
If AI was potentially dangerous, why couldn’t we just build it on virtual machines to separate it from other computer systems? Surely that would stop the AI from infiltrating our physical infrastructure and shutting down an electrical grid or manipulating financial markets. Yudkowsky immediately had an answer. “It wouldn’t really be virtual,” he replied, sipping his drink. Electrons could flow in all sorts of different directions, which meant there was always going to be a way for powerful AI systems to touch and change the configuration of hardware.
Some so-called AI accelerationists, for instance, believe that scientists have a moral imperative to work as quickly as possible to build AGI to create a posthuman paradise, a kind of rapture for nerds. If it was built in their lifetimes, they could live forever.
“The goal of Google search is to get you to click on links, ideally ads,” says Sridhar Ramaswamy, who ran Google’s ads and commerce business between 2013 and 2018. “All other text on the page is just filler.”
If you multiply tiny odds with an infinite cost, you still get a problem that is infinitely large.
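A minimal sketch of the expected-value reasoning behind this highlight, assuming the standard framing where a small probability p is multiplied by a cost C (the symbols p and C are illustrative, not from the book):

\[
\mathbb{E}[\text{harm}] = p \cdot C,
\qquad
\lim_{C \to \infty} p \cdot C = \infty \quad \text{for any fixed } p > 0.
\]

In words: no matter how tiny the odds, an unbounded cost keeps the expected harm unbounded, which is why the argument does not shrink away as the probability does.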
“We need to get there before the AI takes over,” he told engineers in 2023, according to his biographer Ashlee Vance. “We want to get there with a maniacal sense of urgency. Maniacal.” Musk believes that with brain implants, humans will be able to prevent a future artificial superintelligence from wiping us out, and so he wants Neuralink to perform surgeries on more than 22,000 people by 2030.