More Everything Forever Quotes

More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity by Adam Becker
1,223 ratings, 4.27 average rating, 236 reviews
“Michael Hendricks, a neurobiologist at McGill University, is even more blunt. "What's being done now in the commercial cryonics industry is garbage. They're making puddles of pink mush in a liquid nitrogen tank. It's nothing that could ever be used for anything," he tells me. While it might eventually be possible to chill living people to a temperature near freezing and keep them in a kind of hibernation state, Hendricks says that "when you get into people who have died, and then the cryonic people show up with their head saws—it's too late by the time you get in there with a saw. The tissue starts breaking down, and importantly, it's pretty fast."”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“This is a proposal for total capture of the national economy, making Altman functionally the king of the United States and possibly the world.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Transhumanism is the belief that we can and should use advanced technology to transform ourselves”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“If we want a future that puts people first, we need to recognize that there are no panaceas, and likely no utopias either. Nothing is coming to save us. There is no genie inside a computer that will grant us three wishes. Technology can't heal the world. We have to do that ourselves.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“This promise of a benevolent godhead, a superintelligent AI that foresees and solves all human problems, is the same goal that the singularitarians and the rationalists have: the reduction of all problems to judicious application of computer science. More broadly, it's the dream of technology as salvation from all threats. Technology doesn't solve social and political problems, any more than it causes them. The prospect of nuclear war was made possible through technology, but it's a live concern because of geopolitics. Humans could come together and choose to rid the world of nuclear weapons, just as we could come together to end global warming. Applying more intelligence and technology to these problems won't solve them; they're fundamentally political.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Ord ultimately concludes that human civilization has a good chance to survive even at double that temperature rise. "I looked at these models up to about 20 degrees of warming, and it still seems like there would be substantial habitable areas," he said. "But, it's something where it'd be very bad, just to be clear to the audience," Ord hastened to add.
     Climate science suggests that "very bad" is a gross understatement. "A temperature rise of 10 degrees [Celsius] would be a mass extinction event in the long term," says Luke Kemp, a researcher at the University of Cambridge and an expert on climate-induced civilizational collapse.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“It's rather breathtaking to see an Oxford ethics professor state that the danger over the next century from an ill-defined hypothetical technology is fifty times greater than the danger posed by global warming and nuclear weapons combined.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Harry Potter and the Methods of Rationality (HPMOR), a novel running to 650,000 words (substantially longer than War and Peace), is among the most widely read pieces of Harry Potter fan fiction on the internet. Yudkowsky started working on it in 2010, posting chapters online as he wrote them for the next five years.
     Yudkowsky's Harry Potter is a wizard in training and a child prodigy with a set of interests and goals suspiciously similar to those of Yudkowsky himself: eliminating death is at the top of his list. The book has chapter titles like "Machiavellian Intelligence Hypothesis," "Bayes's Theorem," and "Personhood Theory." Yudkowsky's Potter is supposed to be eleven, but he talks much more like the adult Yudkowsky. And like Yudkowsky, he wants to save the world—his way. "World domination is such an ugly phrase," Yudkowsky's Potter says at one point. "I prefer to call it world optimisation."”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“We have a logarithmic view of history because we can't possibly learn and retain everything that ever happened before we were born.
     Seen through this lens, Kurzweil's trends become more suspect. It seems likely that he's confusing a logarithmic view of history for an exponential trend in biological and technological development. His list of biological milestones gives this away: rather than picking particularly important events in the evolution of all life on Earth, he's mostly chosen milestones leading up to the evolution of humans, as if humans are the ultimate goal of evolution. (Evolution has no goal, as Kurzweil surely knows.) This kind of cherry-picking makes it easy to create the appearance of an exponential trend.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity