More Everything Forever Quotes

More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity by Adam Becker
1,754 ratings, 4.25 average rating, 338 reviews
More Everything Forever Quotes Showing 1-23 of 23
“If we want a future that puts people first, we need to recognize that there are no panaceas, and likely no utopias either. Nothing is coming to save us. There is no genie inside a computer that will grant us three wishes. Technology can't heal the world. We have to do that ourselves.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“This promise of a benevolent godhead, a superintelligent AI that foresees and solves all human problems, is the same goal that the singularitarians and the rationalists have: the reduction of all problems to judicious application of computer science. More broadly, it's the dream of technology as salvation from all threats. Technology doesn't solve social and political problems, any more than it causes them. The prospect of nuclear war was made possible through technology, but it's a live concern because of geopolitics. Humans could come together and choose to rid the world of nuclear weapons, just as we could come together to end global warming. Applying more intelligence and technology to these problems won't solve them; they're fundamentally political.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Michael Hendricks, a neurobiologist at McGill University, is even more blunt. "What's being done now in the commercial cryonics industry is garbage. They're making puddles of pink mush in a liquid nitrogen tank. It's nothing that could ever be used for anything," he tells me. While it might eventually be possible to chill living people to a temperature near freezing and keep them in a kind of hibernation state, Hendricks says that "when you get into people who have died, and then the cryonic people show up with their head saws—it's too late by the time you get in there with a saw. The tissue starts breaking down, and importantly, it's pretty fast."”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“harm. But an obsessively quantitative focus on ultimate outcomes is implicit in the evaluation of risks that Ord and MacAskill have both pushed, which ranks AI alignment above addressing global warming. Global warming will disproportionately affect poor people of color. But to the longtermists, any problem that promises an outcome short of full extinction isn’t as important as something that could wipe out the entire species, even if the former is real and present and the latter is purely hypothetical. If we disregard calamities that fall short of full extinction, wide swaths of human culture and diversity will be lost forever, eternally absent from the longtermists’ glorious future in space. Who gets to decide what makes the cut? The entire EA movement, and especially longtermism, has a very specific idea of what matters. By looking only at what they consider most important, they ignore the other needs and problems in the world, all while claiming they’re saving the species.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Ord claims his fears about insufficient guardrails on superintelligent AGI are shared by many experts in AI research. “It turns out that the average ML researcher thinks that there’s something like a 5 percent chance of” superintelligent AGI murdering everyone, he tells me. “So it’s not that I’m out on a limb or something, either.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Kathy Forth, a writer and data scientist in training, alleged that multiple members of the rationalist and EA communities had sexually abused her. “I could leave rationality, effective altruism and programming to escape the male-dominated environments that increase my sexual violence risk so much. The trouble is, I wouldn’t be myself,” she wrote. “What I need is to be alive and flourishing in my own skin, not just going through the motions, trapped in my body, with my mind on mute for the rest of my life. If I can’t even have myself, no one can.” After writing that note in 2018, Kathy Forth killed herself. She was thirty-seven years old.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“It’s rather breathtaking to see an Oxford ethics professor state that the danger over the next century from an ill-defined hypothetical technology is fifty times greater than the danger posed by global warming and nuclear weapons combined. But Ord isn’t even the only person matching that description who works in that building. In What We Owe the Future”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“It’s an echo of the tech-first libertarian attitude among the Extropians. There are also echoes of it in the views of a certain venture capitalist who backed MIRI”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“It will always be possible to get ChatGPT to produce hate speech at volume. Other ML systems will have algorithmic bias”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Indeed, when ChatGPT first hit the scene in late 2022, there was a great deal of talk about it as a replacement for internet search engines like Google. Yet a basic understanding of what LLMs actually are reveals that they are fundamentally unsuitable for internet search on their own. (Incorporating one into a search engine, as Google has done, isn’t a great idea either.) It’s true that LLMs have been fed enormous amounts of information from the internet, so the idea that they could replace a search engine seems natural at first. To build ChatGPT, OpenAI started out by doing the same thing that everyone else (Google, Anthropic) does when building an LLM: they obtained a snapshot of much of the text available on the internet at the time. The data used for training GPT-3 (the LLM that powered ChatGPT when it was first launched in late 2022) included all of Wikipedia, many websites sourced from Reddit links, an undisclosed number of books (likely numbering in the hundreds of thousands or more), and a great deal of the news, blogs, recipes, flame wars, and the rest of the mess that makes up the modern internet. But, crucially, that doesn’t mean that ChatGPT or any other LLM actually has all of that information inside itself. Instead, the software engineers training the LLM first break down the text into small chunks called tokens, usually around the size of a single word. Then they feed the tokenized text into the LLM, which analyzes the connections between the tokens. All the LLM knows about are tokens and the connections between them—and all it knows how to do is generate new strings of tokens in response to whatever input is given to it. So in one sense, ChatGPT and other LLMs are text-prediction generators: give ChatGPT text, in the form of a question or conversation, and it will try to respond in a manner similar to the text it was trained on—namely, the entire internet. 
“Think of ChatGPT as a blurry JPEG of all the text on the Web,” wrote the science fiction author Ted Chiang. “It retains much of the information on the Web, in the same way that a JPEG retains much of the information of a higher-resolution image, but, if you’re looking for an exact sequence of bits, you won’t find it; all you will ever get is an approximation.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
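The quote's description of LLMs as token-prediction systems can be illustrated with a toy sketch. This is my own illustration, not how GPT models actually work: real LLMs learn billions of weights over token sequences, whereas this uses raw bigram counts on a tiny made-up corpus. But it captures the core point that such a system only knows tokens and the statistical connections between them.

```python
from collections import Counter, defaultdict

# A tiny made-up "training corpus" (hypothetical, for illustration only).
corpus = "the cat sat on the mat . the cat ate ."
tokens = corpus.split()  # crude tokenization: one token per word

# Record the "connections between tokens": which token follows which,
# and how often. This table is all the model knows.
following = defaultdict(Counter)
for prev, nxt in zip(tokens, tokens[1:]):
    following[prev][nxt] += 1

def predict_next(token):
    """Generate the continuation seen most often in training."""
    return following[token].most_common(1)[0][0]

print(predict_next("the"))  # prints "cat": the most frequent follower of "the"
```

The predictor never stores the corpus itself, only co-occurrence statistics, which is why its output is an approximation of the training text rather than a lookup into it, much as Chiang's "blurry JPEG" analogy suggests.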
“Moreover, unlike the hypothetical paper clip–maximizing AI, human intelligence is not centered on the optimization of fixed goals; instead, a person’s goals are formed through complex integration of innate needs and the social and cultural environment that supports their intelligence.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“I have won this lottery, it’s a gigantic lottery, and it’s called Amazon.com. And I’m using my lottery winnings to push us a little further into space,” Jeff Bezos said in 2017. “We need to build reusable rockets, and that is what Blue Origin is dedicated to… taking my Amazon lottery winnings and dedicating [them] to [that].… It’s a passion, but it’s also important.”52 Don’t look at the horrifying labor conditions at the local Amazon fulfillment center. Look at the shiny rocket instead. Ignore the problems of this world. Everything will be better in space.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Musk has done precisely that, quite explicitly: he has said that longtermism “is a close match for my philosophy” and claims that he is simply taking the actions he must take to preserve humanity.50 “Elon’s concept that SpaceX is on this mission to go to Mars as fast as possible and save humanity permeates every part of the company,” says Tom Moline, a former SpaceX engineer. “The company justifies casting aside anything that could stand in the way of accomplishing that goal, including worker safety.” Moline was fired after making complaints about the workplace at SpaceX. A 2023 Reuters report uncovered over six hundred workplace injuries, including amputations, head wounds, and one death. Most were never reported to OSHA. According to Reuters, SpaceX’s “lax safety culture, more than a dozen current and former employees said, stems in part from Musk’s disdain for perceived bureaucracy and a belief inside SpaceX that it’s leading an urgent quest to create a refuge in space from a dying Earth.”51”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“If humanity’s energy usage continues to grow by a more modest 2.3 percent per year”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“This is a proposal for total capture of the national economy, making Altman functionally the king of the United States and possibly the world.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“future” that he is alluding to goes well beyond that. Transhumanism is the belief that we can and should use advanced technology to transform ourselves”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“Ord ultimately concludes that human civilization has a good chance to survive even at double that temperature rise. "I looked at these models up to about 20 degrees of warming, and it still seems like there would be substantial habitable areas," he said. "But, it's something where it'd be very bad, just to be clear to the audience," Ord hastened to add.
     Climate science suggests that "very bad" is a gross understatement. "A temperature rise of 10 degrees [Celsius] would be a mass extinction event in the long term," says Luke Kemp, a researcher at the University of Cambridge and an expert on climate-induced civilizational collapse.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
Harry Potter and the Methods of Rationality (HPMOR), a novel running to 650,000 words (substantially longer than War and Peace), is among the most widely read pieces of Harry Potter fan fiction on the internet. Yudkowsky started working on it in 2010, posting chapters online as he wrote them for the next five years.
     Yudkowsky's Harry Potter is a wizard in training and a child prodigy with a set of interests and goals suspiciously similar to those of Yudkowsky himself: eliminating death is at the top of his list. The book has chapter titles like "Machiavellian Intelligence Hypothesis," "Bayes's Theorem," and "Personhood Theory." Yudkowsky's Potter is supposed to be eleven, but he talks much more like the adult Yudkowsky. And like Yudkowsky, he wants to save the world—his way. "World domination is such an ugly phrase," Yudkowsky's Potter says at one point. "I prefer to call it world optimisation.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
“We have a logarithmic view of history because we can't possibly learn and retain everything that ever happened before we were born.
     Seen through this lens, Kurzweil's trends become more suspect. It seems likely that he's confusing a logarithmic view of history for an exponential trend in biological and technological development. His list of biological milestones gives this away: rather than picking particularly important events in the evolution of all life on Earth, he's mostly chosen milestones leading up to the evolution of humans, as if humans are the ultimate goal of evolution. (Evolution has no goal, as Kurzweil surely knows.) This kind of cherry-picking makes it easy to create the appearance of an exponential trend.”
Adam Becker, More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity
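The cherry-picking mechanism described in the quote above can be made concrete with a small sketch. The milestone dates here are hypothetical, not Kurzweil's actual list: the point is that if you select "milestones" evenly spaced on a logarithmic timescale, the raw gaps between them shrink by a constant factor, producing a perfect-looking exponential acceleration that reflects only how the events were chosen.

```python
# Hypothetical milestones spaced evenly in log10(years before present):
# 10^9 years ago, 10^8.5, 10^8, ... down to 10^4.5.
milestones_log = [9 - 0.5 * i for i in range(10)]
years_ago = [10 ** x for x in milestones_log]

# Time elapsed between consecutive milestones.
gaps = [a - b for a, b in zip(years_ago, years_ago[1:])]

# Each gap divided by the next: a constant ratio (10^0.5 ≈ 3.16),
# i.e., an exact "exponential trend" built in by the selection itself.
ratios = [g1 / g2 for g1, g2 in zip(gaps, gaps[1:])]
```

Any set of events sampled this way, whatever they are, will chart as accelerating progress, which is why milestone selection, not the underlying history, can drive the apparent exponential.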