Empire of AI: Dreams and Nightmares in Sam Altman's OpenAI
Read between August 16 - August 21, 2025
23%
The Retro Biosciences bet reflected Altman’s fixation on longevity. He was an avid follower of “young blood” research—a line of scientific inquiry that studied how to reverse aging with transfusions of healthier, younger blood.
23%
While at YC, Altman had also signed up with a $10,000 deposit to be on the wait list of a controversial startup called Nectome, which had been in one of the accelerator’s batches. Ripped straight out of science fiction, Nectome was pitching a service that would cryogenically freeze customers’ brains so that one day—potentially hundreds of years into the future—they could be uploaded to a computer, once scientists had cracked the technology to do so. The catch was that Nectome needed the person’s brain to be fresh for the preservation to work. To Antonio Regalado, cofounder Robert McIntyre called his product “100 ...more
23%
Altman liked to say that he had taken no equity in OpenAI to avoid corrupting the quest for safe AGI with his own desire for profit. He made only a yearly salary of $65,000 and accumulated his wealth through other ventures. The sentiment had a nice ring to it—and echoed his original rhetoric around why OpenAI started as a nonprofit. It was also a statement, like the nonprofit status of the organization, that by 2021 no longer reflected the full truth. Altman had a significant stake in YC, and YC, through its $10 million investment in OpenAI, could receive up to a $1 billion return. As OpenAI ...more
24%
Originally called Samasource, it was a San Francisco–based social enterprise that had begun in 2008 with a mission of providing meaningful, dignified work to people in impoverished countries to lift them out of poverty. Under its founder, Leila Janah, it had established operations in India and Kenya and developed a reputation as an ethical outsourcing company. In 2018, it transitioned to a for-profit in order to scale its operations, shortening its name in the process. In 2020, it received a B Corp certification. In
24%
Behind the scenes, however, Sama was in disarray. In January 2020, Janah had passed away from a rare cancer at just thirty-seven; her death, workers say, combined with the pandemic soon after, seemed to mark the beginning of more organizational mismanagement, a characterization that a Sama spokesperson denied.
24%
Nearly two hundred workers would file multiple lawsuits against Sama and Meta alleging traumatic working conditions and unlawful terminations for attempting
24%
Against this backdrop, OpenAI began the first phase of its project in late 2021. Under the code names PBJ1, PBJ2, PBJ3, and PBJ4, it thrust teams of Sama workers into more traumatic content-moderation work, for between $1.46 and $3.74 an hour on average.
24%
In 2019, Mary L. Gray and Siddharth Suri published their book Ghost Work, based on five years of extensive fieldwork, revealing a hidden web of piecemeal labor and digital exploitation that propped up Silicon Valley. Tech giants and unicorns were building their extravagant valuations not just with engineers paid six-figure salaries in trendy offices. Essential, too, were workers, often in the Global South, being paid pennies to carefully annotate reams of data.
24%
MTurk, as it was called, was a generalist platform, meaning it didn’t cater to any particular kind of work. It was just a self-service website. Its interface—stuck in the web design of the mid-aughts, when it launched—had a place to upload datasets, to specify simple annotation instructions, and to set a price for the work. Once a task was claimed, the interface showed randomized strings of numbers and letters in place of the workers’ names. It had two buttons next to each worker: one to give them a bonus, the other to boot them off the project. Data annotation for self-driving cars necessitated a ...more
24%
Venezuela was nose-diving into the world’s worst peacetime economic crisis in fifty years. Economists say it was a toxic cocktail of political corruption and the government’s misguided policies that squandered the country’s rich natural endowment. Venezuela sits atop the largest proven petroleum reserves in the world. It was once Latin America’s wealthiest country. But beginning in 2016, hyperinflation spiraled; unemployment skyrocketed; violent crime exploded as families across the country watched the value of their entire life savings collapse. From late 2017 to 2019, ...more
24%
That “freak coincidence” revealed a disturbing formula. When faced with economic collapse, Venezuela suddenly checked off the perfect mix of conditions for finding an inexhaustible supply of cheap labor: Its population had a high level of education, good internet connectivity, and, now, a zealous desire to work for whatever wages were available. It was not the only country that fit that description. More populations were getting wired to better internet. And with accelerating climate change and growing geopolitical instability, it was hard to bet against more populations plunging into crisis. “It’s ...more
25%
Each time she completed a task, the sum of money she earned, displayed in US dollars, would increase by a few pennies. She needed a minimum of ten dollars to withdraw it, which, when she first joined the platform, wasn’t a problem. Now, it could take weeks to accumulate that much money.
25%
For workers actually living in Venezuela, the process of withdrawal was even more challenging. Most global payment systems, such as PayPal, didn’t allow money transfers into Venezuela. Most stores and shops in Venezuela didn’t accept payments from the ones that did. This meant workers needed to convert their digital funds into cash to pay for basic goods and services. But whereas the money arrived online in US dollars, the cash needed to be in Venezuelan bolívares. The black market to convert one to the other abounded with scams and high commissions.
25%
There were other rules. Submitting a task quickly was rewarded, but submitting a task too quickly triggered something in the system that meant a worker wouldn’t get paid for it. The prevailing theory was that the platform associated exceptional speed with bot activity, which meant it discarded the answers. Sometimes the tasks that appeared had so few instructions that they were impossible to decipher; other times the platform had bugs that didn’t load the tasks correctly. The Venezuelans in the group who had once been software engineers created browser extensions to deal with these issues and ...more
25%
She wanted Appen to be a traditional employer, to give her a full-time contract, a manager she could talk to, a consistent salary, and health care benefits. All she and other workers wanted was security, she told me, and for the company they worked so hard for to know that they existed.
25%
the following in what constitutes acceptable conditions: Workers should be paid living wages; they should be given regular, standardized shifts and paid sick leave; they should have contracts that make clear the terms of their engagement; and they should have ways of communicating their concerns to management and be able to unionize without fear of retaliation. Over the years, more players have emerged within the data-annotation industry that seek to meet these conditions and treat the work as not just a job but a career. But few have lasted in the price competition against the companies that ...more
25%
But after the company launched its worker-facing platform, Remotasks, and noticed the overwhelming interest from Venezuela, Venezuelans became one of Scale’s top recruiting priorities. “They’re the cheapest in the market,” the former employee said.
25%
Once Scale held the dominant position, its promises to workers faded. Through late 2021 and early 2022, I partnered with a Venezuelan journalist in Caracas, Andrea Paola Hernández, who interviewed Venezuelans who had worked for Scale during the Remotasks Plus program. We also embedded ourselves within the Remotasks Discord community, which Scale used to communicate and coordinate with its global workforce. Through a spreadsheet the company had left public, we found that the workers’ earnings began to decline within weeks of the program’s launch; workers who started with earnings of forty dollars a ...more
25%
After two hours of completing a tutorial and twenty tasks, Hernández earned eleven US cents. Matt Park, then the senior vice president of operations at Scale, told us in response to the findings that Venezuelans on the platform earned an average of a little more than ninety cents an hour. “Remotasks is committed to paying fair wages in every region we operate,” he said.
25%
Scale was indeed bringing in new users. By mid-2021, as Venezuelans burned out and left the platform, Scale was scouting and onboarding tens of thousands more workers from other economies that had collapsed during the pandemic. To support its expanding and diversifying client needs, it entered countries with large populations that were facing financial duress and could also speak the most economically valuable languages: English, French, Italian, German, Chinese, Japanese, Spanish. It sought French speakers from former French colonies in Africa, an employee who worked on international expansion ...more
26%
One group of eight workers in North Africa said Scale reduced their pay by more than a third in a matter of months. At least one worker was left with negative pending payments, suggesting that he owed Scale money. When the group attempted to organize against the changes, the company threatened to ban anyone engaging in “revolutions and protests.” Nearly all who spoke to me were booted off the platform. The Scale spokesperson said the company does not suspend workers for concerns about pay, only violations of Community Guidelines. Scale’s payment systems, chronically underinvested in by its US ...more
26%
As part of company benefits, Sama provided free psychological counseling, but many found the services inadequate. Sessions were often held in groups, making it difficult for individuals to share their private thoughts, and the psychologists seemed unaware of the nature of the workers’ jobs. Many workers were also scared to show up and admit they were struggling. To struggle meant that they weren’t doing their best work and could be replaced by someone else.
26%
The Sama spokesperson said that it was instead the company that terminated the OpenAI contract, which she noted had always been a pilot, because OpenAI began sending images for annotation that “veered outside of the agreed upon scope.” The company never received the full $230,000 payment from OpenAI.
26%
Labor rights scholars and advocates say that that exploitation begins with the AI companies at the top. They take advantage of the outsourcing model in part precisely to keep their dirtiest work out of their own sight and out of sight of customers, and to distance themselves from responsibility while incentivizing the middlemen to outbid one another for contracts by skimping on paying livable wages. Mercy Mutemi, a lawyer who represented Okinyi and his fellow workers in a fight to pass better digital labor protections in Kenya, told me the result is that workers are squeezed twice—once each to ...more
26%
CloudFactory’s Mark Sears, who told me his company doesn’t accept these kinds of projects, said that in all his years of running a data-annotation firm, content-moderation work for generative AI was by far the most morally troubling. “It’s just so unbelievably ugly,” he said.
27%
“Perhaps this is over-cautious,” an OpenAI employee had commented on this line, “but do we have concerns about plagiarism here?” “Ah, reworded to make sure they attribute sources,” another had responded. “Maybe I’ll add an explicit field for that too!” “Cool! One of the things I was thinking about here was preserving future optionality,” the first had written. “(if in future we want to be able to use data we hired contractors to create, it could be really helpful to have a way to easily weed out anything that could be seen as stolen).”
27%
But in April 2023, John Schulman, one of the scientists on OpenAI’s founding team, would remind the audience during a talk at UC Berkeley that the issue of hallucinations was rooted in the nature of neural networks. Unlike the deterministic information databases of symbolic systems, neural networks would always traffic in fuzzy probabilities. Even with RLHF, which helped to strengthen the probabilities within a deep learning model that correlate with accuracy, there was fundamentally a limit to how far the technique could go. “The model obviously has to guess sometimes when it’s outputting a lot ...more
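A toy illustration of what trafficking in “fuzzy probabilities” means in practice; this sketch is mine, not Schulman’s, and the tokens and scores are invented:

import numpy as np

# Hypothetical next-token scores after a factual prompt. A language model has
# no lookup table of facts, only learned scores over candidate continuations.
tokens = ["Canberra", "Sydney", "Melbourne"]
logits = np.array([2.0, 1.6, 0.4])

# Softmax turns scores into probabilities; sampling from them can pick a
# plausible-but-wrong answer, which is one way hallucinations arise.
probs = np.exp(logits) / np.exp(logits).sum()
choice = np.random.choice(tokens, p=probs)
print(choice, dict(zip(tokens, probs.round(2))))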
27%
To Scale AI, Kenya had one advantage that Venezuela did not. The workers speak English, like the chatbots that need them. As self-driving car work largely disappeared from the platform, so did Venezuelans. “They wouldn’t use Venezuelans for generative AI work,” says a former Scale employee. “That country is relegated to image annotation at best.” Scale would soon ban Venezuela from its platform completely, citing “changing customer requirements.”
28%
Less than a year later, she would learn the truth. In March 2024, Scale would block Kenya wholesale as a country from Remotasks, just as it had with Venezuela. For Scale, it was part of its housecleaning—a regular reevaluation of whether workers from different countries were really serving the business. Kenya, they decided, along with several other countries including Nigeria and Pakistan, simply had too many workers attempting to scam the platform to earn more money.
28%
By then, Scale was moving on to a new focus, following the demands of the AI industry. OpenAI and its competitors were increasingly searching for highly educated workers to perform RLHF—doctors, coders, physicists, people with PhDs. So went the profit-chasing progression of chatbot development. Those willing to pay money for chatbots were not casual consumers but businesses that expected the tools to perform complex tasks in areas such as science and software development. Kenya did not fulfill the new labor demand. Scale was now recruiting a fresh workforce primarily in the US with a new worker-facing ...more
28%
Scale’s decision would send Winnie and her family spiraling. By then Millicent had lost her job and Remotasks had been the only thing keeping them afloat. Now they were struggling to feed their kids. Winnie was terrified they would soon be evicted.
29%
As the team started on DALL-E 2, a new method for generating images was gaining traction. Known as diffusion, it was a technique inspired by physics that made it possible for Transformers to better learn the correlations between pixels in a vast swath of images.
29%
Using diffusion created much sharper and more photorealistic images; the method also significantly reduced the amount of compute needed to achieve the same performance as DALL-E 1.
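For readers who want the mechanics, here is a minimal sketch of the standard DDPM-style formulation of diffusion; it is an illustration under my own assumptions (schedule, image shape, and names), not OpenAI’s code. The model is trained to predict the noise added to an image, and generating an image means running that denoising in reverse from pure noise:

import numpy as np

T = 1000                                # number of noising steps
betas = np.linspace(1e-4, 0.02, T)      # noise schedule
alpha_bars = np.cumprod(1.0 - betas)    # cumulative fraction of signal kept

def noise_image(x0, t):
    """Forward process: produce the noised version x_t of a clean image x0."""
    eps = np.random.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps

# Training pairs: a neural network (omitted here) sees (xt, t) and learns to
# predict eps -- in effect learning the correlations between pixels in real
# images, so it can turn pure noise back into a plausible image step by step.
x0 = np.zeros((64, 64, 3))              # stand-in for a training image
xt, eps = noise_image(x0, t=500)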
29%
OpenAI wouldn’t adopt latent diffusion until much later, leaving DALL-E 2 and 3 much more computationally expensive than Stable Diffusion or Midjourney, which many users deemed the higher-quality products. It was just one example of how, even within the narrow realm of generative AI, scale was not the only, or even the highest-performing, path to more expanded AI capabilities.
29%
Several employees made a significant effort to check for and cull any CSAM. But after some discussion, the employees left in other types of sexual images, in part because they felt such content was part of the human experience. Keeping such photos in the training data, however, meant the model would still be able to produce synthetic CSAM. In the same way DALL-E could generate an avocado armchair having only ever seen avocados and armchairs, DALL-E 2 and DALL-E 3 could combine children and porn to produce child pornography, a capability known as “compositional generation.”
29%
Later, during the development of DALL-E 3, when the data imperative had grown even larger, the research team decided that sexual images were no longer just a “nice to have” but a “need to have.” The share of pornographic images on the internet was so large that removing them shrank the training dataset enough to notably degrade the model’s performance.
29%
In December 2023, an alarmed AI engineer at Microsoft, Shane Jones, would discover the downstream consequences of those decisions. As he played around with Copilot Designer, Microsoft’s image generator built on DALL-E 3, he was horrified by how quickly it spit out offensive and sexualized images with little prompting. Just adding the term “pro-choice” into the prompt, Jones found, produced scenes of a demon eating an infant and what appeared to be a drill labeled “pro choice” being used to mutilate a baby. Just prompting the tool for a “car accident” and nothing else produced sexualized women ...more
31%
Willner’s team pressed executives for the resources to properly build out the tooling they needed to make the plan work. OpenAI didn’t have most of these data points at its disposal; its monitoring platform was logging only basic data on how much traffic each app was sending through the company’s servers. Sometimes it didn’t even know the name of the developer or the purpose of the application.
31%
Several board members felt strongly that OpenAI’s board was meant to be different. “The board is a nonprofit board that was set up explicitly for the purpose of making sure that the company’s public good mission was primary, was coming first over profits, investor interests, and other things,” Helen Toner would tell The TED AI Show podcast. “Not just like, you know, helping the CEO to raise more money.”
31%
Some employees also felt exactly the opposite. While there was a clear qualitative change in what GPT-3 could do over GPT-2, GPT-4 was just bigger, says one of the researchers who worked on the model. “The big result was that there were a bunch of exams that the model does well. But even that is highly questionable.” OpenAI never did a comprehensive review of GPT-4’s training data to check whether those exams—and their answers—were just in the data and being regurgitated, or whether GPT-4 had in fact developed a novel capability to pass them. It was the kind of shaky science that had become ...more
32%
How could the company have failed to predict user behavior and ChatGPT’s popularity so badly? What did that say about the company’s ability to calibrate and forecast the future impacts of its technologies?
33%
In November 2022, as users latched on to ChatGPT as if it were a search tool, spawning widespread speculation that it could unseat Google, an internal document noted that OpenAI’s model had hallucinated during an internal test on roughly 30 percent of so-called closed-domain questions. Closed-domain questions are meant to be the easiest category: those in which users ask the model only about information they give it—for example, uploading a PDF and asking for a summary, or providing bullet points and asking for a rewrite into complete sentences. This is in contrast to open-domain ...more
33%
After ChatGPT went viral, SemiAnalysis, a trade newsletter focused on the semiconductor industry, estimated that the company was spending some $700,000 a day on compute costs alone.
33%
Handing off the technology had its own challenges. As OpenAI’s release schedule picked up, so did Microsoft’s. But Microsoft completely dwarfed OpenAI, leading to a dynamic where a single OpenAI employee could get pinged by dozens of Microsoft counterparts across various departments with all sorts of questions about technical or logistical details with every new product. It was growing increasingly frustrating and overwhelming for OpenAI staff to support Microsoft releases while focusing on their own road map.
33%
Today nearly 60 percent of Chile’s exports are minerals, found primarily in the Atacama Desert: chiefly copper, a highly conductive metal used in all kinds of electronics, and, more recently, lithium, the essential ingredient in lithium-ion batteries.
33%
Under Pinochet’s rule, Chile privatized nearly everything—education, health care, the pension system, even water. The strategy produced economic growth; it also fueled stunning inequality. Chile is among the most unequal countries in the world today, with nearly a quarter of the country’s income concentrated among a few powerful families in the 1 percent. Having never meaningfully industrialized, it also remains tethered to the extraction economy that makes it relevant to higher geopolitical powers. And so, as the AI boom arrived, Chile would become ground zero for a new scale of extractivism, ...more
34%
The four largest hyperscalers—Google, Microsoft, Amazon, Meta—now spend more money building data centers each year than almost all other developers (relatively unknown companies like Equinix and Digital Realty) combined.
34%
A rack of GPUs consumes three times more power than a rack of other computer chips. And it’s not just the training of the generative AI models that is costly, it is also serving them: According to the International Energy Agency, each ChatGPT query is estimated to need on average about ten times more electricity than a typical search on Google. Until recently, the largest data centers were designed to be around 150-megawatt facilities, meaning they could consume as much energy annually as close to 122,000 American households. Developers and utility companies are now preparing for AI ...more
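As a rough check on that comparison (assuming the EIA’s average US household electricity use of roughly 10,800 kWh per year, a figure that is mine, not the book’s): 150 MW × 8,760 hours/year ≈ 1.31 billion kWh/year, and 1.31 billion kWh ÷ 10,800 kWh per household ≈ 122,000 households.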
34%
There are indeed many AI technologies, as cataloged by the initiative turned nonprofit Climate Change AI, that can accelerate sustainability, but rarely are they generative AI technologies. “What you need for climate are supervised learning models or anomaly detection models or even statistical time series models,”
34%
Luccioni says her past collaborators within closed-off companies no longer receive approval from their employers to cowrite papers with her about AI’s environmental impact.