GPT-3, released one year after Strubell's paper, now topped them. OpenAI had trained GPT-3 for months on an entire supercomputer, tucked away in Iowa, performing its statistical pattern-matching calculations on a large dump of internet data, consuming 1,287 megawatt-hours of electricity and generating twice the emissions of Strubell's estimate for the development of the Evolved Transformer. But these energy and carbon costs wouldn't be known for nearly a year. OpenAI would initially give the public one number to convey the sheer size of the model: 175 billion parameters, over one hundred times the size of its predecessor, GPT-2.
Empire of AI: Dreams and Nightmares in Sam Altman's OpenAI