Kindle Notes & Highlights
by Parmy Olson
Read between July 21 - July 25, 2025
Baruch Spinoza was a seventeenth-century philosopher who proposed that God was effectively nature and everything that existed, rather than a separate being. It was a pantheistic view. “Spinoza thinks of nature as the embodiment of whatever God is,” Hassabis says. “So doing science is exploring that mystery.”
The basic premise of transhumanism is that the human race is currently second-rate. With the right scientific discoveries and technology, we might one day evolve beyond our physical and mental limits into a new, more intelligent species. We’ll be smarter and more creative, and we’ll live longer. We might even manage to meld our minds with computers and explore the galaxy.
Ng left Stanford for Google and then China’s Baidu.
The transformer model that powered Google Translate used something called an encoder and a decoder to process words.
Radford and Sutskever figured out that they could get rid of the first of those two components, the encoder, and instead just have one, the decoder, listen to you and talk back by itself. Early testing showed that the idea worked in practice, which meant they could build a more streamlined language model that was quicker and easier to troubleshoot and grow. Making it “decoder only” would also be a game changer: by combining a model’s ability to “understand” and to speak into one fluid process, it could ultimately generate more humanlike text.
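To make the architectural distinction concrete, here is a minimal sketch, not from the book and not OpenAI’s code, contrasting the two designs. It uses the open-source Hugging Face `transformers` library with small public models (`t5-small` as an encoder-decoder, `gpt2` as a decoder-only model) purely as illustrative stand-ins.

```python
# Toy illustration (not the book's or OpenAI's code) of the two designs,
# using small public models from the Hugging Face `transformers` library.
from transformers import pipeline

# Encoder-decoder, as in the translation-style transformer: one component
# reads ("encodes") the input, a second one writes ("decodes") the output.
translator = pipeline("translation_en_to_fr", model="t5-small")
print(translator("The weather is nice today."))

# Decoder-only, the GPT-style design: a single component reads the prompt
# and keeps predicting the next word, so it listens and talks back by itself.
generator = pipeline("text-generation", model="gpt2")
print(generator("The weather is nice today, so", max_new_tokens=20))
```

The second half is the simplification Radford and Sutskever pursued: one component handles both reading the prompt and producing the reply.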
The next step was to vastly increase the data, the computing power, and the capacity of their language model. Sutskever had long believed that “success was guaranteed” when you scaled everything up in AI, especially with language models.
He and his colleagues started working on a new language model they called a “generative pre-trained transformer,” or GPT for short. They trained it on a corpus of about seven thousand mostly self-published books found on the internet, many of them skewed toward romance and vampire fiction. Plenty of AI scientists had used this same dataset, known as BooksCorpus, and anyone could download it for free. Radford and his team believed they had all the right ingredients to ensure that this time, their model would also be able to infer context.
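The training objective behind a “generative pre-trained” model is next-token prediction: shift the text by one position and learn to predict each token from the ones before it. Below is a heavily simplified, self-contained sketch of that objective in PyTorch; random token IDs stand in for BooksCorpus and a toy embedding-plus-linear model stands in for a real transformer, so every size and name here is illustrative only.

```python
# Minimal sketch of the next-token ("generative pre-training") objective.
# Toy sizes and random data stand in for a real transformer and BooksCorpus.
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, embed_dim, seq_len, batch = 100, 32, 8, 4   # illustrative sizes

embed = nn.Embedding(vocab_size, embed_dim)   # token IDs -> vectors
head = nn.Linear(embed_dim, vocab_size)       # vectors -> scores over the vocabulary
opt = torch.optim.Adam(list(embed.parameters()) + list(head.parameters()), lr=1e-3)

tokens = torch.randint(0, vocab_size, (batch, seq_len + 1))  # fake "text"
inputs, targets = tokens[:, :-1], tokens[:, 1:]              # predict each next token

logits = head(embed(inputs))                                  # (batch, seq_len, vocab_size)
loss = nn.functional.cross_entropy(logits.reshape(-1, vocab_size),
                                   targets.reshape(-1))
loss.backward()
opt.step()
print(f"toy pre-training loss: {loss.item():.3f}")
```

Scaling this same recipe up, with far more data, parameters, and compute, is essentially the bet described in the passage above.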
Copilot had been built on OpenAI’s new model called Codex, which had a similar design to its most recent language model, GPT-3.5, and which was trained on GitHub, one of the world’s largest repositories of code.