complexity), GPT-2 was trained on 8 million pages of web text. But it wasn't until the summer of 2020, when OpenAI released GPT-3, that people started to truly grasp the magnitude of what was happening. With a whopping 175 billion parameters, it was, at the time, the largest neural network ever constructed, more than a hundred times larger than its predecessor of just a year earlier.