GPT stands for ‘Generative Pre-trained Transformer’ – a type of neural network specialised for language prediction and generation. These networks are trained with an unsupervised deep learning approach whose core objective is to ‘predict the next word’ given the preceding text. GPT-3 has an astonishing 175 billion parameters and was trained on some 45 terabytes of text data. See https://openai.com/blog/openai-api/ and, for technical details, https://arxiv.org/abs/2005.14165
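The ‘predict the next word’ objective can be illustrated with a deliberately tiny sketch – a bigram frequency model over a toy corpus, nothing like GPT’s actual transformer architecture, but the same idea of learning which word tends to follow which:

```python
from collections import Counter, defaultdict

# Toy illustration of the 'predict the next word' objective (NOT GPT's
# real architecture): count which word follows which in a small corpus,
# then predict the most frequent successor.
corpus = "the cat sat on the mat and the cat slept".split()

# Tally successors for each word from adjacent pairs in the corpus.
successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next word, or None if unseen."""
    counts = successors[word]
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # 'cat' (follows 'the' twice, vs. 'mat' once)
```

GPT models do this at vastly greater scale: instead of raw counts over word pairs, a transformer learns a probability distribution over the next token conditioned on the entire preceding context.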