BERT was what researchers call a “universal language model.” Several other labs, including the Allen Institute and OpenAI, had been working on similar systems. Universal language models are giant neural networks that learn the vagaries of language by analyzing millions of sentences written by humans. The system built by OpenAI analyzed thousands of self-published books, including romance, science fiction, and mysteries. BERT analyzed the same vast library of books as well as every article on Wikipedia, spending days poring over all this text with help from hundreds of GPU chips.
— *Genius Makers: The Mavericks Who Brought A.I. to Google, Facebook, and the World*