RoBERTa (Robustly Optimized BERT Pre-training Approach) is an NLP model developed by Facebook as a modified version of the popular NLP model BERT. It is essentially an approach to better train and optimize BERT (Bidirectional Encoder Representations from Transformers).
The introduction of BERT led to state-of-the-art results across a range of NLP tasks. BERT uses a robust training method: the model is first pre-trained on a very large unlabelled dataset, as opposed to being trained directly on task-specific labelled data.
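To make this concrete, here is a minimal sketch of loading a pre-trained RoBERTa checkpoint and extracting the contextual representations learned during pre-training. It assumes the Hugging Face transformers library and the public roberta-base checkpoint, neither of which is mentioned above; this is an illustration, not the original authors' pipeline.

```python
# Minimal sketch (assumes the Hugging Face `transformers` library and the
# public `roberta-base` checkpoint; both are assumptions, not from the text).
from transformers import RobertaTokenizer, RobertaModel

# Load the tokenizer and the pre-trained encoder weights.
tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModel.from_pretrained("roberta-base")

# Encode a sentence and run it through the pre-trained encoder.
inputs = tokenizer("RoBERTa builds on BERT's pre-training recipe.",
                   return_tensors="pt")
outputs = model(**inputs)

# `last_hidden_state` holds one contextual vector per input token,
# shaped (batch_size, sequence_length, hidden_size).
print(outputs.last_hidden_state.shape)
```

In a typical workflow, a small task-specific head is then placed on top of these pre-trained representations and fine-tuned on the labelled data for the downstream task.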