Explore the most challenging issues of natural language processing, and learn how to solve them with cutting-edge deep learning!
Inside Deep Learning for Natural Language Processing you’ll find a wealth of NLP insights, including:
- An overview of NLP and deep learning
- One-hot text representations
- Word embeddings
- Models for textual similarity
- Sequential NLP
- Semantic role labeling
- Deep memory-based NLP
- Linguistic structure
- Hyperparameters for deep NLP
Deep learning has advanced natural language processing to exciting new levels and powerful new applications! For the first time, computer systems can achieve "human" levels of summarizing, making connections, and other tasks that require comprehension and context. Deep Learning for Natural Language Processing reveals the groundbreaking techniques that make these innovations possible. Stephan Raaijmakers distills his extensive knowledge into useful best practices, real-world applications, and the inner workings of top NLP algorithms.
Purchase of the print book includes a free eBook in PDF, Kindle, and ePub formats from Manning Publications.
About the technology Deep learning has transformed the field of natural language processing. Neural networks recognize not just words and phrases, but also patterns. Models infer meaning from context, and determine emotional tone. Powerful deep learning-based NLP models open up a goldmine of potential uses.
About the book Deep Learning for Natural Language Processing teaches you how to create advanced NLP applications using Python and the Keras deep learning library. You’ll learn to use state-of-the-art tools and techniques including BERT and XLNet, multitask learning, and deep memory-based NLP. Fascinating examples give you hands-on experience with a variety of real-world NLP applications. Plus, the detailed code discussions show you exactly how to adapt each example to your own uses!
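To give a feel for the style of code the book works in, here is a minimal sketch of a Keras text classifier. It is illustrative only, not taken from the book; the vocabulary size, dimensions, and random training data are all assumptions made for the example.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, seq_len = 10000, 50  # assumed toy dimensions, not from the book

model = keras.Sequential([
    layers.Embedding(vocab_size, 64),       # learn a vector for each word index
    layers.GlobalAveragePooling1D(),        # average the word vectors per text
    layers.Dense(16, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # binary sentiment-style label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random stand-in data, only to show the API shape end to end.
X = np.random.randint(0, vocab_size, size=(100, seq_len))
y = np.random.randint(0, 2, size=(100,))
model.fit(X, y, epochs=1, verbose=0)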
What's inside
- Improve question answering with sequential NLP
- Boost performance with linguistic multitask learning
- Accurately interpret linguistic structure
- Master multiple word embedding techniques (a small sketch follows this list)
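As a taste of the embedding material, here is a hedged illustration (not from the book) of one core idea behind word embeddings: words are compared by the cosine similarity of their vectors. The toy vectors below are made up; real embeddings have hundreds of dimensions.

import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 4-dimensional embeddings, invented for this example.
embeddings = {
    "king":  np.array([0.9, 0.1, 0.4, 0.8]),
    "queen": np.array([0.8, 0.2, 0.5, 0.9]),
    "apple": np.array([0.1, 0.9, 0.7, 0.1]),
}
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words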
About the reader For readers with intermediate Python skills and a general knowledge of NLP. No experience with deep learning is required.
About the author Stephan Raaijmakers is professor of Communicative AI at Leiden University and a senior scientist at The Netherlands Organization for Applied Scientific Research (TNO).
Table of Contents
PART 1 INTRODUCTION
1 Deep learning for NLP
2 Deep learning and language: The basics
3 Text embeddings
PART 2 DEEP NLP
4 Textual similarity
5 Sequential NLP
6 Episodic memory for NLP
PART 3 ADVANCED TOPICS
7 Attention
8 Multitask learning
9 Transformers
10 Applications of transformers: Hands-on with BERT
I read almost all sections of the book, Deep Learning for Natural Language Processing, by Stephan Raaijmakers, and I find it well organized. The main reason I say this is that the author has taken the reader’s journey into account while designing the flow. The book (as with all books!) starts with an introduction to deep learning for NLP, then gets into the core techniques, namely embeddings, textual similarity, and so on, before moving on to advanced topics like attention-based mechanisms. It culminates in a hands-on section with BERT. Overall, the flow gives the reader an extremely good journey that acts as a guide, while readers can skip areas they already know, accelerating the journey.
The author has taken good care to sprinkle generous snippets of code, with clear comments, throughout the book. This encourages the reader to not just read but also try things out! For the lazy reader, indicative outputs are also shown for some of the important sections. So, in essence, the book offers both the theory and the practice of the subject within the areas it covers, out of the vast range of topics in the field. I have not tested the code myself, but the flow does make sense. The code sections are in Python, using standard libraries like Keras for deep learning. There is also liveBook access, which offers collaboration with the author as well as with other liveBook users. Areas that could be improved include better flow and architecture diagrams, and links to some of the advancements that happen so quickly in this field; that would make the book more interactive for the reader. A few industry case studies would also help readers quickly relate the material to their own use cases.
But overall, it is a good book; read it fast before the technology becomes obsolete!
A must for NLP explorers and innovators. I picked this title primarily for the attention and transformer mechanisms, and it covers them well. All the NLP concepts are explained well and related to one another. Though the book’s examples use Keras, the concepts can safely be implemented in both PyTorch and TensorFlow.
Considering the popularity of PyTorch, it would make sense to incorporate corresponding PyTorch implementations as well.