- (Reference Guide) Natural Language Processing with TensorFlow
Book Description
Natural language processing (NLP) supplies the majority of data available to deep learning applications, while TensorFlow is the most important deep learning framework currently available. Natural Language Processing with TensorFlow brings TensorFlow and NLP together to give you invaluable tools to work with the immense volume of unstructured data in today’s data streams, and apply these tools to specific NLP tasks.
Thushan Ganegedara starts by giving you a grounding in NLP and TensorFlow basics. You'll then learn how to use Word2vec, including advanced extensions, to create word embeddings that turn sequences of words into vectors accessible to deep learning algorithms. Chapters on classical deep learning algorithms, like convolutional neural networks (CNNs) and recurrent neural networks (RNNs), demonstrate important NLP tasks such as sentence classification and language generation. You will learn how to apply high-performance RNN models, like long short-term memory (LSTM) cells, to NLP tasks. You will also explore neural machine translation and implement a neural machine translator.
After reading this book, you will have an understanding of NLP and the skills to apply TensorFlow to deep learning NLP applications and to perform specific NLP tasks.
What You Will Learn
- Core concepts of NLP and various approaches to natural language processing
- How to solve NLP tasks by applying TensorFlow functions to create neural networks
- Strategies to process large amounts of data into word representations that can be used by deep learning applications
- Techniques for performing sentence classification and language generation using CNNs and RNNs
- How to employ state-of-the-art advanced RNNs, like long short-term memory, to solve complex text generation tasks
- How to write automatic translation programs and implement an actual neural machine translator from scratch
- The trends and innovations that are shaping the future of NLP
Table of Contents
1: Introduction to Natural Language Processing
- What is Natural Language Processing?
- Tasks of Natural Language Processing
- The traditional approach to Natural Language Processing
- The deep learning approach to Natural Language Processing
- The roadmap – beyond this chapter
- Introduction to the technical tools
- Summary
2: Understanding TensorFlow
- What is TensorFlow?
- Inputs, variables, outputs, and operations
- Reusing variables with scoping
- Implementing our first neural network
- Summary
3: Word2vec – Learning Word Embeddings
- What is a word representation or meaning?
- Classical approaches to learning word representation
- Word2vec – a neural network-based approach to learning word representation
- The skip-gram algorithm
- The Continuous Bag-of-Words algorithm
- Summary
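The skip-gram algorithm covered in Chapter 3 trains on (target, context) word pairs drawn from a sliding window over the text. As a rough illustration of that data-preparation step (a minimal pure-Python sketch, not code from the book), pairs for a window of one word on each side can be generated like this:

```python
def skipgram_pairs(tokens, window=2):
    """Generate (target, context) training pairs for skip-gram.

    For each position, every word within `window` tokens on either
    side of the center (target) word is paired with it.
    """
    pairs = []
    for i, target in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

sentence = "the cat sat on the mat".split()
print(skipgram_pairs(sentence, window=1))
```

In the book's setting these pairs would then feed a neural network that learns an embedding vector per word; here they are simply printed.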
4: Advanced Word2vec
- The original skip-gram algorithm
- Comparing skip-gram with CBOW
- Extensions to the word embeddings algorithms
- More recent algorithms extending skip-gram and CBOW
- GloVe – Global Vectors representation
- Document classification with Word2vec
- Summary
5: Sentence Classification with Convolutional Neural Networks
- Introducing Convolutional Neural Networks
- Understanding Convolutional Neural Networks
- Exercise – image classification on MNIST with CNN
- Using CNNs for sentence classification
- Summary
6: Recurrent Neural Networks
- Understanding Recurrent Neural Networks
- Backpropagation Through Time
- Applications of RNNs
- Generating text with RNNs
- Evaluating text results output from the RNN
- Perplexity – measuring the quality of the text result
- Recurrent Neural Networks with Context Features – RNNs with longer memory
- Summary
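Chapter 6 uses perplexity to measure the quality of generated text: the exponential of the average negative log-probability the model assigns to each token, so lower is better. A model-free sketch (the per-token probabilities below are placeholders, not book data):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp(-(1/N) * sum(log p_i)) over the model's
    per-token probabilities; lower values mean less "surprise"."""
    n = len(token_probs)
    return math.exp(-sum(math.log(p) for p in token_probs) / n)

# A model that assigns probability 0.25 to every token is as
# confused as a uniform 4-way choice, so its perplexity is ~4.
print(perplexity([0.25, 0.25, 0.25]))
```

In practice the probabilities come from the RNN's softmax output over a held-out text, not from a hand-written list.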
7: Long Short-Term Memory Networks
- Understanding Long Short-Term Memory Networks
- How LSTMs solve the vanishing gradient problem
- Other variants of LSTMs
- Summary
8: Applications of LSTM – Generating Text
- Our data
- Implementing an LSTM
- Comparing LSTMs to LSTMs with peephole connections and GRUs
- Improving LSTMs – beam search
- Improving LSTMs – generating text with words instead of n-grams
- Using the TensorFlow RNN API
- Summary
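The beam-search improvement in Chapter 8 keeps the k highest-scoring partial sequences at each step instead of greedily taking the single most likely next token. A toy sketch with a hypothetical per-step softmax output (`step_probs` and the two-word vocabulary are illustrative, not from the book):

```python
import math

def beam_search(step_probs, beam_width=2):
    """step_probs: one dict per time step mapping token -> probability
    (a stand-in for an LSTM's softmax output). Returns the surviving
    (sequence, cumulative log-probability) pairs, best first."""
    beams = [([], 0.0)]
    for probs in step_probs:
        candidates = []
        for seq, score in beams:
            for tok, p in probs.items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Keep only the beam_width best partial sequences.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]
    return beams

steps = [
    {"the": 0.6, "a": 0.4},
    {"cat": 0.5, "dog": 0.5},
]
for seq, score in beam_search(steps, beam_width=2):
    print(seq, round(score, 3))
```

With beam_width=1 this degenerates to greedy decoding; a wider beam can recover sequences whose first token looked weak but whose overall probability is higher.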
9: Applications of LSTM – Image Caption Generation
- Getting to know the data
- The machine learning pipeline for image caption generation
- Extracting image features with CNNs
- Implementation – loading weights and inferencing with VGG-
- Learning word embeddings
- Preparing captions for feeding into LSTMs
- Generating data for LSTMs
- Defining the LSTM
- Evaluating the results quantitatively
- Captions generated for test images
- Using TensorFlow RNN API with pretrained GloVe word vectors
- Summary
10: Sequence-to-Sequence Learning – Neural Machine Translation
- Machine translation
- A brief historical tour of machine translation
- Understanding Neural Machine Translation
- Preparing data for the NMT system
- Training the NMT
- Inference with NMT
- The BLEU score – evaluating the machine translation systems
- Implementing an NMT from scratch – a German to English translator
- Training an NMT jointly with word embeddings
- Improving NMTs
- Attention
- Other applications of Seq2Seq models – chatbots
- Summary
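Chapter 10 evaluates translations with the BLEU score, whose core ingredient is modified n-gram precision: the fraction of the candidate's n-grams that also appear in the reference, with each candidate n-gram's count clipped to its count in the reference. A simplified sketch of that one ingredient (full BLEU also combines several n-gram orders and a brevity penalty; the sentences below are illustrative):

```python
from collections import Counter

def modified_ngram_precision(candidate, reference, n=1):
    """Clipped n-gram precision, the core ingredient of BLEU."""
    def ngrams(tokens):
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))
    cand, ref = ngrams(candidate), ngrams(reference)
    # Clip each candidate n-gram count by its count in the reference.
    overlap = sum(min(count, ref[gram]) for gram, count in cand.items())
    total = sum(cand.values())
    return overlap / total if total else 0.0

candidate = "the cat is on the mat".split()
reference = "there is a cat on the mat".split()
# 5 of the candidate's 6 unigrams are covered by the reference
# ("the" appears twice but is clipped to its single reference count).
print(modified_ngram_precision(candidate, reference, n=1))
```

Libraries such as NLTK provide full BLEU implementations; this sketch only shows why clipping prevents a candidate from scoring well by repeating one matching word.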
11: Current Trends and the Future of Natural Language Processing
- Current trends in NLP
- Penetration into other research fields
- Towards Artificial General Intelligence
- NLP for social media
- New tasks emerging
- Newer machine learning models
- Summary
- References
| Field | Value |
|---|---|
| SKU | 031046S |
| Weight | 2.5050 |
| Coming Soon | No |
| Days of Training | No |
| Audience | Student |
| Product Family | Partnerware |
| Product Type | Print Courseware |
| Electronic | No |
| ISBN | 1788478311 |
| Language | English |
| Page Count | 470 |
| Curriculum Library | TensorFlow |
| Year | No |
| Manufacturer's Product Code | No |
| Current Revision | 1.0 |
| Revision Notes | No Revision Information Available |
| Original Publication Date | 2018-10-18 |
Related Product: (Reference Guide) Natural Language Processing with TensorFlow eBook (031046SE), Student Digital Courseware, $31.99