Natural Language Processing with PyTorch. Build Intelligent Language Applications Using Deep Learning - Helion
ISBN: 978-1-491-97818-4
Pages: 256, Format: ebook
Publication date: 2019-01-22
Bookstore: Helion
Price: 288,15 zł (previously: 339,00 zł)
You save: 15% (-50,85 zł)
Natural Language Processing (NLP) provides boundless opportunities for solving problems in artificial intelligence, making products such as Amazon Alexa and Google Translate possible. If you’re a developer or data scientist new to NLP and deep learning, this practical guide shows you how to apply these methods using PyTorch, a Python-based deep learning library.
Authors Delip Rao and Brian McMahan provide you with a solid grounding in NLP and deep learning algorithms and demonstrate how to use PyTorch to build applications involving rich representations of text specific to the problems you face. Each chapter includes several code examples and illustrations.
- Explore computational graphs and the supervised learning paradigm
- Master the basics of the PyTorch optimized tensor manipulation library (see the brief sketch after this list)
- Get an overview of traditional NLP concepts and methods
- Learn the basic ideas involved in building neural networks
- Use embeddings to represent words, sentences, documents, and other features
- Explore sequence prediction and generate sequence-to-sequence models
- Learn design patterns for building production NLP systems
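As a taste of the topics listed above, here is a minimal, illustrative PyTorch sketch (not taken from the book): it creates a tensor, builds a computational graph through a tiny one-layer model on toy data, and takes a single gradient-based update step.

    import torch

    # Create a 2x3 tensor of random values and inspect its type and shape
    x = torch.randn(2, 3)
    print(x.dtype, x.shape)

    # Toy supervised setup: a single linear layer with a sigmoid output
    weights = torch.randn(3, 1, requires_grad=True)  # parameters tracked in the computational graph
    targets = torch.ones(2, 1)                       # toy binary targets

    predictions = torch.sigmoid(x @ weights)         # forward pass builds the graph
    loss = torch.nn.functional.binary_cross_entropy(predictions, targets)

    loss.backward()                                  # backpropagate gradients through the graph
    with torch.no_grad():
        weights -= 0.1 * weights.grad                # one gradient-descent step

The book's early chapters cover each of these pieces (tensors, computational graphs, loss functions, and gradient-based training) in detail.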
Customers who bought "Natural Language Processing with PyTorch. Build Intelligent Language Applications Using Deep Learning" also chose:
- Data Science w Pythonie. Kurs video. Przetwarzanie i analiza danych: 149,00 zł (67,05 zł, -55%)
- Matematyka w deep learningu. Co musisz wiedzie: 89,00 zł (44,50 zł, -50%)
- Dylemat sztucznej inteligencji. 7 zasad odpowiedzialnego tworzenia technologii: 54,90 zł (27,45 zł, -50%)
- Eksploracja danych za pomoc: 67,00 zł (33,50 zł, -50%)
- Podr: 129,00 zł (64,50 zł, -50%)
Table of Contents
- Preface
- Conventions Used in This Book
- Using Code Examples
- O'Reilly Safari
- How to Contact Us
- Acknowledgments
- 1. Introduction
- The Supervised Learning Paradigm
- Observation and Target Encoding
- One-Hot Representation
- TF Representation
- TF-IDF Representation
- Target Encoding
- Computational Graphs
- PyTorch Basics
- Installing PyTorch
- Creating Tensors
- Tensor Types and Size
- Tensor Operations
- Indexing, Slicing, and Joining
- Tensors and Computational Graphs
- CUDA Tensors
- Exercises
- Solutions
- Summary
- References
- 2. A Quick Tour of Traditional NLP
- Corpora, Tokens, and Types
- Unigrams, Bigrams, Trigrams, ..., N-grams
- Lemmas and Stems
- Categorizing Sentences and Documents
- Categorizing Words: POS Tagging
- Categorizing Spans: Chunking and Named Entity Recognition
- Structure of Sentences
- Word Senses and Semantics
- Summary
- References
- 3. Foundational Components of Neural Networks
- The Perceptron: The Simplest Neural Network
- Activation Functions
- Sigmoid
- Tanh
- ReLU
- Softmax
- Loss Functions
- Mean Squared Error Loss
- Categorical Cross-Entropy Loss
- Binary Cross-Entropy Loss
- Diving Deep into Supervised Training
- Constructing Toy Data
- Choosing a model
- Converting the probabilities to discrete classes
- Choosing a loss function
- Choosing an optimizer
- Putting It Together: Gradient-Based Supervised Learning
- Constructing Toy Data
- Auxiliary Training Concepts
- Correctly Measuring Model Performance: Evaluation Metrics
- Correctly Measuring Model Performance: Splitting the Dataset
- Knowing When to Stop Training
- Finding the Right Hyperparameters
- Regularization
- Example: Classifying Sentiment of Restaurant Reviews
- The Yelp Review Dataset
- Understanding PyTorch's Dataset Representation
- The Vocabulary, the Vectorizer, and the DataLoader
- Vocabulary
- Vectorizer
- DataLoader
- A Perceptron Classifier
- The Training Routine
- Setting the stage for the training to begin
- The training loop
- Evaluation, Inference, and Inspection
- Evaluating on test data
- Inference and classifying new data points
- Inspecting model weights
- Summary
- References
- 4. Feed-Forward Networks for Natural Language Processing
- The Multilayer Perceptron
- A Simple Example: XOR
- Implementing MLPs in PyTorch
- Example: Surname Classification with an MLP
- The Surnames Dataset
- Vocabulary, Vectorizer, and DataLoader
- The Vocabulary class
- The SurnameVectorizer
- The SurnameClassifier Model
- The Training Routine
- The training loop
- Model Evaluation and Prediction
- Evaluating on the test dataset
- Classifying a new surname
- Retrieving the top k predictions for a new surname
- Regularizing MLPs: Weight Regularization and Structural Regularization (or Dropout)
- Convolutional Neural Networks
- CNN Hyperparameters
- Dimension of the convolution operation
- Channels
- Kernel size
- Stride
- Padding
- Dilation
- Implementing CNNs in PyTorch
- Example: Classifying Surnames by Using a CNN
- The SurnameDataset Class
- Vocabulary, Vectorizer, and DataLoader
- Reimplementing the SurnameClassifier with Convolutional Networks
- The Training Routine
- Model Evaluation and Prediction
- Evaluating on the test dataset
- Classifying or retrieving top predictions for a new surname
- Miscellaneous Topics in CNNs
- Pooling
- Batch Normalization (BatchNorm)
- Network-in-Network Connections (1x1 Convolutions)
- Residual Connections/Residual Block
- Summary
- References
- 5. Embedding Words and Types
- Why Learn Embeddings?
- Efficiency of Embeddings
- Approaches to Learning Word Embeddings
- The Practical Use of Pretrained Word Embeddings
- Loading embeddings
- Relationships between word embeddings
- Example: Learning the Continuous Bag of Words Embeddings
- The Frankenstein Dataset
- Vocabulary, Vectorizer, and DataLoader
- The CBOWClassifier Model
- The Training Routine
- Model Evaluation and Prediction
- Example: Transfer Learning Using Pretrained Embeddings for Document Classification
- The AG News Dataset
- Vocabulary, Vectorizer, and DataLoader
- The NewsClassifier Model
- The Training Routine
- Model Evaluation and Prediction
- Evaluating on the test dataset
- Predicting the category of novel news headlines
- Summary
- References
- 6. Sequence Modeling for Natural Language Processing
- Introduction to Recurrent Neural Networks
- Implementing an Elman RNN
- Example: Classifying Surname Nationality Using a Character RNN
- The SurnameDataset Class
- The Vectorization Data Structures
- The SurnameClassifier Model
- The Training Routine and Results
- Summary
- References
- 7. Intermediate Sequence Modeling for Natural Language Processing
- The Problem with Vanilla RNNs (or Elman RNNs)
- Gating as a Solution to a Vanilla RNN's Challenges
- Example: A Character RNN for Generating Surnames
- The SurnameDataset Class
- The Vectorization Data Structures
- SurnameVectorizer and END-OF-SEQUENCE
- From the ElmanRNN to the GRU
- Model 1: The Unconditioned SurnameGenerationModel
- Model 2: The Conditioned SurnameGenerationModel
- The Training Routine and Results
- Tips and Tricks for Training Sequence Models
- References
- 8. Advanced Sequence Modeling for Natural Language Processing
- Sequence-to-Sequence Models, Encoder-Decoder Models, and Conditioned Generation
- Capturing More from a Sequence: Bidirectional Recurrent Models
- Capturing More from a Sequence: Attention
- Attention in Deep Neural Networks
- Evaluating Sequence Generation Models
- Example: Neural Machine Translation
- The Machine Translation Dataset
- A Vectorization Pipeline for NMT
- Encoding and Decoding in the NMT Model
- A closer look at attention
- Learning to search and scheduled sampling
- The Training Routine and Results
- Summary
- References
- 9. Classics, Frontiers, and Next Steps
- What Have We Learned so Far?
- Timeless Topics in NLP
- Dialogue and Interactive Systems
- Discourse
- Information Extraction and Text Mining
- Document Analysis and Retrieval
- Frontiers in NLP
- Design Patterns for Production NLP Systems
- Where Next?
- References
- Index