Cover
Title Page
Deep Learning with Theano
Credits
About the Author
Acknowledgments
About the Reviewers
www.PacktPub.com
eBooks, discount offers, and more
Customer Feedback
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Chapter 1. Theano Basics
The need for tensors
Installing and loading Theano
Tensors
Graphs and symbolic computing
Operations on tensors
Memory and variables
Functions and automatic differentiation
Loops in symbolic computing
Configuration, profiling, and debugging
Summary
Chapter 2. Classifying Handwritten Digits with a Feedforward Network
The MNIST dataset
Structure of a training program
Classification loss function
Single-layer linear model
Cost function and errors
Backpropagation and stochastic gradient descent
Multiple layer model
Convolutions and max layers
Training
Dropout
Inference
Optimization and other update rules
Related articles
Summary
Chapter 3. Encoding Words into Vectors
Encoding and embedding
Dataset
Continuous Bag of Words model
Training the model
Visualizing the learned embeddings
Evaluating embeddings – analogical reasoning
Evaluating embeddings – quantitative analysis
Application of word embeddings
Weight tying
Further reading
Summary
Chapter 4. Generating Text with a Recurrent Neural Net
Need for RNN
A dataset for natural language
Simple recurrent network
Metrics for natural language performance
Training loss comparison
Example of predictions
Applications of RNN
Related articles
Summary
Chapter 5. Analyzing Sentiment with a Bidirectional LSTM
Installing and configuring Keras
Preprocessing text data
Designing the architecture for the model
Compiling and training the model
Evaluating the model
Saving and loading the model
Running the example
Further reading
Summary
Chapter 6. Locating with Spatial Transformer Networks
MNIST CNN model with Lasagne
A localization network
Unsupervised learning with co-localization
Region-based localization networks
Further reading
Summary
Chapter 7. Classifying Images with Residual Networks
Natural image datasets
Residual connections
Stochastic depth
Dense connections
Multi-GPU
Data augmentation
Further reading
Summary
Chapter 8. Translating and Explaining with Encoding-decoding Networks
Sequence-to-sequence networks for natural language processing
Seq2seq for translation
Seq2seq for chatbots
Improving the efficiency of sequence-to-sequence networks
Deconvolutions for images
Multimodal deep learning
Further reading
Summary
Chapter 9. Selecting Relevant Inputs or Memories with the Mechanism of Attention
Differentiable mechanism of attention
Store and retrieve information in Neural Turing Machines
Memory networks
Further reading
Summary
Chapter 10. Predicting Time Sequences with Advanced RNN
Dropout for RNN
Deep approaches for RNN
Stacked recurrent networks
Deep transition recurrent network
Highway networks design principle
Recurrent Highway Networks
Further reading
Summary
Chapter 11. Learning from the Environment with Reinforcement
Reinforcement learning tasks
Simulation environments
Q-learning
Deep Q-network
Training stability
Policy gradients with REINFORCE algorithms
Related articles
Summary
Chapter 12. Learning Features with Unsupervised Generative Networks
Generative models
Semi-supervised learning
Further reading
Summary
Chapter 13. Extending Deep Learning with Theano
Theano Op in Python for CPU
Theano Op in Python for the GPU
Theano Op in C for CPU
Theano Op in C for GPU
Coalesced transpose via shared memory, NVIDIA Parallel Forall
The future of artificial intelligence
Further reading
Summary
Index