Deep Learning for Natural Language Processing
Frequency
Every 2 years
Summary
The Deep Learning for NLP course provides an overview of neural-network-based methods applied to text. The focus is on models particularly suited to the properties of human language: categorical, unbounded, and structured representations, and very large input and output vocabularies.
Content
Models
- Word embeddings
- LSTMs and CNNs for text
- Attention models (see the sketch at the end of this section)
- Sequence-to-sequence models
- Neural network integration with decoding
- Multi-task learning
Applications
- Language modelling
- Machine translation
- Syntactic parsing
- Semantic parsing
- Dialogue systems
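As a rough illustration of the model families listed above (this is not course material; it assumes PyTorch, and the class name, dimensions, and vocabulary size are invented for the example), the following minimal sketch combines word embeddings with scaled dot-product attention:

```python
# Illustrative only: word embeddings + scaled dot-product attention.
import math
import torch
import torch.nn as nn

class TinyAttentionEncoder(nn.Module):
    def __init__(self, vocab_size=10_000, dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)   # word embeddings
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, token_ids):                    # (batch, seq_len)
        x = self.embed(token_ids)                    # (batch, seq_len, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))
        weights = scores.softmax(dim=-1)             # attention weights
        return weights @ v                           # contextualised token representations

# Example: encode a batch of two 5-token "sentences" (random ids stand in for text).
encoder = TinyAttentionEncoder()
out = encoder(torch.randint(0, 10_000, (2, 5)))
print(out.shape)  # torch.Size([2, 5, 64])
```

Sequence-to-sequence models extend the same idea by letting a decoder attend over such encoder representations while generating the output text.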
Keywords
Machine Learning, Natural Language Processing, Neural Networks.
Learning Prerequisites
Required courses
Undergraduate-level probability, linear algebra, and programming.
Recommended courses
Courses on Machine Learning, Natural Language Processing (Human Language Technology, Computational Linguistics), or Artificial Intelligence would be useful.
Learning Outcomes
By the end of the course, the student must be able to:
- Identify appropriate deep learning architectures for different natural language processing tasks.
- Apply appropriate training and evaluation methodology to such models on large datasets using existing packages (a minimal sketch follows this list).
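As a rough sketch of what the second outcome refers to (assuming PyTorch; the toy model, data, and hyperparameters below are placeholders, not course material), a minimal training and evaluation loop might look like this:

```python
# Illustrative only: a toy training/evaluation loop with an existing package (PyTorch).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Toy data: 1,000 "sentences" of 5 token ids each, with a binary label.
X = torch.randint(0, 10_000, (1000, 5))
y = torch.randint(0, 2, (1000,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

model = nn.Sequential(nn.Embedding(10_000, 64), nn.Flatten(), nn.Linear(5 * 64, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):                      # training
    for batch_x, batch_y in train_loader:
        optimizer.zero_grad()
        loss = loss_fn(model(batch_x), batch_y)
        loss.backward()
        optimizer.step()

with torch.no_grad():                       # evaluation: accuracy on the toy data
    accuracy = (model(X).argmax(dim=1) == y).float().mean().item()
print(f"toy accuracy: {accuracy:.3f}")
```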
Assessment methods
Multiple.
In the programs
- Number of places: 40
- Exam form: Multiple (session free)
- Subject examined: Deep Learning for Natural Language Processing
- Courses: 28 Hour(s)
- TP: 28 Hour(s)
- Type: optional