Introduction to Deep Learning & Natural Language Processing


We will cover the following:
1) What is deep learning?
2) Motivation: Some use cases where it has produced state-of-the-art results
3) Basic building blocks of neural networks (neurons, activation functions, the backpropagation algorithm, gradient descent)
4) Supervised learning (multi-layer perceptrons, convolutional neural networks, recurrent neural networks)
5) Introduction to word2vec
6) Introduction to Recurrent Neural Networks
7) Text classification using RNN
8) Impact of GPUs (Some practical thoughts on hardware and software)
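To give a flavour of the building blocks listed above (a neuron, an activation function, and the gradient descent update rule), here is a minimal sketch in plain Python. It trains a single sigmoid neuron on the AND function; the dataset, learning rate, and variable names are illustrative choices, not part of the workshop material.

```python
import math
import random

def sigmoid(z):
    """Activation function: squashes any real number into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

# Toy training set: the logical AND function (illustrative only)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]  # weights
b = 0.0                                            # bias
lr = 0.5                                           # learning rate

for epoch in range(2000):
    for x, y in data:
        # Forward pass: weighted sum, then activation
        z = w[0] * x[0] + w[1] * x[1] + b
        p = sigmoid(z)
        # For cross-entropy loss, the gradient w.r.t. z is simply (p - y)
        g = p - y
        # Gradient descent update: step against the gradient
        w[0] -= lr * g * x[0]
        w[1] -= lr * g * x[1]
        b -= lr * g

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
print(preds)  # -> [0, 0, 0, 1]
```

The same forward-pass/gradient/update loop, applied layer by layer, is what backpropagation does in a full network.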
Broadly, there will be three hands-on modules:
1) A simple multi-layer perceptron, to understand the basics of neural networks (everything will be coded from scratch)
2) A text classification problem, solved using recurrent neural networks
3) A text generation problem, also solved using recurrent neural networks
The data and software requirements are posted on the GitHub repository. The repository for this workshop: The slides for this workshop are available at:
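The recurrent-network modules rest on one idea: a hidden state that is updated as each token is read, so the final state summarises the whole sequence and can be fed to a classifier. A minimal sketch of that forward pass, in plain Python with a made-up three-character vocabulary and random (untrained) weights:

```python
import math
import random

random.seed(1)

vocab = ["a", "b", "c"]   # toy character vocabulary (illustrative)
V, H = len(vocab), 4      # input size and hidden-state size

def mat(rows, cols):
    """Small random weight matrix (untrained, for illustration)."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

Wxh, Whh = mat(H, V), mat(H, H)  # input->hidden and hidden->hidden weights
bh = [0.0] * H

def one_hot(ch):
    v = [0.0] * V
    v[vocab.index(ch)] = 1.0
    return v

def rnn_forward(text):
    """Vanilla RNN recurrence: h_t = tanh(Wxh x_t + Whh h_{t-1} + bh).

    Returns the final hidden state after reading `text` one character
    at a time; a classifier layer would consume this vector.
    """
    h = [0.0] * H
    for ch in text:
        x = one_hot(ch)
        h = [math.tanh(sum(Wxh[i][j] * x[j] for j in range(V)) +
                       sum(Whh[i][j] * h[j] for j in range(H)) + bh[i])
             for i in range(H)]
    return h

h = rnn_forward("abcab")
print(h)  # H tanh-squashed values, each in (-1, 1)
```

In the hands-on modules a framework handles this recurrence (plus training); the sketch only shows why the same weights are reused at every time step.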

London, United Kingdom
Nischal Harohalli Padmanabha
Software Engineer by profession, filter kapi drinker by choice.

My research interests include deep learning, large scale engineering and social interactions.