# Learn Deep Learning: From Beginner to Expert

January 16, 2018

This 24-part course consists of tutorials on deep learning concepts and neural networks, as well as quizzes and hands-on projects to practice implementing the algorithms and applying them to problems. You can think of this course as a “Free Online Nano Book”.

**Why deep learning?** Deep learning has driven the revival of artificial intelligence. It has become the dominant method for speech recognition (allowing us to talk to the Google Assistant), computer vision (allowing us to search for “my pictures on the beach” on Google Photos), language translation, and even game-playing AI (think AlphaGo from DeepMind). If you’d like to learn how these systems work, and maybe build your own, deep learning is for you.

The **primary objectives** of this course are as follows:

- Understand what **machine learning** is, and learn the **gradient descent** algorithm.
- Understand what **deep learning** is, and how deep learning differs from and relates to machine learning.
- Understand what **neural networks** are and how they are trained using **back-propagation** (and train your own neural network).
- Understand the concept of **computational graphs**, a core idea (often overlooked in DL courses) foundational to understanding and implementing all sorts of complex neural network architectures.
- Understand how **Convolutional Nets** work and how to solve computer vision tasks like **image classification**.
- Understand how **Recurrent Neural Nets** work and how to **generate text** and perform language translation.
- Bonus: Understand **Generative Adversarial Networks**, and how they can be used for **generating images**!

**Prerequisites**: Python, linear algebra, statistics, NumPy, and Calculus 1 (differentiation and the chain rule).

**Subscribe** to add this course to the top of your Home Page. **Get started** with the first article below.

# Machine Learning and Gradient Descent

We’ll start by describing what machine learning is, and introduce a simple learning algorithm: **linear regression + gradient descent**. Using this algorithm, we’ll introduce the core concepts in machine learning: *model parameters*, *cost function*, *optimization method*, and *overfitting and regularization*.

- What is Machine Learning? Why Machine Learning?
- Linear Regression
- Gradient Descent
- Hands-on Assignment: Implementing Linear Regression with Gradient Descent
- Generalization and Overfitting
- Strategies to Avoid Overfitting
- Quiz: Train vs Test Data, Overfitting, Cost functions
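To make the core loop concrete, here is a minimal sketch of linear regression trained with gradient descent, in the spirit of the hands-on assignment above. The data, learning rate, and iteration count are illustrative choices, not taken from the course.

```python
import numpy as np

# Hypothetical 1-D data generated from y = 2x + 1 plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=50)
y = 2.0 * X + 1.0 + rng.normal(0, 0.1, size=50)

w, b = 0.0, 0.0   # model parameters
lr = 0.1          # learning rate

for _ in range(500):
    y_hat = w * X + b                  # model predictions
    error = y_hat - y
    grad_w = 2 * np.mean(error * X)    # d(cost)/dw for mean-squared-error cost
    grad_b = 2 * np.mean(error)        # d(cost)/db
    w -= lr * grad_w                   # gradient descent update
    b -= lr * grad_b

print(f"w ≈ {w:.2f}, b ≈ {b:.2f}")
```

The two gradients are the partial derivatives of the mean-squared-error cost with respect to each parameter; stepping against them repeatedly recovers parameters close to the true values 2 and 1.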

# An Introduction to Deep Learning and Neural Networks

In this section, we’ll learn how deep learning differs from machine learning, i.e. what *deep* means and why it is important. We’ll learn specifically what **neural networks** look like and how they are trained using **back-propagation**. We’ll also introduce the concept of **computational graphs**, which is how neural networks are implemented in popular deep learning libraries such as TensorFlow and Torch, and then **train our first neural network**. We’ll end the section by learning **regularization techniques** specific to deep learning, such as **early-stopping** and **dropout**.

- What is deep learning? How does it relate to machine learning?
- A First Look at Neural Networks
- Quiz: Deep Learning and Neural Networks
- Computational Graphs and Backpropagation
- Hands-on Assignment: Training your first Neural Network
- Quiz: The Importance of “Good” Gradients
- Regularization methods in Deep Learning
- Dropout (neural network regularization)
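As a sketch of what training a small network with back-propagation looks like, here is a two-layer network fit to the XOR problem. The layer sizes, learning rate, and iteration count are illustrative assumptions, not values from the course.

```python
import numpy as np

# A minimal two-layer network trained with manual back-propagation
# on the XOR problem.
rng = np.random.default_rng(1)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 0.5

losses = []
for _ in range(5000):
    # Forward pass: each line below is a node in the computational graph.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(np.mean((out - y) ** 2))
    # Backward pass: the chain rule, applied node by node.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

The backward pass is the computational graph walked in reverse: each gradient reuses the gradient of the node above it, which is exactly what deep learning libraries automate.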

# Convolutional Neural Networks and Computer Vision

We’ll start by introducing the **image classification** problem in computer vision, along with a short note on how images are stored in a computer. Thereafter, we’ll describe the architecture of a **Convolutional Neural Network (CNN)** in substantial detail, and see why the *convolutional layer* is particularly well suited to computer vision applications. Then, we’ll solve the **image classification** problem using CNNs, and perform an **ablation study** that will help us improve our accuracy.

- Computer Vision tasks: Image classification, localization, etc
- How do computers see an image?
- Convolutional Neural Network
- Hands-on Assignment: Convolutional Neural Networks (and Ablation Studies)
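To preview what a convolutional layer actually computes, here is a minimal sketch of a 2-D convolution (strictly, a cross-correlation, as implemented in most CNN libraries) applied as an edge detector. The image and kernel are made-up examples.

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' cross-correlation of a 2-D image with a 2-D kernel,
    as in a CNN's convolutional layer (no padding, stride 1)."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Each output value is a dot product of the kernel
            # with one patch of the image.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A half-dark image and a horizontal-difference kernel:
# the output fires only where the image brightness changes.
img = np.zeros((5, 5))
img[:, 2:] = 1.0
edge = np.array([[1.0, -1.0]])
print(conv2d(img, edge))
```

Because the same small kernel slides over the whole image, the layer detects its pattern anywhere it occurs, which is the translation-invariance property that makes convolutions a good fit for vision.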

# Long Short-Term Memory Networks and Natural Language Processing

We’ll start by introducing **Recurrent Neural Networks (RNN)**, which are neural networks with loops in them. We’ll see how RNNs can be used for inputting and outputting sequences and how they maintain an internal *state*. Then we’ll introduce **Long Short-Term Memory (LSTM) networks** and see how they overcome some significant shortcomings of general RNNs. Along the way, we’ll also introduce the concept of word embeddings. We’ll **implement an RNN to generate text** one character at a time, and also see how this model can be used for performing state-of-the-art **language translation**.

- Recurrent Neural Networks and Long Short-Term Memory Networks
- Deep Natural Language Processing
- Quiz: Recurrent Neural Networks
- Hands-on Assignment: Text Generation using Recurrent Neural Networks
- Sequence to Sequence Learning with Neural Networks
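The “loop” in an RNN can be sketched as a single step function that consumes one character and the previous hidden state. The vocabulary size, hidden size, and weights below are illustrative, not from the course.

```python
import numpy as np

# One step of a vanilla RNN cell: the hidden state h carries
# information forward from earlier characters in the sequence.
vocab, hidden = 4, 8
rng = np.random.default_rng(0)
Wxh = rng.normal(0, 0.1, (hidden, vocab))   # input  -> hidden
Whh = rng.normal(0, 0.1, (hidden, hidden))  # hidden -> hidden (the loop)
Why = rng.normal(0, 0.1, (vocab, hidden))   # hidden -> output

def rnn_step(x_onehot, h_prev):
    h = np.tanh(Wxh @ x_onehot + Whh @ h_prev)     # new internal state
    logits = Why @ h
    probs = np.exp(logits) / np.exp(logits).sum()  # next-character distribution
    return h, probs

h = np.zeros(hidden)
for ch in [0, 2, 1]:              # a toy sequence of character indices
    x = np.eye(vocab)[ch]
    h, probs = rnn_step(x, h)
print(probs.round(3))
```

Text generation repeats this step: sample a character from `probs`, feed it back in as the next input. An LSTM replaces the single `tanh` update with gated updates so the state can persist over long sequences.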

# Generative Adversarial Networks and Image Generation

In this section (a single tutorial), we’ll learn about **Generative Adversarial Networks** (GANs). GANs are one of the few machine learning techniques that have given good performance on **generative tasks**, or more broadly unsupervised learning. In particular, they have performed remarkably well on a variety of **image generation** tasks. We’ll introduce the image generation problem, describe the GAN setup, and look at the results of applying GANs to image generation.
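The adversarial setup boils down to two opposing losses. Below is a sketch of them evaluated on hypothetical discriminator outputs (the probability each sample is real); real training alternates gradient steps on two full networks, which this deliberately omits.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # D wants d_real -> 1 (real called real) and d_fake -> 0 (fakes caught).
    return -np.mean(np.log(d_real) + np.log(1.0 - d_fake))

def generator_loss(d_fake):
    # Non-saturating generator loss: G wants D to call its samples real.
    return -np.mean(np.log(d_fake))

# Assumed discriminator outputs for illustration only.
d_real = np.array([0.9, 0.8, 0.95])   # D's scores on real images
d_fake = np.array([0.1, 0.2, 0.05])   # D's scores on generated images
print(f"D loss: {discriminator_loss(d_real, d_fake):.3f}")
print(f"G loss: {generator_loss(d_fake):.3f}")
```

Here the discriminator is winning (low D loss, high G loss), so gradient updates to the generator would push `d_fake` upward, and the two networks improve by competing.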