Deep Learning is one of the most sought-after skills in tech right now. On November 14, 2019, I completed the Neural Networks and Deep Learning course offered by deeplearning.ai on coursera.org. Besides Cloud Computing and Big Data technologies, I have a strong interest in Machine Learning and Deep Learning. I did my Master's in Computer Science with a concentration in Data Science and had taken a few courses on Machine Learning. I wanted to advance my knowledge of Deep Learning and started the Deep Learning Specialization on Coursera. In this blog post, I share my experience with the first course in the specialization.
Coursera.org is an online learning website that offers massive open online courses (MOOCs), specializations, and degrees. There are courses on various subjects such as Computer Science, Biology, and Mathematics, offered by universities and organizations from around the world. Many of the courses on Coursera follow the format of university courses, with assignments, projects, quizzes, and exams. With a Coursera premium subscription, a Certificate of Completion can be obtained by completing the graded assignments and exams. I paid $49 per month to complete this course and obtain the Certificate.
Deep Learning Specialization
Deep Learning Specialization is one of the most popular programs on Deep Learning and Neural Networks. This specialization aims to help students master Deep Learning and build a career in AI. The program is taught by Andrew Ng, the co-founder of coursera.org and one of the most popular teachers of Machine Learning and Deep Learning. Andrew Ng was one of the main reasons I took this specialization. The program comprises five courses:
- Neural Networks and Deep Learning
- Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization
- Structuring Machine Learning Projects
- Convolutional Neural Networks
- Sequence Models
Neural Networks and Deep Learning
Neural Networks and Deep Learning is the first course in the Deep Learning Specialization. The course spans 4 weeks and covers the foundations of Deep Learning. Each week has at least one quiz and one assignment. The quizzes have multiple-choice questions, and the assignments are in Python and are submitted through Jupyter Notebooks.
Familiarity with Python programming language is a must as all the assignments are in Python. If you’re not familiar with Python, I would strongly recommend learning Python before taking this course.
It also helps to have some knowledge of Linear Algebra, Calculus, and basic Machine Learning concepts. It's not a huge problem if it's been a long time since you learned them: the course has refresher tutorials on Linear Algebra and Calculus and covers most of the concepts that are used.
Before taking this course, I had working experience with Python and Machine Learning.
The course starts with the basics and builds on them each week. We start with the simplest 1-layer neural network in the second week and complete an L-layer neural network in the fourth week.
Week 1 – Introduction to deep learning
The first week introduces Neural Networks and Deep Learning:
- Introduction to Neural Network
- Supervised learning with Neural Network
- Reasons for Deep Learning taking off
Week 2 – Neural Networks Basics
The second week is, in my opinion, the foundation week for the specialization. Many of the important concepts of Deep Learning are discussed. There are also refreshers on calculus and linear algebra.
- Binary Classification
- Logistic Regression
- Cost Function
- Gradient Descent
- Computation graph and Derivatives with a Computation Graph
- Vectorization and Vectorizing Logistic Regression/Gradient Output
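The vectorized forward and backward pass for logistic regression covered this week can be sketched in NumPy roughly as follows; the function and variable names are mine, not the course's, and the shapes follow the convention of one example per column:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def propagate(w, b, X, Y):
    """One vectorized pass over all m examples at once.
    X: (n_features, m), Y: (1, m), w: (n_features, 1)."""
    m = X.shape[1]
    A = sigmoid(w.T @ X + b)                       # predictions, shape (1, m)
    cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
    dw = (X @ (A - Y).T) / m                       # gradient w.r.t. w
    db = np.sum(A - Y) / m                         # gradient w.r.t. b
    return dw, db, cost
```

The point of vectorization is that there is no explicit loop over the m examples: a single matrix product replaces it, which is dramatically faster in NumPy.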
Week 3 – Shallow neural networks
The lectures from the third week teach how to build a neural network with one hidden layer using forward propagation and back propagation.
- Two-layer Neural Network
- Neural Network Representation
- Computing a Neural Network’s Output
- Vectorizing across multiple examples
- Activation functions, the need for non-linear activation functions, and derivatives of activation functions
- Gradient Descent for Neural Networks
- Random Initialization
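Putting the week's topics together, random initialization and the forward pass of a one-hidden-layer network might look like this (a sketch with a tanh hidden layer and sigmoid output, which is the setup the course uses; the helper names are my own):

```python
import numpy as np

def init_params(n_x, n_h, n_y, seed=1):
    """Small random weights break symmetry between hidden units;
    initializing biases to zero is fine."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.standard_normal((n_h, n_x)) * 0.01,
        "b1": np.zeros((n_h, 1)),
        "W2": rng.standard_normal((n_y, n_h)) * 0.01,
        "b2": np.zeros((n_y, 1)),
    }

def forward(X, p):
    """Forward pass: tanh hidden layer, sigmoid output layer."""
    Z1 = p["W1"] @ X + p["b1"]
    A1 = np.tanh(Z1)
    Z2 = p["W2"] @ A1 + p["b2"]
    A2 = 1.0 / (1.0 + np.exp(-Z2))   # output probabilities in (0, 1)
    return A2, (Z1, A1, Z2)
```

If the weights were initialized to zeros instead, every hidden unit would compute the same function and receive the same gradient, so the network would never learn distinct features; that is why random initialization matters.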
Week 4 – Deep Neural Networks
In the last week, we learn about the key computations underlying deep learning, use them to build and train deep neural networks, and apply them to computer vision.
- Deep L-layer neural network
- Forward Propagation in a Deep Network
- Getting your matrix dimensions right
- Purpose of Deep Representations
- Building blocks of deep neural networks
- Forward and Backward Propagation
- Parameters vs Hyperparameters
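"Getting your matrix dimensions right" boils down to one rule: for layer l with n[l] units, W[l] has shape (n[l], n[l-1]) and b[l] has shape (n[l], 1). A quick sanity check (the layer sizes here are an arbitrary example, not from the course):

```python
import numpy as np

# Example architecture: a flattened 64x64x3 image input, then 4 layers.
layer_dims = [12288, 20, 7, 5, 1]

params = {}
for l in range(1, len(layer_dims)):
    # W[l]: (units in layer l, units in layer l-1); b[l]: (units in layer l, 1)
    params[f"W{l}"] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
    params[f"b{l}"] = np.zeros((layer_dims[l], 1))

for l in range(1, len(layer_dims)):
    assert params[f"W{l}"].shape == (layer_dims[l], layer_dims[l - 1])
    assert params[f"b{l}"].shape == (layer_dims[l], 1)
```

Checking shapes like this before writing the propagation code catches most bugs early, since a wrong dimension fails loudly at the first matrix multiply.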
There are 4 required assignments and 1 optional one. All the assignments are in Python and are submitted through Jupyter Notebooks. The notebooks run on Coursera's servers, so no setup is required on your local machine.
Assignment 0 – Python Basics with Numpy
This is an optional assignment to practice numpy, vectorization, and broadcasting. We implement deep learning functions such as sigmoid, the sigmoid gradient, and softmax. We also practice normalizing inputs, vectorization, and broadcasting.
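Two of these functions might be sketched as follows; the max-subtraction trick in softmax is a standard numerical-stability detail, and broadcasting does the per-row division without any loops:

```python
import numpy as np

def sigmoid(x):
    # Works elementwise on scalars, vectors, and matrices alike.
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Row-wise softmax. Subtracting each row's max before exponentiating
    # avoids overflow without changing the result.
    e = np.exp(x - np.max(x, axis=1, keepdims=True))
    return e / np.sum(e, axis=1, keepdims=True)  # broadcasting: (n,m) / (n,1)
```

The `keepdims=True` arguments are what make broadcasting line up: the row sums keep shape (n, 1) so they divide each row of the (n, m) matrix.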
Assignment 1 – Logistic Regression with a Neural Network Mindset
We build our first model: an image recognition algorithm using Logistic Regression with a Neural Network mindset. Logistic regression can be thought of as a 1-layer neural network. The image recognition algorithm classifies pictures as cat or not-cat with 70% accuracy. The steps are:
- Define the model structure with input features
- Initialize the model’s parameters
- Calculate loss using forward propagation
- Calculate gradient using backward propagation
- Update parameters using gradient descent
- Build a model with a neural network mindset
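The steps above can be sketched as a minimal training loop (the learning rate and iteration count here are arbitrary placeholders, and the function name is my own):

```python
import numpy as np

def train_logistic(X, Y, lr=0.05, iters=1000):
    """X: (n_features, m), Y: (1, m) of 0/1 labels."""
    n, m = X.shape
    w, b = np.zeros((n, 1)), 0.0                  # initialize parameters
    for _ in range(iters):
        A = 1.0 / (1.0 + np.exp(-(w.T @ X + b)))  # forward propagation
        dw = (X @ (A - Y).T) / m                  # backward propagation
        db = np.sum(A - Y) / m
        w -= lr * dw                              # gradient descent update
        b -= lr * db
    return w, b
```

Predictions are then made by thresholding the sigmoid output at 0.5: an example is labeled "cat" when the predicted probability exceeds one half.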
Assignment 2 – Planar data classification with a hidden layer
We build a 2-layer neural network to classify red and blue points arranged in the shape of a flower. At first, we use logistic regression to classify the points, but it doesn't perform well. We then build a 2-layer neural network with the following structure, which reaches an accuracy of 90%.
As you can see from the pictures above, logistic regression doesn't classify the points well because the dataset is not linearly separable. The 2-layer neural network performs much better.
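The backward pass for this network is the part the assignment has you derive by hand. Under the same setup as the lectures (tanh hidden layer, sigmoid output, cross-entropy loss), the gradients might look like this sketch; the cache and parameter names are mine:

```python
import numpy as np

def backward(X, Y, params, cache):
    """Gradients for a tanh-hidden, sigmoid-output network.
    X: (n_x, m), Y: (1, m); cache holds (A1, A2) from the forward pass."""
    m = X.shape[1]
    A1, A2 = cache
    dZ2 = A2 - Y                                  # sigmoid + cross-entropy simplify to this
    dW2 = (dZ2 @ A1.T) / m
    db2 = np.sum(dZ2, axis=1, keepdims=True) / m
    dZ1 = (params["W2"].T @ dZ2) * (1 - A1 ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1 = (dZ1 @ X.T) / m
    db1 = np.sum(dZ1, axis=1, keepdims=True) / m
    return {"dW1": dW1, "db1": db1, "dW2": dW2, "db2": db2}
```

Each gradient has exactly the same shape as the parameter it updates, which is a handy invariant to assert while debugging.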
Assignment 3 – Building your Deep Neural Network: Step by Step
In this assignment, we build all the building blocks of an L-layer neural network. We implement functions for initializing parameters, forward propagation, computing loss, backward propagation, and updating the parameters. These functions are then reused in the next assignment.
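Once the building blocks exist, the full L-layer forward pass is just a loop over them. A sketch, assuming the course's choice of ReLU hidden layers and a sigmoid output layer (the parameter-dictionary convention is the same as in the dimension check above, with keys W1..WL and b1..bL):

```python
import numpy as np

def relu(z):
    return np.maximum(0, z)

def l_model_forward(X, params):
    """ReLU for the L-1 hidden layers, sigmoid for the output layer.
    params holds W1..WL and b1..bL, hence len(params) == 2 * L."""
    L = len(params) // 2
    A = X
    for l in range(1, L):
        A = relu(params[f"W{l}"] @ A + params[f"b{l}"])   # hidden layers
    ZL = params[f"W{L}"] @ A + params[f"b{L}"]            # output layer
    return 1.0 / (1.0 + np.exp(-ZL))
```

The backward pass mirrors this loop in reverse, consuming cached activations layer by layer, which is why the assignment has you cache them during the forward pass.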
Assignment 4 – Deep Neural Network Application
This is the most exciting assignment in the course. We use all the helper functions we developed in the previous assignment to build an L-Layer neural network that classifies cat vs. non-cat images with an accuracy of 80%.
I tested my neural network classifier with a few pictures and below are the results of the classification.
Here are a few examples where the classifier correctly classified the pictures as cats.
In the below examples, the neural network classifier correctly classified the pictures as not-cat.
The classifier incorrectly classified the picture below as not-cat even though it's a picture of a cat. This is expected: the network's accuracy on the test dataset was 80%, so roughly 20% of predictions will be incorrect.
Neural Networks and Deep Learning is the first course in the Deep Learning Specialization program. Over the course of 4 weeks, we learned all the foundations required to build an L-layer neural network in Python that classifies pictures as cat or not-cat. I recommend it to anyone with some Python programming experience and an interest in Deep Learning. I thoroughly enjoyed the course and learned the building blocks of neural networks. I look forward to taking the second course in the specialization – Improving Deep Neural Networks: Hyperparameter Tuning, Regularization, and Optimization.