Delve into the history of deep learning and explore neural networks such as the perceptron: how they function and what architectures underpin them. Complete short coding assignments in Python.
Deep Learning Essentials
This course is part of AI and Machine Learning Essentials with Python Specialization
Instructor: Chris Callison-Burch
Sponsored by ITC-Infotech
What you'll learn
- Understand the history and context of the deep learning field, and explore what "intelligence" really means.
- Explore deep learning models such as the perceptron and neural networks, along with backpropagation and the other techniques that drive them.
- Code a project in Python where you will preprocess data and use it to train a Support Vector Machine (SVM); a sketch of this workflow appears after this list.
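As a taste of that project, here is a minimal sketch of the preprocess-then-train workflow using scikit-learn; the synthetic dataset, the RBF kernel, and the hyperparameters are illustrative assumptions, not the course's actual assignment.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic data standing in for the course dataset (illustrative assumption).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Preprocess (standardize features) and train the SVM in a single pipeline.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```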
Details to know
- Shareable certificate to add to your LinkedIn profile
- 12 assignments
- September 2024
Build your subject-matter expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV
Share it on social media and in your performance review
There are 4 modules in this course
In this module, we'll first look back through history, discussing the different ways in which people have attempted to build artificial intelligence and exploring what intelligence is made up of. Then, we'll begin our investigation of an early model called the perceptron.
What's included
11 videos, 2 readings, 3 assignments, 1 discussion prompt
In this module, we will continue exploring the perceptron. We'll delve into stochastic gradient descent (SGD), a fundamental optimization technique that lets the perceptron, and other models, learn from data by iteratively updating the model's parameters to minimize error. Afterward, we will look at kernel methods: techniques that can separate two sets of points in more complicated ways, drawing inspiration from how the human eye works. A minimal sketch of this kind of update rule appears below.
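To make the idea of iterative, per-example updates concrete, here is a minimal sketch (not the course's assignment code) of a perceptron whose weights are adjusted one misclassified example at a time; the toy data, learning rate, and number of epochs are illustrative assumptions.

```python
import numpy as np

# Toy, linearly separable data (an illustrative assumption, not course data):
# points above the line y = x get label +1, points below get label -1.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.where(X[:, 1] > X[:, 0], 1, -1)

w = np.zeros(2)   # weight vector
b = 0.0           # bias term
lr = 0.1          # learning rate (illustrative choice)

for epoch in range(10):
    for xi, yi in zip(X, y):
        # Update the parameters only when this example is misclassified,
        # nudging the decision boundary toward the mistake -- one small
        # stochastic step per example rather than one big step per pass.
        if yi * (np.dot(w, xi) + b) <= 0:
            w += lr * yi * xi
            b += lr * yi

predictions = np.sign(X @ w + b)
print("training accuracy:", (predictions == y).mean())
```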
What's included
11 videos, 3 assignments, 1 programming assignment
In this module, we will move on to fully connected networks. These are more sophisticated models that can be thought of as perceptrons stacked one on top of another: each layer takes inputs from the layer below it, works to separate the data points (such as the red and blue scattered points) a little better than the layer before it, and then passes its output on to the next layer. A small sketch of this layering appears below.
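The layering idea can be sketched in a few lines of NumPy; this is only an illustration under assumed layer sizes and random weights, not the course's implementation.

```python
import numpy as np

def relu(z):
    # Elementwise nonlinearity applied between layers.
    return np.maximum(0.0, z)

def forward(x, layers):
    # Pass the input through each fully connected layer in turn: every
    # layer transforms the previous layer's output with its own weights
    # and bias, refining the representation before handing it upward.
    h = x
    for W, b in layers[:-1]:
        h = relu(h @ W + b)
    W_last, b_last = layers[-1]
    return h @ W_last + b_last   # linear output layer

# Illustrative shapes (an assumption): 2 inputs -> 4 hidden units -> 1 output.
rng = np.random.default_rng(0)
layers = [
    (rng.normal(size=(2, 4)), np.zeros(4)),
    (rng.normal(size=(4, 1)), np.zeros(1)),
]
x = np.array([[0.5, -1.0]])      # a single 2-dimensional input
print(forward(x, layers))
```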
What's included
8 videos, 3 assignments, 1 discussion prompt
We will finish the course by looking at backpropagation, an algorithm for training neural networks that finds the set of weights that minimizes error on the data. Backpropagation applies the chain rule from calculus to efficiently compute the gradients of the loss function with respect to the weights, allowing the model to update each weight in the direction opposite its gradient. We'll also discuss the importance of typical datasets consisting of images, sentences, and sounds, and how neural networks can learn from the spatial regularities present in such data. A minimal worked example of these gradient updates appears below.
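Here is a minimal sketch of those chain-rule gradient calculations for a one-hidden-layer network with a squared-error loss; the data, network sizes, and learning rate are illustrative assumptions rather than the course's code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))          # 8 examples, 2 features (toy data)
y = rng.normal(size=(8, 1))          # regression targets (toy data)

W1, b1 = rng.normal(size=(2, 3)), np.zeros(3)
W2, b2 = rng.normal(size=(3, 1)), np.zeros(1)
lr = 0.05                            # learning rate (illustrative choice)

for _ in range(100):
    # Forward pass.
    z1 = X @ W1 + b1
    h1 = np.maximum(0.0, z1)          # ReLU hidden layer
    pred = h1 @ W2 + b2
    loss = np.mean((pred - y) ** 2)

    # Backward pass: apply the chain rule layer by layer, propagating the
    # loss gradient from the output back toward the input.
    dpred = 2 * (pred - y) / len(X)   # dL/dpred
    dW2 = h1.T @ dpred                # dL/dW2
    db2 = dpred.sum(axis=0)
    dh1 = dpred @ W2.T                # dL/dh1
    dz1 = dh1 * (z1 > 0)              # ReLU derivative
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0)

    # Gradient descent: move each weight opposite its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
```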
What's included
8 videos, 1 reading, 3 assignments, 1 programming assignment