Deep Learning is the go-to technique for many applications, from natural language processing to biomedicine. Deep learning can handle many different types of data, such as images, text, voice/sound, graphs, and so on. This course will cover the basics of DL, including how to build and train multilayer perceptrons (MLPs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), autoencoders (AE), and generative adversarial networks (GANs). The course includes several hands-on projects, including cancer detection with CNNs, RNNs on disaster tweets, and generating dog images with GANs.
Introduction to Deep Learning
This course is part of Machine Learning: Theory and Hands-on Practice with Python Specialization
Instructor: Geena Kim
10,961 already enrolled
What you'll learn
Apply different optimization methods while training neural networks and explain their different behaviors.
Use cloud tools and deep learning libraries to implement CNN architectures and train them for image classification tasks.
Apply deep learning packages to sequential data; build, train, and tune models.
Build your subject-matter expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV
Share it on social media and in your performance review
There are 5 modules in this course
We are starting off the course with a busy week. This week's module has two parts. In the first part, after a quick introduction to Deep Learning's exciting applications in self-driving cars, medical imaging, and robotics, we will learn about artificial neurons called perceptrons. Interestingly, neural networks are loosely modeled on the human brain, with perceptrons mimicking neurons. After we learn to train a simple perceptron (and become aware of its limitations), we will move on to more complex multilayer perceptrons. The second part of the module introduces the backpropagation algorithm, which trains a neural network by applying the chain rule backward through the network. We will finish by learning how deep learning libraries like TensorFlow create computation graphs for gradient computation. This week, you will have two short quizzes, a Jupyter lab programming assignment, and an accompanying Peer Review assignment. This material, notably the backpropagation algorithm, is so foundational to Deep Learning that it is essential to take the time necessary to work through and understand it.
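To see what a computation graph buys us, here is a minimal sketch (not part of the course materials) that trains a single perceptron for one step using TensorFlow's GradientTape; the toy AND-gate data, learning rate, and variable names are illustrative assumptions.

```python
import tensorflow as tf

# Toy data: 4 samples of the AND function (illustrative only).
X = tf.constant([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = tf.constant([[0.], [0.], [0.], [1.]])

# A single perceptron: weights w, bias b, sigmoid activation.
w = tf.Variable(tf.random.normal([2, 1]))
b = tf.Variable(tf.zeros([1]))

with tf.GradientTape() as tape:
    # The forward pass builds the computation graph that the tape records.
    logits = tf.matmul(X, w) + b
    preds = tf.sigmoid(logits)
    loss = tf.reduce_mean(tf.square(preds - y))

# Backpropagation: the chain rule applied through the recorded graph.
dw, db = tape.gradient(loss, [w, b])

# One step of gradient descent on the perceptron's parameters.
w.assign_sub(0.5 * dw)
b.assign_sub(0.5 * db)
```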
What's included
6 videos · 7 readings · 2 quizzes · 1 programming assignment · 1 peer review · 1 discussion prompt
Last week, we built our Deep Learning foundation, learning about perceptrons and the backprop algorithm. This week, we are learning about optimization methods. We will start with Stochastic Gradient Descent (SGD). SGD has several design parameters that we can tweak, including learning rate, momentum, and decay. Then we will turn our attention to advanced gradient descent methods like learning rate scheduling and Nesterov momentum. Beyond vanilla gradient descent, we will also cover adaptive optimization algorithms such as AdaGrad, AdaDelta, RMSprop, and Adam. We will cover general tips to reduce overfitting while training neural networks, including regularization methods like dropout and batch normalization. This week, you will build your DL toolkit, gaining experience with the Python library Keras. Assessments for the week include a quiz and a Jupyter lab notebook with an accompanying Peer Review. This assignment is your last Jupyter lab notebook for the course. For the next three weeks, you will build hands-on experience and complete weekly mini-projects that incorporate Kaggle challenges.
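As a rough illustration of the kind of Keras code you will write this week (a sketch, not the actual assignment), the snippet below compiles a small network that combines dropout and batch normalization with an SGD optimizer using Nesterov momentum and a decaying learning rate; the layer sizes and hyperparameters are arbitrary placeholders.

```python
from tensorflow import keras
from tensorflow.keras import layers

# A small fully connected network with batch normalization and dropout,
# the two regularization methods discussed this week.
model = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(256, activation="relu"),
    layers.BatchNormalization(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax"),
])

# SGD with a decaying learning rate schedule and Nesterov momentum.
schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=0.1, decay_steps=1000, decay_rate=0.96)
optimizer = keras.optimizers.SGD(learning_rate=schedule,
                                 momentum=0.9, nesterov=True)

model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=10, validation_split=0.1)
```

Swapping `optimizer` for `"adam"` or `keras.optimizers.RMSprop()` is a one-line change, which makes it easy to compare the optimizers' behavior on the same model.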
What's included
6 videos · 2 readings · 1 quiz · 1 peer review · 1 ungraded lab
This module will teach you a type of neural network called the convolutional neural network (CNN), suitable for image analysis tasks. We will learn about definitions, design parameters, operations, hyperparameter tuning, and applications. There is no Jupyter lab notebook this week. You will have a brief quiz and participate in a clinically relevant Kaggle challenge mini-project. Evaluating whether cancer has spread to the sentinel lymph node is critical for staging breast cancer. You will build a CNN model to classify whether digital pathology images show that cancer has spread to the lymph nodes. This project uses the PCam dataset, which has an approachable size, with the authors noting that "Models can easily be trained on a single GPU in a couple of hours, and achieve competitive scores." As you prepare for the week, look over the rubric and develop a plan for how you will complete the project. A project like this needs a timeframe that allows you to run experiments, so start early. The expectation is not that you will cram the equivalent of a final project into a single week or that you need a top leaderboard score to receive a good grade for this project. Hopefully, you will have time to achieve some exciting results to show off in your portfolio.
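For orientation, here is a hedged sketch of what a baseline CNN for this task might look like in Keras. PCam patches are 96x96 RGB images, but the layer choices and hyperparameters below are illustrative assumptions, not a recommended solution.

```python
from tensorflow import keras
from tensorflow.keras import layers

# PCam patches are 96x96 RGB images; the label indicates whether the
# central region contains metastatic tissue. Architecture is illustrative.
model = keras.Sequential([
    layers.Input(shape=(96, 96, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),  # binary: metastasis vs. not
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.AUC(name="auc")])
model.summary()
```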
What's included
11 videos · 2 readings · 1 quiz · 1 peer review
This module will teach you another type of neural network, the recurrent neural network (RNN), for handling sequential data. So far, we have covered feed-forward neural networks, including multilayer perceptrons and CNNs. However, in biological systems information can flow backward as well as forward; RNNs, which feed information back through recurrent connections, are a step closer to biological systems in this respect. RNNs have excellent benefits, especially for text data, since sharing weights across time steps reduces the number of parameters. We will learn about the vanishing and exploding gradient problems that can arise when working with vanilla RNNs and remedies for those problems, including GRU and LSTM cells. We don't have a quiz this week, but we have a Kaggle challenge mini-project on NLP with Disaster Tweets. The project is a Getting Started competition designed for learners building their machine learning background. The challenge is very doable in a week, but make sure to start early so you can run experiments and iterate a bit.
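Below is a minimal sketch of the kind of model you might start from for the Disaster Tweets project, assuming the tweets have already been tokenized and padded to a fixed length; the vocabulary size, sequence length, and layer sizes are hypothetical placeholders, not the course's reference solution.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical vocabulary and sequence-length settings for tweet text.
VOCAB_SIZE = 20_000
MAX_LEN = 40

model = keras.Sequential([
    layers.Input(shape=(MAX_LEN,)),
    layers.Embedding(VOCAB_SIZE, 128),       # token ids -> dense vectors
    layers.Bidirectional(layers.LSTM(64)),   # LSTM cells mitigate vanishing gradients
    layers.Dropout(0.3),
    layers.Dense(1, activation="sigmoid"),   # disaster vs. not disaster
])

model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Replacing `layers.LSTM(64)` with `layers.GRU(64)` or a plain `layers.SimpleRNN(64)` is a quick way to compare the cell types discussed this week.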
What's included
4 videos · 2 readings · 1 peer review
This module will focus on neural network models trained via unsupervised learning. We will cover autoencoders and GANs as examples. We will consider the famous AI researcher Yann LeCun's cake analogy for reinforcement learning, supervised learning, and unsupervised learning. Supervised Deep Learning has had tremendous success, mainly due to the availability of massive labeled datasets like ImageNet. However, it is expensive and challenging to obtain labeled data for areas like biomedical images, so there is great motivation to continue developing unsupervised Deep Learning approaches that can harness abundant unlabeled data sources. This week is the last week of new course material; there is no quiz or Jupyter notebook lab. Generative adversarial networks (GANs) learn to generate new data with the same statistics as the training set. This week, you will wrap up one final Kaggle mini-project, in which you will experiment with creating a network to generate images of puppies.
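To make the adversarial setup concrete, here is a minimal generator/discriminator pair sketched in Keras; the 64x64 output size, latent dimension, and layer choices are illustrative assumptions, not the required architecture for the mini-project.

```python
from tensorflow import keras
from tensorflow.keras import layers

LATENT_DIM = 100  # size of the random noise vector (illustrative)

# Generator: maps a noise vector to a 64x64 RGB image.
generator = keras.Sequential([
    layers.Input(shape=(LATENT_DIM,)),
    layers.Dense(8 * 8 * 128),
    layers.Reshape((8, 8, 128)),
    layers.Conv2DTranspose(128, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
])

# Discriminator: classifies images as real or generated.
discriminator = keras.Sequential([
    layers.Input(shape=(64, 64, 3)),
    layers.Conv2D(64, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Conv2D(128, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Flatten(),
    layers.Dense(1, activation="sigmoid"),
])

# The two networks are trained adversarially: the discriminator learns to
# tell real images from fakes, while the generator learns to fool it.
```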
What's included
4 videos · 2 readings · 1 peer review
Build toward a degree
This course is part of the following degree program(s) offered by the University of Colorado Boulder. If you are admitted and enroll, your completed coursework may count toward your degree, and your progress can transfer with you.
Frequently asked questions
A cross-listed course is offered under two or more CU Boulder degree programs on Coursera. For example, Dynamic Programming, Greedy Algorithms is offered as both CSCA 5414 for the MS-CS and DTSA 5503 for the MS-DS.
· You may not earn credit for more than one version of a cross-listed course.
· You can identify cross-listed courses by checking your program’s student handbook.
· Your transcript will be affected. Cross-listed courses are considered equivalent when evaluating graduation requirements. However, we encourage you to take your program's versions of cross-listed courses (when available) to ensure your CU transcript reflects the substantial amount of coursework you are completing directly in your home department. Any courses you complete from another program will appear on your CU transcript with that program’s course prefix (e.g., DTSA vs. CSCA).
· Programs may have different minimum grade requirements for admission and graduation. For example, the MS-DS requires a C or better on all courses for graduation (and a 3.0 pathway GPA for admission), whereas the MS-CS requires a B or better on all breadth courses and a C or better on all elective courses for graduation (and a B or better on each pathway course for admission). All programs require students to maintain a 3.0 cumulative GPA for admission and graduation.
Yes. Cross-listed courses are considered equivalent when evaluating graduation requirements. You can identify cross-listed courses by checking your program’s student handbook.
You may upgrade and pay tuition during any open enrollment period to earn graduate-level CU Boulder credit for this course. Because this course is cross-listed in both the MS in Computer Science and the MS in Data Science programs, you will need to determine which program you would like to earn the credit from before you upgrade.
MS in Data Science (MS-DS) Credit: To upgrade to the for-credit data science (DTSA) version of this course, use the MS-DS enrollment form. See How It Works.
MS in Computer Science (MS-CS) Credit: To upgrade to the for-credit computer science (CSCA) version of this course, use the MS-CS enrollment form. See How It Works.
If you are unsure of which program is the best fit for you, review the MS-CS and MS-DS program websites, and then contact datascience@colorado.edu or mscscoursera-info@colorado.edu if you still have questions.