In this one-hour guided project, you will learn to create and train multi-task, multi-output models with Keras. You will use Keras' functional API to build a model with a single input and two outputs, trained to predict two different labels for the same input example. A few of the shallow layers will be shared between the two outputs, and the model will also use a ResNet-style skip connection. If you are familiar with Keras, you have probably come across models trained to perform multiple tasks: for example, an object detection model in which a CNN both classifies the object instances present in an input image and produces regression outputs that localize them. Being able to use Keras' functional API is a first step towards building complex, multi-output models like these.
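To make this concrete, here is a minimal sketch of what such a functional-API model can look like. The layer sizes, layer names, and the two example tasks (a digit class and an "is the digit even" flag) are hypothetical illustrations, not the exact architecture built in the project:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

def build_multi_output_model(input_shape=(28, 28, 1)):
    # One input shared by both tasks
    inputs = layers.Input(shape=input_shape, name="image")

    # Shared shallow layers
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
    shortcut = x  # keep a reference for the skip connection

    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(32, 3, padding="same")(x)

    # ResNet-style skip connection: add the earlier activation back in
    x = layers.Add()([shortcut, x])
    x = layers.Activation("relu")(x)

    x = layers.MaxPooling2D()(x)
    x = layers.Flatten()(x)
    x = layers.Dense(64, activation="relu")(x)

    # Two output heads, each trained on a different label for the same input
    digit_output = layers.Dense(10, activation="softmax", name="digit")(x)
    parity_output = layers.Dense(1, activation="sigmoid", name="is_even")(x)

    return Model(inputs=inputs, outputs=[digit_output, parity_output])

model = build_multi_output_model()
model.summary()
```

Passing a list (or dict) of output tensors to `Model` is what makes this a multi-output model; everything before the two `Dense` heads is shared computation.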
What you'll learn
- Creating multi-task models with Keras
- Training multi-task models with Keras
Details to know
- Add to your LinkedIn profile
- Only available on desktop
Learn, practice, and apply job-ready skills in less than 2 hours
- Receive training from industry experts
- Gain hands-on experience solving real-world job tasks
- Build confidence using the latest tools and technologies
Learn step-by-step
In a video that plays in a split-screen with your work area, your instructor will walk you through these steps (a rough code sketch of the dataset-generator and training steps follows the list):
- Introduction (3 min)
- Create Dataset (8 min)
- Dataset Generator (7 min)
- Create Model (18 min)
- Training the Model (7 min)
- Final Predictions (4 min)
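The page does not spell out the dataset or hyperparameters used in the project, so the following is only a rough sketch of how the "Dataset Generator", "Training the Model", and "Final Predictions" steps can look for a two-output model. It assumes MNIST-style digit images (one learner review below mentions MNIST), the hypothetical `digit`/`is_even` output names from the sketch above, and arbitrary batch sizes and loss weights:

```python
import numpy as np
import tensorflow as tf

# Create Dataset: MNIST images, normalized to [0, 1]
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0

# Dataset Generator: yields a batch of images and a dict with two labels per example
def data_generator(images, labels, batch_size=64):
    while True:
        idx = np.random.randint(0, len(images), batch_size)
        digit = labels[idx]                                           # task 1: which digit
        is_even = (labels[idx] % 2 == 0).astype("float32")[:, None]   # task 2: parity
        yield images[idx], {"digit": digit, "is_even": is_even}

# Training the Model: one loss (and optional weight/metric) per named output
model.compile(  # `model` is the multi-output model sketched earlier
    optimizer="adam",
    loss={"digit": "sparse_categorical_crossentropy", "is_even": "binary_crossentropy"},
    loss_weights={"digit": 1.0, "is_even": 0.5},
    metrics={"digit": "accuracy", "is_even": "accuracy"},
)
model.fit(data_generator(x_train, y_train), steps_per_epoch=500, epochs=5)

# Final Predictions: predict() returns one array per output head
digit_probs, even_probs = model.predict(x_train[:8])
```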
Recommended experience
Prior programming experience in Python and a conceptual understanding of neural networks are expected. Prior experience with TensorFlow and Keras is recommended.
How you'll learn
Skill-based, hands-on learning
Practice new skills by completing job-related tasks.
Expert guidance
Follow along with pre-recorded videos from experts using a unique side-by-side interface.
No downloads or installation required
Access the tools and resources you need in a pre-configured cloud workspace.
Available only on desktop
This Guided Project is designed for laptops or desktop computers with a reliable Internet connection, not mobile devices.
Learner reviews
75 reviews
- 5 stars: 76%
- 4 stars: 18.66%
- 3 stars: 4%
- 2 stars: 0%
- 1 star: 1.33%
Showing 3 of 75
Reviewed on Feb 24, 2023
Fantastic course and very easy to follow on implementing multi-task learning on the MNIST dataset. Thank you very much!
Reviewed on May 14, 2021
Amit is awesome. You are one of the best instructors/teachers I have ever seen in my life.
Reviewed on Jun 30, 2021
This course is pretty good; I learned many concepts in one hour. The instructor is also very good, and his way of explaining made it quick to understand.
Frequently asked questions
Why isn't this Guided Project available on mobile devices?
Because your workspace contains a cloud desktop that is sized for a laptop or desktop computer, Guided Projects are not available on your mobile device.
Who are Guided Project instructors?
Guided Project instructors are subject matter experts who have experience in the skill, tool, or domain of their project and are passionate about sharing their knowledge to impact millions of learners around the world.
Can I download the files I create in the Guided Project?
You can download and keep any of your created files from the Guided Project. To do so, you can use the "File Browser" feature while you are accessing your cloud desktop.