
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9
stars
63,187 ratings

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.

By the end, you will be able to: apply best practices for setting up train/dev/test sets and analyzing bias/variance when building deep learning applications; use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
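Among the optimization algorithms listed above, Adam combines the ideas of Momentum (a moving average of gradients) and RMSprop (a moving average of squared gradients). As a rough illustration only, not course material, here is a minimal NumPy sketch of the Adam update rule applied to a toy quadratic; the function name `adam_step` and the toy objective are illustrative choices, though the hyperparameter defaults (learning rate aside) match the commonly cited ones.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update on parameters theta, given the gradient at step t (1-indexed)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment: moving average of gradients (Momentum)
    v = beta2 * v + (1 - beta2) * grad**2    # second moment: moving average of squared gradients (RMSprop)
    m_hat = m / (1 - beta1**t)               # bias correction for the zero-initialized averages
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy problem: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([1.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 501):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t)

print(np.linalg.norm(theta))  # small: theta has converged near the minimum at 0
```

Note the bias correction: because `m` and `v` start at zero, the raw averages underestimate the true moments early on, and dividing by `1 - beta**t` compensates for that.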

Top reviews

HD

Dec 5, 2019

I enjoyed it; it is really helpful. I'd like to have the opportunity to implement all of these deeply in a real example.

The only thing I didn't have completely clear is batch norm; it is so confusing.

AS

Apr 18, 2020

Very good course to give you deep insight into how to enhance your algorithm and neural network and improve its accuracy. Also teaches you TensorFlow. Highly recommend, especially after the 1st course.

Filter by:

7126 - 7150 of 7,254 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By John D

•

Feb 13, 2021

The content was solid, but some of the labs seemed a bit buggy (I got full credit even though my code didn't run). I also wish the TensorFlow tutorial used TensorFlow 2.0.

By Debjit G

•

Jun 19, 2020

The course was amazing, as expected, but the quality of the videos needs improvement. Also, if the programming part were explained in the videos, that would be great. Thank you.

By Sagar B

•

Jun 15, 2020

Too many issues with the autograder system. The known errors need to be fixed to save users' time. I spent more than 3 hours in total just working around grader bugs.

By Yogeshwar j

•

May 24, 2020

It could have been more detailed and interesting. Compared to the first course of the specialization, this course's material didn't explain all the concepts clearly.

By Madhur S

•

Aug 4, 2020

Great course for a beginner like me. I wish, however, that the sizing of hidden layers/units had been addressed, as it is very difficult to find the optimum.

By Aniruddh B

•

Apr 15, 2020

Docked one star because of using Tensorflow 1.4 instead of 2.0. Docked another star because I found the course content less interesting than the first course.

By Kishore K

•

Sep 17, 2018

Some of the videos are very abstract and need a bit of mathematical intuition. These intuitions are best obtained by calculation rather than a lecture :)

By Yazid H

•

Oct 12, 2019

A bit too theoretical for my taste, lacks practical homework and getting our hands dirty. Really appreciated the final week's structure and topics.

By harmouchi m

•

May 6, 2018

Like usual, Andrew Ng gives a perfect explanation: simple, going straight to the essential stuff.

The minus point: some troubles with the notebooks.

Big thanks to Andrew Ng's team.

By Marco B

•

Apr 20, 2020

There are errors in some exercises (Adam, in week 2) still unsolved after over a year (I found the same error reported on the forum/discussion).

By Christian K

•

May 22, 2018

The lecture videos are good, but the assignments are not that useful, as they provide the answers within them and are somewhat repetitive.

By Pavel K

•

Aug 3, 2019

Lectures are good. Programming exercises are too easy and too mechanical; not much thinking required, à la "fill in the gaps" exercises.

By rupamita s

•

Jun 1, 2020

I would give five stars if not for that grading error issue. I hope it gets resolved for good. Otherwise, great course as usual.

By Robert D

•

Jan 4, 2022

Mostly good; the last programming assignment had some issues with the shapes required for various code sections not lining up properly.

By Carsten B

•

Jun 9, 2020

Interesting, but not nearly as good as the first one. Disjointed topics and unconnected exercises made this less digestible.

By Jean-Michel P

•

Jun 17, 2021

Decent class, but the last module (week) felt a bit rushed. Hopefully it was simply an introduction for the next class.

By Kang L T

•

Jan 25, 2019

I think more should be done regarding the TensorFlow framework, with more explanation given of what the functions did.

By Moustafa M

•

Dec 9, 2017

Lack of practice; lack of intuitions with good examples.

Less on TensorFlow; I don't know how to implement and deploy it.

By Mohammad E

•

Aug 14, 2020

The course and the material are great. However, the code in the labs has serious problems that should be solved.

By Lucas N A

•

Mar 6, 2020

Really helpful advice. I felt it was too focused on the implementation side, but I liked the intuition parts better.

By Rishabh G

•

Apr 28, 2020

Week 3 of the course does not have a practice problem for batch normalization. I wanted to implement it and learn.

By Ramachandran C

•

Oct 6, 2019

I found the video lectures useful to understand the concepts, but the programming exercises are over-simplified.

By Edmund C

•

Sep 30, 2024

Not as good as the other courses in the specialization track. It seems there was a good amount of repetition.

By Carlos V

•

Jun 20, 2020

Would give more stars if the final assignment used TensorFlow 2 and not an outdated version that is no longer in use.

By Pranshu D

•

Mar 6, 2018

More TensorFlow-related tutorials should have been included. The lectures got a little boring and redundant.