
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (63,175 ratings)

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will learn the best practices to train and develop test sets and analyze bias/variance for building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence; and implement a neural network in TensorFlow.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.
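To make one of the optimization algorithms mentioned above concrete, here is a minimal NumPy sketch of a single Adam update step. The function name, default hyperparameters, and variable names are illustrative and are not taken from the course materials.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single parameter array (illustrative sketch only)."""
    m = beta1 * m + (1 - beta1) * grad          # momentum-like first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # RMSprop-like second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction (t counts from 1)
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

In practice each parameter keeps its own m and v (initialized to zeros), and t is incremented once per mini-batch.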

Top reviews

AS

Apr 18, 2020

Very good course that gives you deep insight into how to enhance your algorithm and neural network and improve its accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.

XG

Oct 30, 2017

Thank you Andrew!! I have now started to use TensorFlow; however, this tool is not well suited for research. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.


Reviews 6026–6050 of 7,253 for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Bilal M

Dec 11, 2017

:)

By MohammadSadegh Z

Jun 15, 2021

By 홍승은

May 7, 2021

-

By eduardo e

Jul 5, 2020

d

By Vinish R

May 27, 2020

g

By Yuanfang S

Jun 8, 2019

-

By Sarah Z

Nov 7, 2018

V

By 张帆

Aug 25, 2018

By Insoo K

Jul 26, 2018

.

By Gilles A

Apr 13, 2018

G

By Hamidou S

Feb 28, 2018

V

By veera Y

Jan 28, 2018

G

By Yujie C

Jan 4, 2018

By Harish K

Nov 25, 2017

G

By Sanguk P

Oct 27, 2017

w

By Ed M

Sep 20, 2017

E

By StudyExchange

Aug 20, 2017

V

By D. R

Oct 1, 2019

(09/2019)

Overall the courses in the specialization are great and provide a great introduction to these topics, as well as practical experience. Many topics are explained clearly, with valuable insight from field practitioners, and you are given quizzes and code exercises that help deepen your understanding of how to implement the concepts from the videos. I would recommend taking them after Andrew Ng's initial Stanford ML course, unless you have prior background in this topic.

There are a few shortcomings:

1 - the video editing is poor and sloppy. It's not too bad, but it can sometimes be a bit annoying.

2 - most of the exercises are too easy and are almost copy-paste. I need to go over them and create variations of them in order to strengthen my practical skills. Some exercises are quite challenging though (especially in courses 4 and 5), and I need to go over them just to really nail them down, as things scale up quickly. Course 3 has no exercises, as it's more theoretical. Some exercises have bugs, so make sure to look at the discussion board for tips (the final exercise has a huge bug that was super annoying).

3 - there are no summary readings, so you have to (re)watch the videos in order to check something, which is annoying. This is partially solved because the exercises themselves usually contain a lot of (textual) summary, with equations.

4 - the 3rd course was a bit less interesting in my opinion, but I did learn some stuff from it. So in the end it’s worth it.

5 - the slide graphics and Andrew's handwriting could be improved.

6 - the online Coursera Jupyter notebook environment was a bit slow and sometimes got stuck.

Again overall - highly recommended

By Rameses

Nov 4, 2019

I have taken Machine Learning courses from Andrew Ng via Coursera before. I have always felt that the delivery of the material and the pedagogy are superb and have always given a 5-star rating, as I did for the first course in this specialization. This second course had several interesting topics I had never learned in my previous NN courses at universities. The programming exercises for weeks 1 and 2 were excellent in helping recap the material in the videos and slides. However, as far as TensorFlow is concerned, I was a bit disappointed because it seemed like we were muddling through the various code snippets rather than getting a firm grasp of what is obviously a very complex deep learning programming framework. But I understand the time limitations, and I realize that this intro to TensorFlow is merely meant to whet one's appetite and encourage us to explore more about this framework as well as other frameworks. I believe it is up to each individual to explore the concepts further and get a better understanding.

The technology behind the courses is awesome, as are the programming assignment notebooks, which were well documented and must have taken a gargantuan amount of time and effort to prepare.

In summary, I learned a lot from this course, and while not all of my course objectives were fulfilled, almost all of them were.

By Alexander W

Oct 24, 2021

First for the worst: all discussion for this course has been moved out of Coursera into Discord. I find Discord overloaded and confusing to navigate, especially in comparison to the neat and clear Coursera discussion groups. It requires an additional registration. Quite annoying for me, as I use Coursera mostly through the mobile app, where opening Discord involves an extra step and switching apps back and forth. No more support, help, or assistance is given here on Coursera. Guys, please reconsider this.

It seems this course is a little out of its creators' focus. Corrections are still made, but on separate pages instead of by editing the videos, which would make things so much easier. The assignments are still maintained and updated regularly.

Apart from this, the course was great! The quality and explanations are excellent, with many examples and (not too easy) exercises and assignments to practice what you learned. I would recommend it to someone with at least some contact with machine learning and Python.

By Alberto G

Nov 16, 2018

One more brilliant course by Professor Andrew Ng. It covered all the important topics in model training, testing, and hyperparameter tuning, ending with an introduction to TensorFlow. But even though Professor Ng stressed that the choice of a deep learning framework should consider whether the company supporting it has a long-term commitment to keeping it open source, he doesn't explain why he believes that Google will do so with TensorFlow. Even though I've been dealing with TF for a couple of years now, having completed both the AI and Deep Learning Nanodegrees at Udacity, I still struggle to understand its cryptic inner workings. I hoped that Professor Ng could bridge that gap, but unfortunately that was not the case. Maybe in later versions of this course he can explain a little better how exactly TensorFlow works - we know it does, but it is very frustrating to depend upon tons of Stack Overflow queries to get the syntax right and build even simple models. Thank God he uses Keras in later modules.

By Christopher T

Jan 6, 2018

I gave this less-than-perfect stars just because of a few technical issues. It really does deserve 5 stars, and I hope I can come back and update my review once the issues are addressed.

There seem to be a few bugs in the final TF project, but using the forums it was possible to figure out what I did wrong. In my case, I was computing the cost function using the untransposed forward-prop and target vectors. I'm actually glad that this issue came up, as it gave me a better idea of where difficult-to-find issues can appear.
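For anyone who runs into the same problem, here is a minimal sketch of the kind of fix involved, assuming the course's convention that the logits Z3 and labels Y are stored with examples in columns, i.e. with shape (num_classes, num_examples). The function name compute_cost and the shapes are assumptions for illustration, not the assignment's actual code.

```python
import tensorflow as tf

def compute_cost(Z3, Y):
    # tf.nn.softmax_cross_entropy_with_logits expects shape (num_examples, num_classes),
    # so with a (num_classes, num_examples) layout both tensors must be transposed first.
    # Forgetting the transpose is the hard-to-spot mistake described in this review.
    logits = tf.transpose(Z3)
    labels = tf.transpose(Y)
    return tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
```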

Despite not completing the last project successfully, I was able to score 100% on submission. Hopefully they can improve on this mis-scoring issue.

After reflecting a bit, the TF material seems a bit tacked on. Perhaps it deserves its own space? There is certainly a great deal that could be covered about this modern library.

Otherwise, it's an excellent course. Just a few rough edges to hammer out.

By Sebastiaan v E

Nov 17, 2017

Very good course in terms of material and explanation.

The autograder for grading the problems is not very good:

- it fails a lot due to "technical difficulties",

- it sometimes fails even though the solution is correct, e.g., a = a + b succeeds but a += b fails,

- it sometimes passes even though the solution is incorrect (I used gradient descent instead of Adam, but it still passed with a 100% score).

There were also quite a few typos and other errors in the last assignment, so it seems like it was made in a bit of a hurry. The forums list all the corrections, so it should be easy to fix this.

It would also be really nice to have an option to reset the notebook to its initial state. As it is now, if you accidentally mess it up or delete it, it's gone. My recommendation to students is to make a copy of the notebook before you start working on it.

The automatically generated captions also contain lots of errors.

By José D

Sep 7, 2019

This is Course 2 of the Deep Learning Specialization. In Course 1, we learned how to code the algorithm in NumPy. Most of Course 2 shows how to optimize and tune the algorithm and how to use and tune the hyperparameters. Most assignments are well designed and easy to perform, as they focus more on understanding than on "finding how to code it". However, the last assignment introduces TensorFlow, where we re-implement the algorithm using TensorFlow concepts. I have to say I expected TensorFlow to simplify things, but it turns out I find the math/NumPy implementation much easier to understand than TensorFlow. I'll have to dig deeper into TensorFlow concepts to understand it better. I would have liked more of a TensorFlow introduction. I hope the following courses will go into deeper detail. Nevertheless, a great course, and very instructive.

By Jörg N

Oct 19, 2019

I liked the course a lot and I really adore the way Andrew Ng teaches the subject. As an improvement suggestion, I would extend the course to four weeks to deepen the practice on hyperparameter tuning as well as the introduction to TensorFlow. The programming exercises of week 3 were really challenging: first, because there were partially misleading statements in the comments (Z before activation), and second, because variables were given the same names as tf parameters, and in places even as function definitions. So you could see things like a = a, b = b in tf function calls, which just does not work well for beginners in both Python (local variable concepts, etc.) and TF. I am more than grateful, though, that I could do this course of the specialization, and I would really like to express my deep gratitude to Andrew Ng.
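To illustrate the naming confusion described in this review (a constructed example, not the assignment's actual code): tf.matmul takes parameters named a and b, so when a notebook's local variables are also called a and b, a keyword-style call reads like a tautology.

```python
import tensorflow as tf

# Local variables that happen to share names with tf.matmul's parameters.
a = tf.constant([[1.0, 2.0]])
b = tf.constant([[3.0], [4.0]])

# The left-hand 'a'/'b' are tf.matmul's parameter names; the right-hand ones are our
# local variables. Technically fine, but confusing for newcomers to keyword arguments.
z = tf.matmul(a=a, b=b)

# Clearer: descriptive local names make the same call unambiguous.
weights = tf.constant([[1.0, 2.0]])
activations = tf.constant([[3.0], [4.0]])
z = tf.matmul(weights, activations)
```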