
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (63,068 ratings)

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will learn the best practices to train and develop test sets and analyze bias/variance for building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI....
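(Editorial aside: as a quick illustration of one of the optimizers listed above, a minimal NumPy sketch of the standard Adam update with bias correction; the function and variable names are illustrative and not taken from the course materials.)

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameter array w, given its gradient and moment estimates."""
    m = beta1 * m + (1 - beta1) * grad        # exponentially weighted mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2   # exponentially weighted mean of squared gradients
    m_hat = m / (1 - beta1 ** t)              # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy usage: minimize f(w) = w**2, whose gradient is 2*w.
w = np.array([5.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
print(w)  # approaches 0
```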

Top reviews

YL


Very useful course, especially the last TensorFlow assignment. The only reason I gave 4 stars is the lack of practice on batch norm, which I believe is one of the most useful techniques lately.

NC


Yet another excellent course by Professor Ng! Really helped me gain a detailed understanding of optimization techniques such as RMSprop and Adam, as well as the inner workings of batch normalization.


6,101–6,125 of 7,238 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Anton D

•

Oct 24, 2017

The course content was of very high quality. There were just some issues in the notebooks that are already covered in the forums; I think it's worth fixing them. In the videos there are also some small mistakes, but nothing serious. Also, about the programming assignments, I think it would be useful to have some in which less of the code is already written and more is left to the student.

By Varad P

•

Apr 6, 2020

As usual this course was really good, but in some parts I had the feeling that Professor Andrew Ng was a bit vague in explaining some concepts, so I had to spend a lot of time thinking about them (which I feel could have been avoided). It would really help if the instructors could provide additional references on the hyperparameters and the other topics discussed in this course.

By Ryan M

•

Oct 29, 2020

This course did a good job of covering much of the material. I felt like the explanations of most of the concepts in the videos were good. The last programming assignment, on TensorFlow, felt like a lot of guesswork to me, since the basic ideas of TensorFlow were not really covered well in the lectures. Other than this assignment, I thought the programming assignments were generally helpful.

By Frankie P

•

Jul 17, 2023

A very good, in-depth course. My only small complaint is some of the slight nuance in the final programming exercise, where some details (e.g. the from_logits parameter) were a little unclear. Personally, I'd have liked a bit more clarity on how to use the specified function, and the fact that the inputs required transposing should be highlighted more clearly. Other than that though, brilliant!!
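(Editorial aside, not part of the review: the from_logits flag tells the Keras loss that the model outputs are raw, pre-softmax scores, so the softmax is applied inside the loss. A minimal sketch of that pattern with illustrative tensor values; the transposes reflect the reviewer's note that the assignment's inputs needed transposing.)

```python
import tensorflow as tf

# Hypothetical raw network outputs (logits) and one-hot labels,
# stored as (n_classes, batch) so both need transposing to (batch, n_classes).
logits = tf.constant([[2.0, -1.0], [0.5, 0.3], [-1.5, 2.2]])
labels = tf.constant([[1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])

# from_logits=True applies the softmax inside the loss, which is more
# numerically stable than passing softmax probabilities yourself.
loss = tf.reduce_mean(
    tf.keras.losses.categorical_crossentropy(
        tf.transpose(labels), tf.transpose(logits), from_logits=True
    )
)
print(loss.numpy())
```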

By Mahnaz A K

•

May 30, 2019

The best thing I get from these courses is learning the intuitions all the time. Although I really enjoyed the part on optimization and parameter tuning, the same standard wasn't kept in the TF part. What is a TF graph? Why do we need it? Why a session? .... Unfortunately the TF docs fail to explain these concepts as well. If I don't get answers to those questions here, then where?
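(Editorial aside, not part of the review: in the TensorFlow 1.x style the course used at the time, you first build a graph that only describes the computation, and then a session actually executes it. A minimal sketch, assuming TensorFlow 2.x with the v1 compatibility layer; a pure 1.x install would drop the compat.v1 prefixes.)

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # use the graph/session style of TF 1.x

# Building the graph only records the computation; nothing runs yet.
x = tf.compat.v1.placeholder(tf.float32, name="x")   # input to be fed in later
w = tf.compat.v1.Variable(3.0, name="w")
y = w * x + 1.0                                       # a node in the graph, not a number

# A session allocates resources and actually executes the graph.
with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    print(sess.run(y, feed_dict={x: 2.0}))            # prints 7.0
```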

By Race V

•

Nov 26, 2017

I am slow on the uptake on the maths side of things, so the repetition in the class lectures is most appreciated. No, it is not repetitive; Andrew keeps expanding on our prior knowledge each week.

Even with 30-plus years since I did calculus, I am able to follow and understand, thanks to the team.

They do, though, need help with correcting some minor mistakes on the webpages.

By KISHOR

•

Mar 29, 2020

I learnt a lot about tuning neural networks through various optimization and regularization methods in this course. It helped me a lot in understanding the workings and derivatives involved in optimizing neural networks with various algorithms. This course makes the foundations of deep learning look easier and more understandable than other sources do for the person taking it.

By Pierre C

•

Dec 13, 2022

I appreciated the whole course, particularly when Andrew Ng explains a new topic step by step (for instance the Adam algorithm).

This course has a very good level.

However, I wish some general functions from scikit-learn or TF... were presented, to go from the from-scratch functions written in this course (essential to understand the pedagogy) to the generally used ones (in particular by ML/DL engineers).

By Guillermo

•

Jul 15, 2020

Well-structured course. It gives a great overview of hyperparameter tuning.

It would be great if the lecturer could keep his voice level steady as he speaks, especially in long explanations where you can feel his voice getting lower and lower, and then suddenly at the next video cut it goes back to normal, exploding your ears (since you had to increase the volume of your speakers).

By Roy W

•

Sep 13, 2019

Great course on hyperparameter tuning. Some of the code projects used the same variable names repeatedly in different contexts, which, to me at least, is a bad practice to encourage in students. Also, in the TensorFlow project, some additional numerical calculations would have made it easier to catch issues earlier. But Andrew Ng was amazing, as always: clear and informative.

By João P B D

•

Sep 23, 2020

Excellent content provided by a world-class expert in the field whose teaching is on point. Great selection of applications. There's not much mathematical formality, and the programming assignments are not really challenging as an assessment tool. It's definitely the theory one might need to amass on top of the first course's content; however, what was previously easy enough is now even more so.

By Martin P

•

Dec 23, 2017

The course is well organized and I've learnt quite a lot of related math. The only thing I felt needed improvement is that the assignments were too easy, and I could easily pass even though I didn't fully understand all the concepts and details. I hope they can be made harder, with more opportunities for the learner to make mistakes and correct them in order to learn more.

Thanks

Martin

By Supriya S

•

Sep 27, 2017

Good coverage of the practical aspects of neural networks. Happy to be introduced to the latest research on the topic. Not the course's fault, but there seems to be reuse of the same variable names across different papers; I wish the course introduced some consistency.

The introduction to TensorFlow was useful. However, I wish there were more coverage and exercises for this topic.

By Ido S

•

Nov 26, 2017

Andrew Ng's courses are a real delight - he's a natural teacher who explains well and can get students excited about a subject. In this class there were some problems with the last exercise (the TensorFlow tutorial): it was too simple and yet cryptic, with some unaddressed errors and a lot of loose ends (thus only 4 stars; all his other classes are definitely 5).

By Sebastián J C

•

Sep 15, 2020

The only issue is that the programming exercises are way too simple, copy-paste kinds of things. I could understand that being the case for the first, introductory course, but it would've been nice to have a little bit more of a challenge to get used to the programming implementation details. Also, it is outdated in the sense that it uses version 1 of TensorFlow.

By Shuai X

•

Dec 15, 2017

This course subsumes the relevant content of the Stanford Machine Learning course. Some useful additions to the Stanford course are briefs on gradient descent with Momentum, RMSprop and Adam, as well as elementary practice with TensorFlow. People with basic knowledge of linear algebra can complete this course in a day (i.e. 10 hours) by skipping the less important videos.

By Crawford F

•

Dec 7, 2020

The final lab is somewhat confusing in that the TensorFlow syntax is poorly explained, and the results for the final module would be well served by noting what your first epoch should be as well as the 100th (I spent a long time trying to find non-existent bugs because I had misread the output of my model as epoch 100!!).

Other than that, excellent as ever.

By Satyam k

•

Aug 18, 2020

This course provides very deep and good knowledge of how to increase the speed of your neural network and how to do hyperparameter tuning. But one thing lacking in this course is that it doesn't provide much knowledge about frameworks like TensorFlow, and people face difficulty doing the programming exercise because TensorFlow is not covered in depth.

By Vishak A

•

May 14, 2020

I wish more of TensorFlow had been included in the course content. Aside from that major point, I wish the complex mathematical portions had been explained with more precision, and code like "X[0][0]" had been explained with more precision as well. But overall, I think it was hugely worth learning all the thoroughly taught concepts, and I am very grateful.

By Chinmay h

•

May 8, 2020

Topics are explained very well. There may be a false sense of accomplishment after doing the assignments, which are pretty straightforward. I am going to add personal tasks which might help me understand the topics more in depth. On a similar front, could you add a video explaining what to do next? And I don't mean the next course in line.

By jim

•

Nov 8, 2017

Gained quite a lot of insight into deep neural networks, tuning, regularization and so on.

One remark on this course: we talked a lot about tuning processes in week 3. However, there were not many practical exercises on that part; e.g. we didn't try to implement batch normalization ourselves or to incorporate batch normalization with the other parameters, etc.
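(Editorial aside, not part of the review: for anyone wanting the practice the reviewer asks for, a minimal NumPy sketch of the batch-norm forward pass at training time, in the standard formulation; gamma and beta are the learnable scale and shift, and at test time running averages of mu and var would be used instead.)

```python
import numpy as np

def batchnorm_forward(Z, gamma, beta, eps=1e-8):
    """Normalize Z of shape (features, batch) over the batch axis, then scale and shift."""
    mu = Z.mean(axis=1, keepdims=True)        # per-feature mean over the mini-batch
    var = Z.var(axis=1, keepdims=True)        # per-feature variance over the mini-batch
    Z_norm = (Z - mu) / np.sqrt(var + eps)    # zero mean, unit variance per feature
    return gamma * Z_norm + beta              # learnable scale (gamma) and shift (beta)

# Toy usage on a (3 features x 4 examples) batch of pre-activations.
Z = np.random.randn(3, 4) * 10 + 5
gamma, beta = np.ones((3, 1)), np.zeros((3, 1))
print(batchnorm_forward(Z, gamma, beta).std(axis=1))  # roughly 1 for each feature
```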

By Aurangazeeb A K

•

Sep 30, 2019

Although I loved this course, I believe there are certain parts that could be broken down into even simpler intuitions. If such a change is possible, this course will be the best one out there. Anyway, I really enjoyed the course and it was a great learning experience. TensorFlow was introduced very nicely and it aroused my curiosity to learn more.

By Manish M

•

Mar 22, 2020

A really informative course for learning about the various kinds of optimization and the differences between the optimization techniques. Learnt how to tune the hyperparameters for effective training. Also got a chance to learn about mini-batches and the corresponding gradient descent, and the difference between batch and mini-batch gradient descent.
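(Editorial aside: the difference the reviewer mentions comes down to how much data each gradient step sees. A minimal sketch of partitioning a training set into mini-batches, with illustrative names and sizes.)

```python
import numpy as np

def random_mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the (n_x, m) data and split it into mini-batches of size batch_size."""
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)
    X_shuf, Y_shuf = X[:, perm], Y[:, perm]
    # Batch gradient descent would use all m examples for every step (one big batch);
    # mini-batch gradient descent takes one gradient step per slice below.
    return [(X_shuf[:, k:k + batch_size], Y_shuf[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

X = np.random.randn(5, 200)              # 5 features, 200 examples
Y = np.random.randint(0, 2, (1, 200))
print(len(random_mini_batches(X, Y)))    # 4 mini-batches (the last one is smaller)
```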

By Alejandro F

•

Feb 3, 2020

A very good course; the instructor has command of the subject, and above all the end of the course is very good in terms of putting into practice the theory explained at the beginning. At times the instructor goes a bit fast through the theoretical terms and it can become overwhelming. I think it would be ideal to include more practical examples each time a concept is explained.

By Yix L

•

Dec 20, 2019

Materials are good and Professor Andrew presents the course at a really understandable level, so I still learned a lot throughout the course even though I have taken similar MOOC courses on other platforms. The programming assignments are much easier than in the fourth course (Convolutional NN), but they gave me a lot of inspiration. Great thanks to the team!