Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9
stars
63,068 ratings

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will know the best practices for setting up train, dev, and test sets and analyzing bias/variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI...

Top reviews

YL

Very useful course, especially the last TensorFlow assignment. The only reason I gave 4 stars is the lack of practice on batch norm, which I believe is one of the most useful techniques lately.

XG

Thank you Andrew!! I have now started to use TensorFlow; however, this tool is not well suited for research. Maybe PyTorch could be considered in the future!! And let us know how to use PyTorch on Windows.

7,051 - 7,075 of 7,238 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Minglei X • Oct 22, 2017

Some processes that were discussed in detail in previous courses are mostly omitted in the new context. While that is sometimes nice for saving time and focusing on new ideas, I feel there are sometimes subtleties in them. For example, I could not imagine how backward propagation should be implemented in batch norm. I'm not sure whether there really are subtleties that you think are too tedious and unnecessary to introduce in the short video. If that is the case, I still hope you can provide more detailed information about them somewhere, just for curious people like me.
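
For readers with the same question, a minimal NumPy sketch of the batch-norm backward pass looks roughly like the following. It is not taken from the course notebooks; the function name, the (features, examples) shape convention, and the epsilon value are illustrative assumptions.

    import numpy as np

    def batchnorm_backward(dZ_tilde, Z, gamma, eps=1e-8):
        """Gradients of the loss w.r.t. Z, gamma, beta for one batch-norm layer.

        Z has shape (features, examples); dZ_tilde is dJ/dZ_tilde where
        Z_tilde = gamma * Z_norm + beta.
        """
        m = Z.shape[1]
        mu = np.mean(Z, axis=1, keepdims=True)
        var = np.var(Z, axis=1, keepdims=True)
        Z_norm = (Z - mu) / np.sqrt(var + eps)

        dgamma = np.sum(dZ_tilde * Z_norm, axis=1, keepdims=True)
        dbeta = np.sum(dZ_tilde, axis=1, keepdims=True)

        # mu and var depend on every example in the mini-batch, so dZ picks up
        # two correction terms beyond the obvious dZ_norm / sqrt(var + eps).
        dZ_norm = dZ_tilde * gamma
        dvar = np.sum(dZ_norm * (Z - mu), axis=1, keepdims=True) * -0.5 * (var + eps) ** -1.5
        dmu = np.sum(-dZ_norm / np.sqrt(var + eps), axis=1, keepdims=True) \
              + dvar * np.mean(-2.0 * (Z - mu), axis=1, keepdims=True)
        dZ = dZ_norm / np.sqrt(var + eps) + dvar * 2.0 * (Z - mu) / m + dmu / m
        return dZ, dgamma, dbeta

The subtlety the review points at is exactly that the batch mean and variance depend on every example in the mini-batch, which is where the two extra correction terms come from.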

By Ashvin L • Aug 24, 2018

The course builds on the first course and provides some ideas on how to tune networks to perform better. However, at its core, I find the number of parameters overwhelming, and it appears that by changing the parameters we can get any answer we want. There is no "formal" mathematical basis for changing the parameters. This is a bit disconcerting.

The assignments were trivial. More importantly, at least one assignment appeared to indicate that the results are entirely dependent on weights chosen (at random) on the first iteration. This should not be the case.

By Foad O • Nov 2, 2021

The course is pretty good overall. However, the programming assignments need much improvement. I realize that teaching Python syntax and programming is not really part of this course, but if students are expected to do coding, there need to be some more detailed lessons/sections to cover the basics. While providing vague, inconsistent, riddle-like "hints" in the middle of the programming exercises makes for some interesting brain exercises, it is certainly not helpful for teaching students what they need to know in order to write correct code.

By Vikash C • Jan 28, 2019

Content was good.

But the system that checks our submitted code marks it wrong even when I wrote it correctly.

In the week 2 assignment, when I submitted the code, it flagged many functions as wrongly coded.

I resubmitted the code after a few changes, for instance changing a += 2 to a = a + 2 and string literals like 'W' to "W". It worked fine and gave 100 points.

In short, what I observed is that the code-checking system treats a += 2 and a = a + 2 differently, and likewise 'W' and "W", even though they do not differ in the actual output.
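
For what it's worth, the two rewrites mentioned here are equivalent in Python, so if the grader's verdict changed, it was more likely reacting to some other edit in the resubmitted cell. A quick check (my own example, not part of the assignment):

    a = 1
    a += 2             # augmented assignment
    b = 1
    b = b + 2          # plain assignment
    assert a == b == 3
    assert 'W' == "W"  # single- and double-quoted literals build the same str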

By W B K • Oct 1, 2018

I thought the content was well-chosen and typically presented clearly. However, unlike the previous course in this specialization, the assignments had an egregious number of typos and missing information. I found these errors confusing and time-consuming.

From the staff's forum activity, it looks like they are no longer actively involved in this course. I hope that Coursera will hire someone—an intern would probably be plenty capable—to take this course and carefully fix as many of the errors in it as she or he can find.

By Zbynek B • Jun 9, 2020

This is my third course by Prof. Ng, and so far I have passed all of them with a 100% score. Until now, I always gave 5 stars. This time, however, just three, because of (1) the weak explanation of the Dropout method (intuition) and (2) the missing gradient for the extra gamma parameter (Batch Norm method). It isn't a big deal for the student to derive the gradient, but I expected Andrew at least to mention that gradient for the back-propagation step.

All in all, I love Prof. Ng's teaching style and I fully recommend his courses.
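
For reference, the gradient this review says is missing is short to state. With Z_tilde = gamma * Z_norm + beta, a standard derivation (my own notation, not quoted from the lectures) gives

    \frac{\partial \mathcal{J}}{\partial \gamma} = \sum_{i=1}^{m} \frac{\partial \mathcal{J}}{\partial \tilde{Z}^{(i)}} \odot Z_{\mathrm{norm}}^{(i)},
    \qquad
    \frac{\partial \mathcal{J}}{\partial \beta} = \sum_{i=1}^{m} \frac{\partial \mathcal{J}}{\partial \tilde{Z}^{(i)}}

summing over the m examples of the mini-batch.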

By Kristof B • Apr 8, 2021

While I like the theoretical part of the course, the programming assignments need a lot of work. Foremost, there is the issue of TensorFlow 1 being used, and not even the latest version of TensorFlow 1, but a very old one at that. Aside from that, the courses use too much hand-holding; I find myself deliberately scrolling past information blocks so that I actually need to do some work. Otherwise it would just be copy-pasting, or in other words, a waste of time.

By Robert M • Jan 12, 2022

I enjoyed the lectures by Dr. Ng. They are very clear and well explained. I feel I have a good theoretical understanding of the concepts. The practical aspect is quite different. The exercises lack explanations, especially the TensorFlow ones. You write a few lines of code and "congratulations, you have written your own NN!" while they seemingly randomly transform and transpose your data without explanation. You hardly leave the course feeling like an expert.

By Iggy P • Apr 19, 2020

This was an interesting course in that it taught me a lot about hyperparameter tuning and how to improve my models in general. My main issue was that the optimization assignment couldn't open properly due to Jupyter notebook issues, and I didn't receive any support or direction on the issue. I just stumbled on the solution myself, and this significantly messed up my timeline. I wish there were more support for technical issues as well.

By Dimitrios G • Nov 28, 2017

The course continues on the path that the previous Deep Learning course set, but I found the use of TensorFlow somewhat limiting. It is a great tool that simplifies the training and running of NNs, but it does not allow for easy debugging or for easily looking inside the built-in functions to spot problems. I felt that we were treating many tf functions as black boxes, and I am not so fond of this. Otherwise the course was fairly useful.

By George S • Jun 27, 2022

Please... Andrew is awesome, deeplearning.ai is awesome, DLS is also awesome. BUT why tack on that last programming assignment about TensorFlow? It ruined the whole course... Man, what did I learn after that assignment? Nothing! Absolutely nothing... lots of crammed coding, kept getting answers from forums, and now I have passed the grading and remember absolutely nothing... The TensorFlow commands are like Greek...

By Hamad U R Q • Sep 12, 2019

Too easy.

One thing I want to say about Week 3: I had some confusion in the lectures but was hopeful that, while going through the assignment, I would clear up the concepts about tuning hyperparameters. Instead, the assignment was ALL about TensorFlow basics and nothing about tuning hyperparameters. I was really disappointed with that!

Other than that, course contents are great and worth the time and effort.

By Fermin B • Mar 7, 2021

The course is very good, but the reason I didn't give it more stars is that it was difficult. I had the impression that the course was going too fast and I wasn't able to fully understand all the content the teacher gave. I think the assignments should be more like those in the first course, where you go step by step, understanding everything about the code. More explanation of TensorFlow would be appreciated.

By Younes A • Dec 7, 2017

Wouldn't recommend because of the very low quality of the assignments, but I don't regret taking them because the content is great. Seriously the quality of deeplearning.ai courses is the lowest I have ever seen! Glitches in videos, wrong assignments (both notebooks and MCQs), and no valuable discussions on the forums. Too bad Prof Ng couldn't get a competent team to curate his content for him.

By Christian M • May 15, 2022

The theoretical part was clearly understandable but the programming assignment was very poor in my opinion.

Did I miss the introduction to TensorFlow somewhere? I could not find it in the course. It was possible to solve the assignments by guessing and reading some forum posts, but honestly I did not understand very much...

I'm a bit disappointed by the introduction to TensorFlow.

By Gadiel S • Sep 21, 2018

The course is good. It covers important ideas, and they are well explained in the videos. However, the formulation of the assignments is sloppy. There are mistakes and inconsistencies, in some cases necessary explanations are missing, and in some cases the instructions are misleading (I suspect the assignment has changed over time, but the instructions have not been consistently updated).

By Ha S C • Oct 28, 2018

A much sloppier and poorer course than the previous ones. Grading mishaps (the grader's fault), a few errors in the lectures (the variance in the normalization), and very basic, unhelpful feedback from staff made for a course that did not live up to the level of the previous one. If at any point you need further help, it is generally unavailable, or difficult to find at best.

By Ashkan R • Dec 23, 2020

I really like the course material, the topics discussed, and neural networks in general. I also have a lot of respect and gratitude toward Andrew, but the way he organized the quizzes and programming assignments is rather a monkey-see, monkey-do strategy. You rarely get challenged. Overall the course is worth taking, but I would not recommend it to more advanced practitioners.

By Siddharth D • Apr 24, 2020

I have written this before in the discussions. I feel there should be assignments to implement everything from scratch. I can fill in the code and understand most of the mathematical functions and reasoning, but I am still not confident that I can "CODE" a new problem from scratch. I was really hoping this certification would give me the practice to achieve this.

By Maysa M G d M • Mar 4, 2018

Some exercises were wrong, like Z3 in the TensorFlow model: you said z3 = w*z2 + b3, but it should have been A2, not Z2.

Several exercises did not check the result of each function, so when I arrived at the huge model function, it was hard to discover where I was wrong.

I think this third week could be two weeks. I missed exercises on normalization; they were all about TensorFlow.
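
On the Z3 point: assuming the standard forward-propagation convention from the lectures (not quoting the assignment itself), the pre-activation of layer 3 uses the previous layer's activation rather than its pre-activation,

    A^{[2]} = g^{[2]}(Z^{[2]}), \qquad Z^{[3]} = W^{[3]} A^{[2]} + b^{[3]},

which is why A2 rather than Z2 belongs in that formula.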

By Dartois S • Aug 17, 2017

A bit less good than the previous course. It would have been good to have a chance to concretely implement batch normalization. Also, I think the tutorial on TensorFlow needs more detail and explanation of the what and why of the conventions. Anyway, I was really happy to learn a bit about TensorFlow; I hope I will use it more throughout the course.

By Ali I • Sep 4, 2021

This course provided me with very fair insight; however, I felt that the TensorFlow portion was covered in a hurry. I had no background in TensorFlow, and I believe that the way it is covered might be the right way and that I will build on it. Even while working on the last assignment, I had not much familiarity with the syntax of TensorFlow...

By Amit C • Nov 20, 2019

The fact that the lectures are not available to keep is problematic. Also, the programming assignments leave too little to do: only a few lines of code, which in most cases are simply copied from the problem description. It would make sense to broaden the programming tasks and let the students really cope with many of the real-world challenges.

By Volodymyr B • Sep 19, 2021

The last programming assignment in the course is a bit better than the rest, while the lectures are of rather high quality. In the quizzes, some questions are confusing. E.g., Andrew Ng said several times that parameters should be revised from time to time, but there is a question that (coupled with the marked correct answer) states the opposite :(