
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9
stars
63,187 ratings

About the Course

In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically. By the end, you will be able to apply best practices for setting up train/dev/test sets and analyzing bias and variance in deep learning applications; use standard neural network techniques such as weight initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check their convergence; and implement a neural network in TensorFlow. The Deep Learning Specialization is our foundational program that helps you understand the capabilities, challenges, and consequences of deep learning and prepares you to participate in the development of leading-edge AI technology. It provides a pathway to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step into the world of AI.
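The optimization algorithms listed above build on one another: Adam combines a Momentum-style first-moment estimate with an RMSprop-style second-moment estimate. As a rough illustration only (a minimal NumPy sketch; the function and variable names are my own and are not taken from the course materials):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: Momentum + RMSprop-style scaling, with bias correction."""
    m = beta1 * m + (1 - beta1) * grad       # first moment (Momentum term)
    v = beta2 * v + (1 - beta2) * grad ** 2  # second moment (RMSprop term)
    m_hat = m / (1 - beta1 ** t)             # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy run: minimize f(w) = (w - 3)^2, whose gradient is 2*(w - 3).
w = np.array(0.0)
m = v = np.array(0.0)
for t in range(1, 5001):
    grad = 2 * (w - 3)
    w, m, v = adam_step(w, grad, m, v, t)
```

After enough steps, `w` settles near the minimizer at 3 (Adam oscillates slightly around the optimum on such toy problems, with amplitude on the order of the learning rate).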

Top reviews

HD

Dec 5, 2019

I enjoyed it; it is really helpful. I'd like to have the opportunity to implement all of this deeply in a real example.

The only thing I didn't completely understand is batch norm; it is still confusing to me.

AS

Apr 18, 2020

Very good course to give you deep insight into how to enhance your algorithm and neural network and improve its accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.


7201 - 7225 of 7,254 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By QUINTANA-AMATE, S

Mar 11, 2018

Again, nice videos but not

By Matthew P

Sep 3, 2021

Focused a bit on minutiae.

By Adam G

Jul 11, 2020

Multiple grading issues.

By Chaitanya M

Jul 1, 2020

could be more engaging

By José A G R

May 23, 2023

I am very excited

By Cory N

Jan 8, 2020

Update for TF2.0 :)

By Алексей А

Sep 7, 2017

Still looks unpolished.

By Ilkhom

Mar 21, 2019

awful sound

By Akhilesh

Mar 14, 2018

enjoyed :)

By Sai R

Nov 10, 2022

Good

By zhesihuang

Mar 3, 2019

good

By CARLOS G G

Jul 14, 2018

good

By Hoàng N L

Feb 12, 2019

N/A

By KimSangsoo

Sep 17, 2018

It's decent

By Dave L

Oct 3, 2024

The videos are good and go into a lot of the details. However, the programming tasks are not very useful. They basically expect you to plug certain lines into an almost complete programme, without really understanding what the individual lines - let alone the rest of the programme - really mean. Also, there is the occasional reference to lecture notes. However, those lecture notes are just the slides that are being used with hand-written annotations on top, and without the accompanying videos, they are not useful.

By Fabrizio N

Dec 7, 2018

Good course content and clear exposition by Andrew. The course material, however, is not of a good standard. The slides can be downloaded, but after all the hand scribbles by the tutor, they are barely decipherable. Some are just blank pages that need to be filled in with screenshots from the videos. The assignments are often just a copy-and-paste exercise, and Jupyter crashes cause frequent loss of work.

By Goda R

Feb 14, 2020

The video content is very good for getting the hang of the theoretical aspects, but the programming assignments are too spoon-fed; after filling in the blanks, you don't feel confident enough to implement the same on your own. Instead, the assignments should be changed to cases where the instructions are given in words and the entire function must be implemented by the students.

By André Ø

Nov 30, 2017

The TensorFlow part of the course felt out of place and not of the same quality as the previous material. It would have been better if another week were spent using TensorFlow to actually improve a NN rather than just copy-pasting an example into the assignment. Even after using TensorFlow in the assignment and passing, working with TensorFlow still

By Sergey K

Jul 6, 2020

In general, the course isn't deep enough. There are no summaries of the lectures, there are no exercises during lectures, and the programming assignments are very weak; they don't challenge your use of the lecture material at all, since all the necessary details are in the notebooks. Everything from this course will be forgotten in a week.

By Ivan I P

Apr 28, 2023

Sadly, the last course assignment didn't run properly in the sections outside the "graded functions" (more specifically, in Week 3). Also, some of the quiz questions were misleading, and over very unimportant details.

By Cristian G

Sep 28, 2022

They should improve the programming labs.

There's too much repetition in things like programming a function that initializes the parameters, while important things, like the model definition, are already written in the notebook.

By James H

Nov 18, 2018

The whole TensorFlow introduction is weak - it clearly requires further reading, which is fine but totally out of kilter with the videos so far, which have taken things from first principles very clearly.

By Martin B

Nov 16, 2017

A technical problem with the grader caused my grade to be artificially lower on the last project. Although I was instructed to resubmit, the course ended with a lower grade than I should have received.

By Anne R

Oct 9, 2019

Not much implementation is required of the students. More testing of the methods would be useful; or, if the concepts are the focus, then this course should be merged into another course in the sequence.

By Brian R

Jun 10, 2018

The course material is good, but the Jupyter notebook interface does not work correctly. You will waste a lot of your precious time fiddling and redoing work that you lose when the notebook fails to save.