
Learner Reviews & Feedback for Neural Networks and Deep Learning by DeepLearning.AI

4.9 stars (122,222 ratings)

About the Course

In the first course of the Deep Learning Specialization, you will study the foundational concepts of neural networks and deep learning. By the end, you will be familiar with the significant technological trends driving the rise of deep learning; be able to build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network's architecture; and apply deep learning to your own applications. The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.

Top reviews

AS

Jul 10, 2021

I have learned a lot of things in deep learning, such as neural networks, deep neural networks, forward propagation, backward propagation, broadcasting, and vectorization. This is very important for me.

AD

Dec 5, 2020

This course helped me understand the basics of neural networks. After this course I learned to build a basic neural network model. Looking forward to doing the next course of the Deep Learning Specialization.


176 - 200 of 10,000 Reviews for Neural Networks and Deep Learning

By Kevin M

•

Apr 10, 2020

Terrific course with 4 solid weeks of learning. The journey includes logistic regression for classification, shallow neural networks, deep neural networks, and building your own picture classification NN.

The virtual classroom lectures, quizzes, and programming assignments test your knowledge every week.

Topics covered include NN initialization, forward propagation, the cost function and loss, backward propagation, gradient descent, and prediction using your trained model to classify pictures (in this case, cats).

The calculus and linear algebra underlying the algorithms are explained in a way that builds a solid foundation without requiring deep knowledge of the fundamental math behind the activation functions (sigmoid, relu, and tanh). A good understanding of matrix math, especially matrix multiplication, is a benefit that helps in navigating the course.
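For reference, the three activations mentioned are one-liners in numpy; this is a rough sketch of my own, not code from the course:

    import numpy as np

    def sigmoid(z):
        # squashes values into (0, 1); typically used for the output layer in binary classification
        return 1 / (1 + np.exp(-z))

    def relu(z):
        # keeps positive values, zeroes out negatives; the usual hidden-layer activation
        return np.maximum(0, z)

    def tanh(z):
        # squashes values into (-1, 1); a zero-centered alternative for hidden layers
        return np.tanh(z)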

The programming assignments use Python with numpy and are conducted in a Coursera-hosted Jupyter notebook. I strongly recommend beginners take the Python tutorial, as the syntax challenges can burn a lot of effort that takes away from the NN learning experience. Also be mindful of stability issues that can cause erroneous results (requiring kernel restarts) and lost work due to failed autosaves.

The volunteers who help on the message board are quite good! Thanks, Paul!

Finally, Professor Andrew Ng truly knows his stuff, presents in an understandable way, and communicates the excitement he has for the topic. Having taken a previous machine learning course (Stanford's Machine Learning, also offered on Coursera), I can say Professor Ng is a world-class instructor and data scientist.

Best of luck!

By Sebastian S

•

Dec 15, 2017

I found it very helpful as it confirmed most of the things I had already learned by doing deep learning projects on my own, as well as browsing additional literature on machine learning / deep learning and having done some internships where I had to apply these things. So for me personally, this course did not teach me anything new, but it organised and structured the knowledge in my head nicely by summarizing it very neatly. Also, some of the hints on implementation were helpful (like the numpy reshaping issue with arrays of shape (n,) as opposed to (n,m)). One thing I found is that deep learning can only really be understood if the coverage of backpropagation includes the low-level derivatives + chain rule discussions; otherwise, you don't really "understand" what's going on. I appreciate that the course (just like the original "Machine Learning" one, which was excellent) tries to reach a broad audience that does not necessarily know analysis to the extent required for backprop, but maybe it would be a nice idea to include a "mathematician's point of view" on backprop as an optional part. I found that in my personal studies, looking at backprop from the pure analysis point of view helped me a lot in "demystifying" deep learning and seeing it for the optimization approach that it is. Having said that, I found the course very nicely structured, with very clear explanations and relatable applications. Thanks to Coursera and Andrew for providing this great source of knowledge for free, I really appreciate these efforts! Sebastian
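To illustrate what I mean by the reshaping hint, here is a rough numpy sketch of my own (not taken from the course):

    import numpy as np

    a = np.random.randn(5)        # shape (5,): a rank-1 array, neither a row nor a column vector
    print(a.shape)                # (5,)
    print(np.dot(a, a.T))         # a.T does nothing here; this silently returns a scalar

    b = a.reshape(5, 1)           # explicit column vector of shape (5, 1)
    print(np.dot(b, b.T).shape)   # (5, 5) outer product, as expected from the math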

PS: I gave it 4/5 stars, but for some reason the rating keeps getting stuck on 5.

By Sven A

•

Dec 24, 2021

Lectures and programming assignments were great! All the maths behind Deep Learning was explained intuitively and illustrated with good examples. I liked the fact that the lecturer went into details for those who are interested - I really recommend these additional materials, even if you have a hard time understanding the calculus and linear algebra. Programming assignments had thorough explanations and detailed guidance on how to approach each exercise. I would say that at some points the explanations gave away the answer (if you were well acquainted with the materials, of course). I liked that during programming assignments I had to implement every function myself. This really helped me gain a general understanding of Deep Learning. And thanks for having tests in the assignments! This really helped a lot, because I was able to debug my code before moving on.

There is only one suggestion I would like to make: perhaps it would be easier for students if there were smaller programming assignments right after a topic is covered in a lecture. E.g. the forward propagation lectures would have a small programming assignment where one would have to implement a few examples of forward propagation. The benefit I would see is learning through repetition - it is easier to learn material by repeating small chunks at a time. Moreover, the programming assignment at the end of a week would not be so difficult to approach anymore and would serve as a final repetition, where practiced and known material is finally tested.

By Ryan F

•

Dec 31, 2017

This was a very well-thought-out course for beginners in Neural Networks / Deep Learning. Andrew Ng sets a good pace; I was able to complete each week's lecture videos and assignments in less than 10 hours. Lectures were always clear and often went over things which would not directly be needed for assignments, but which will be useful to anyone planning to do work in this field. Andrew Ng was also very good about explaining where the mathematical equations came from, while stressing that it's not super-important to understand fully where they come from, as long as you're able to implement them.

I should add that I'm probably not the typical audience for this class --- I have an extensive math background but only just started programming a few months ago. The Python code was scaffolded and commented in such a way that even a newbie to programming can follow and complete the coursework, and I can say I've not only learned about NN/DL algorithms, but also a good deal about programming in Python as well. One major topic that still blows me away is the speed boost we get from avoiding for loops and using vectorization instead.
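The kind of speed difference I'm talking about is easy to see with a toy benchmark like this (my own sketch; exact numbers will vary by machine):

    import numpy as np
    import time

    w = np.random.randn(1_000_000)
    x = np.random.randn(1_000_000)

    # explicit for loop over every element
    start = time.time()
    total = 0.0
    for i in range(len(w)):
        total += w[i] * x[i]
    loop_time = time.time() - start

    # vectorized dot product doing the same computation
    start = time.time()
    total_vec = np.dot(w, x)
    vec_time = time.time() - start

    # the vectorized version is typically orders of magnitude faster
    print(f"loop: {loop_time:.3f}s, vectorized: {vec_time:.4f}s")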

The post-assignment interview videos were also interesting. Andrew Ng would interview a guest 'powerhouse' at the end of each week, and the topics covered there often went way beyond the scope of this individual course, giving a much broader overview of where we are now and where we seem to be heading in the near and long term.

By Dilip R

•

Mar 16, 2020

This is a wonderful course. I have been reading passively for about a year on resources related to ML and DL, but never got the full grasp of the concepts the way Prof. Andrew explained them. The quizzes were entertaining and insightful, as were the programming examples.

I completed this 4-week course in about 2 days straight; some of the quizzes were 70/100 on my first try but then got to 100/100 after 1-2 more tries. On the programming assignments I got 100/100 on the first try (except for the first one, which didn't register my last 3 code answers - even though I typed and ran them correctly - for which I had to restart the kernel, launch it again in incognito mode, and, after I was done, re-run all the snippets one last time just to be sure).

The hardest part of the course for me was understanding derivatives and the overall calculus development and factorization, because I had taken the necessary classes a few years ago, and honestly I wasn't very good at it back then either.

One thing I would suggest would be to improve the audio quality, as well as editing the videos instead of providing a warning message before a video with errors, because sometimes it's hard enough to follow the material, let alone spot the error itself.

Again, I would like to thank Professor Andrew Ng and the Deeplearning.ai team, as well as the Coursera platform, for providing such great real-time capabilities like the Jupyter notebooks and the automatic grading system.

By Baili O

•

Jan 23, 2018

This is a great course which covers some popular machine learning techniques such as regression, and then moves to deep learning with neural networks and the techniques of forward and backward propagation. It is a good course for beginners and the homework is fairly easy. However, this course still leaves some unanswered questions which might be covered in the future courses, such as: how to select hyperparameters, why we choose this specific cost function, whether there are other deep learning frameworks besides the neural network structure, and whether there are applications other than image recognition. In addition, for those who have some background in machine learning, the interview section, which talked about GANs, is a bonus.

There are some things that are not very enjoyable as well. For example, the notebooks cannot be downloaded, so people have to write things down; otherwise, once the course expires, it is quite hard to get the course material. The lecture notes are badly organized: you have to download every slide deck one by one (or maybe I just didn't find the right place to download them). Thirdly, this course didn't talk much about techniques that the industry is using. What I am trying to say is that I don't know if the techniques in this course are applicable in industry, whether they are too simple or too old, etc.

Overall, it is a very fun and educational course. I can't wait to jump into the next course.

By Melissa B

•

Aug 16, 2020

This course has been very interesting and engaging for me. Dr. Ng explains everything very thoroughly and provides compelling examples of real world application for the material. Occasionally he can be a bit redundant, but I found that helpful, since sometimes it takes more than one pass through the material to understand it clearly. I am also taking his Machine Learning course (Stanford) concurrently and I found the courses to complement each other very well.

Additional practical notes: The linear algebra involved in the course is relatively basic and explained thoroughly enough that it can be picked up along the way if you don't have much math background. Also, the programming assignments help expose you to Python syntax. If you don't have much Python experience, use the discussion forums. If you do, you'll find the assignments to be incredibly easy, as they have ample starter code with a few plug-and-chug "your code here" sections. These are mostly just to demonstrate how the material is applied. The primary value in the course is conceptual. For someone with very little coding experience like myself, I appreciate how thoroughly the code was annotated so I could get a grasp on what it was doing without having to take stabs at producing it myself.

Overall, I really loved this course, and I learned so much!

By Felipe P C N

•

Jul 10, 2020

The course reconciles an accessible approach, starting from intuitive/elementary concepts, with respectable depth, carefully building up more advanced techniques. The lectures are extremely didactic and well crafted, which makes getting familiar with the concepts of neural networks (and the math behind them) more natural. The assessments have both a theoretical component (quizzes, aimed mostly at checking knowledge) and a practical one (programming exercises). As positive as it is to have the practical component, I would say it could be a bit less "guided": in practice, what they ask is that the student "fill in the blanks" in a program already (very well) structured (and practically completed) by the instructors. I felt the need to complement these exercises with attempts of my own, programming the algorithms from scratch, to really make sure I can implement what I learned. That step is noticeably harder than the course's programs (after all, implementing a project from scratch is inherently challenging). Even so, I see this kind of "initiative" of taking what was learned into some application/practice beyond what the course asks as something that ideally should be done when studying any subject.

Excellent course.

By Zhenwei Z

•

Feb 28, 2020

This course goes from the basics to the advanced, leading us to understand deep learning. From the initial logistic regression, then to the shallow neural network, and finally to the deep neural network, we gradually learned the neural network representation and calculation process, and finally got to implement the cat image recognition binary classifier. The course is very clear and logical, eliminating tedious mathematical derivation, but still allowing us to understand all the mathematical details, including calculation and vectorization. The assignments are done step by step, starting from the basic functions and gradually encapsulating them, finally constituting a complete neural network, which enables students to gain a deep understanding of neural networks and master the knowledge through practice. It is worth mentioning that the course's difficulty progression is reasonable, and the details that are difficult to understand in earlier lessons do not need to be understood all at once. In the later lessons, the understanding of the previous knowledge points is deepened repeatedly, and thanks to the groundwork laid by other knowledge points, a more complete and comprehensive supplement to the earlier material is provided. Looking forward to series two.

By Ronald A R

•

Dec 17, 2021

It would be difficult to overstate the value of truly laboring through this very well designed and conducted course. I have an advanced degree in cell/molecular biology with thesis work in immunochemistry and physical biochemistry (UCSD). My undergraduate work included the regular chemistry coursework: thermodynamics, statistical mechanics, and quantum mechanics. I consider myself fairly mature at applied math in my areas of training and work. But while not ill prepared, it took some real effort to gain a deeper understanding of the application of the underlying math for backward propagation. This course, with a little help from 3Blue1Brown, provided a great introduction to the "mysteries" of NN, "deep learning", and AI. Dr. Ng provides a most pleasant presence as he works to give a new student access to what is arguably among the most sophisticated and promising realms in computer and artificial cognitive science. I could not have gained access to this space without his course. I will move on to the next course in the sequence of four. BTW, I'm 75 years old, semi-retired, and pursuing this only out of the desire to know and understand. Thanks yet again to Dr. Ng, his colleagues (named and unnamed), and Coursera.

By Xiao G

•

Oct 7, 2017

This is my first completed course on Coursera.org!!! And I earned a certificate with nearly full marks! I was very bad at coding.... although I am still not good at it now, this course convinced me that even someone poor at coding like me can finish and build a neural network! It truly boosted my confidence! Thanks Andrew Ng.

One thing to note for this course.... maybe it can improve in this area later. I felt quite easy and comfortable with all the mathematical derivations.... however, when I coded backward propagation in week 4, I nearly got lost.... I spent the whole afternoon and evening solving it and then finished assignment 2 of week 4 (from 2 pm till 10:03 pm)... I bet many people get stuck on that area. Hmm, it's really easy to get lost and puzzled there. I guess there may be some ways to make it easier to figure out. Just a little advice.

Anyway, I love this course. It's a trigger for my coding journey; though I'm a physics student now in electrical engineering.... I still feel very comfortable with this course.

Thanks Andrew Ng!!! I don't know how to express my gratitude to you! If I had not taken this course, I might never have attempted Deep Learning, such a complex and advanced thing.

I will continue to study~~ see you

By Luca C

•

Jan 26, 2019

Pros: + You will understand clearly how things work and why they work

+ Provides mathematical insights for those who are interested (really a big plus compared with other courses)

+ Overall a simple introduction to Neural Nets. Even those who already have experience can benefit from this brush up (even at a fast pace: you can complete it in 2-4 days)

Cons: - Since it is quite basic material, those who are already accustomed to NN might want to jump to the second course of the specialization.

- You learn Python by doing, but you will not get a deep understanding of Python. I would suggest getting a little familiar with it in other ways (however, this is not a requirement to master the course).

Clear, quick overview of the basics of Neural Networks. It even provides some mathematical justification, although fully understanding it is really not a requirement to successfully complete the course. However, IMHO it is always good to have at least some insight into the mathematics behind it.

This course sets the basics well, but to be really able to work on your own projects I think it is a must to take the second course of this specialization.

By Amit R B

•

Nov 27, 2019

This course is truly deserving of its high ratings. Prof. Andrew Ng's extensive breakdown of the structure and function of neural networks is unparalleled. For me personally this course has been of great help. The theory lectures made me understand just how these networks "learn". This course is a great beginning and, I think, prepares the student well to learn more in-depth and advanced concepts of deep learning.

However, if you are looking to get hands-on experience building and training deep learning models, I would recommend checking out some free resources on YouTube with the Keras framework. I played around with Keras following the YouTube channel Sentdex's Keras tutorials, then took this course to get a more mathematical and theoretical understanding. Some students might find themselves a bit unprepared for the coding exercises, since the lectures are more focused on theory and math, showing little to no code. This is why I think this is a great (if not the best) 2nd course, but maybe not as helpful as a first introduction.

For a free first introduction, check out the channel 3blue1brown's videos on Neural Networks to get your feet wet, before diving further deep. ;P

By Ali M

•

Mar 8, 2023

The course is very well organized and the explanations are both clear and concise enough for almost any learner with some basic background. As a person with strong skills and a strong background in math, I did not get bored with the derivations and explanation of gradient descent and backprop, which is nothing but the differentiation chain rule. Conversely, I think it is also not so hard for less mathematically skilled people to grasp the ideas without fully understanding the calculus. Also, the Python homework was easy and I liked it a lot. In particular, I can say that people who are afraid of programming should not be worried: there is almost no nuanced programming or complicated syntax or concepts. Very basic knowledge and hands-on experience with Python is enough. The only piece of feedback I have is that perhaps the vectorization part and the notions from very basic linear algebra could be slightly improved. Linear algebra is the main body of math I do every day for my job, and I know how better notation can help people understand concepts much more easily. But there is a trade-off, and maybe given this amount of time and the diversity of learners' backgrounds this is already the best job one can do.

By charles

•

Jun 8, 2020

I am seeing people complaining in the top reviews that the course is bad because there aren't stoppages for you to think and all that, or that the code is relatively simple to write.

What I want to say is that this is, in general, how you learn in school, and this is the exact style of teaching you get in school. The professor isn't going to pause here and there for you unless you raise your hand to ask questions. In my machine learning class there weren't even quizzes to check your understanding. The only thing missing in this course is a PROJECT, which is something an online course can't provide; there is no way the tutors could mark everyone's projects.

When you complain about the lab assignments: this is exactly the type of lab assignment you get in school, except that in school the variables might not even be given (but that is because tutors are within reach and you can ask them any time you want, which isn't the case for online learning; they have to make this doable for everyone).

For something that is free, I can't believe people are complaining about the quality; this is on par with my school's standard, and my school isn't bad at all.

By Randall S

•

Oct 5, 2017

Dr. Andrew Ng is brilliant and it is so amazing to have access to this type of knowledge for less than I spend on Starbucks in two to three weeks. I am taking some online courses at a big name university (to the tune of $4,000 per course), and for the money, this is a real bargain and just as good if not better!

The thing I liked most about this particular course is that it shows what's happening under the hood; it is not just a course on how to use tools, nor is it all theory. Dr. Ng also introduced us to Geoffrey Hinton, the pioneer of backward propagation, which was worth the price of admission alone.

That said, it was not so tough that I couldn't keep up. I would say that having some exposure to calculus helps, but it is not required. Also, ideally you would be more than just familiar with Python, but if you can spend a few extra hours per week on the course, you can work your way through it with just a familiarity with Python.

It has challenged me to keep going to the next level and complete the specialization -- AI is not rocket science -- at least not at the level of applying this knowledge. Being at Dr. Ng's level might be a different story.

Highly recommended!

By Anand R

•

Jan 29, 2018

To set the context, I have a PhD in Computer Engineering from the University of Texas at Austin. I am a working professional (13+ years), but just getting into the field of ML and AI.

I completed Dr. Ng's course on Machine Learning on Coursera first. I recommend that students of this course should first complete that course (or an equivalent one). This course was an excellent review of the basic concepts of Neural Networks. The lectures were well presented and the maths/equations were explained intuitively. The problem solving assignments were in Python (as opposed to Matlab). As before, Dr. Ng walked us through the assignments: hand-holding us through the solution. The quizzes were fairly challenging and helped me reinforce the concepts quite well.

I wish there were a few open problems (Kaggle style) at the end of the course so that the class students could compete with each other. It would be a good addition to the course. I would appreciate more real world examples throughout the course as well.

I look forward to completing the remaining courses! Thank you, Dr. Ng. Thank you, teaching assistants. Thank you, Coursera. This is truly a wonderful course.

By MD A

•

Jul 18, 2019

Thorough and simple explanations that help internalize the deep learning concepts. The video lectures are very helpful; listen more than once to clarify concepts. Very useful Jupyter notebook exercises with solutions that provide knowledge reinforcement. The vectorized form of the deep learning neural network equations enables development of clutter-free and faster, scalable solutions. Before taking the course, refresh your knowledge of linear algebra, esp. basic matrix operations such as matrix size and transpose, and their implementation in Python via numpy, such as numpy.dot for matrix multiplication and numpy.multiply for element-wise multiplication. Familiarity with Python's key:value dictionary data structure and retrieval of values via keys also helps. This knowledge will build confidence to code the functions and methods for forward propagation, back propagation, and gradient descent to update weights and biases. Also pay some attention to how indices in square brackets are used to identify matrices for inputs, outputs, parameters (weights and biases), activation values/models, the various layers of a neural network, and the nodes in a particular layer (all explained well in the lectures).
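As a quick refresher of the operations I mean, here is a small sketch of my own (not course code):

    import numpy as np

    A = np.array([[1., 2.], [3., 4.]])
    B = np.array([[5., 6.], [7., 8.]])

    print(np.dot(A, B))        # matrix multiplication: (2,2) x (2,2) -> (2,2)
    print(np.multiply(A, B))   # element-wise multiplication (same as A * B)
    print(A.T.shape)           # transpose swaps the dimensions

    # parameters are typically kept in a dict and retrieved by key
    parameters = {"W1": np.random.randn(4, 2), "b1": np.zeros((4, 1))}
    W1 = parameters["W1"]
    print(W1.shape)            # (4, 2)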

By Alex M

•

Oct 15, 2018

I've been impossibly busy and at first thought this was something I could play in the background while I did other work. It quickly became apparent that data I had been used to, with M.shape = (user/observation/etc, feature), was now the transpose. It took a few simple examples on paper to convince me why this is a superior notation for D/RNN architectures given numpy notation. I also at first thought that the bias should be added to W, X for greater expressibility of the relationship y = g(WX) and for the backprop updates that require 'estimating' the W.T*g^-1(y) and g^-1(y)*X.T (where y is understood as the general activation after layer l and X is the general output of the previous layer), but now I see why separating the bias is useful: it estimates the 'scale' of all the data at the output layer at once (estimating the unbalance in the marginal distribution, for example), whereas the other gradients come from estimating the perturbative deformation in the input layer, so they are slightly different from the perspective of forward/backward distributional learning. Bravo, and thank you!
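A small sketch of the convention I'm describing, with made-up sizes (my own illustration, not course code):

    import numpy as np

    # course convention: columns are training examples, rows are features
    n_x, m = 3, 5                     # 3 features, 5 examples
    X = np.random.randn(n_x, m)       # shape (features, examples), i.e. the transpose of (examples, features)

    n_h = 4                           # hidden units
    W = np.random.randn(n_h, n_x)     # weights map n_x inputs to n_h outputs
    b = np.zeros((n_h, 1))            # bias kept separate, as a column vector

    Z = np.dot(W, X) + b              # (n_h, m); b broadcasts across all m examples at once
    print(Z.shape)                    # (4, 5)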

By Mahesh G

•

Aug 29, 2017

Thanks for the course. The background maths that happens in neural networks is very neatly explained. This course will help you understand, step by step, what happens within the network. The step-by-step procedure explained by the Professor is great, and he has repeatedly stressed the important steps to make them clear. Along with explaining the formulas, the assignments help in implementing the formulas step by step and converting the whole thing into a neural network model, which is a great learning experience. One of the important things covered at the beginning of the course is vectorization and Python broadcasting, which are key for neural networks.

The pace at which the Professor explained the concepts is good and easy to follow, and the structure of the course is well laid out, which helps beginners.

One thing that could have been better is the assignments: the current assignments are definitely helpful for beginners like me, but there could be some more assignments that increase the complexity level (maybe that comes in subsequent courses).

Overall, a very good course that helped me a lot.

By Krishna k N

•

May 18, 2019

I admire Professor Andrew Ng's patience in helping students take baby steps, painting a big picture from each small pixel, just as a neural network is built.

This course has given me great exposure to how neural networks work, although I realize I need to take a Python course to be able to type code more freely and easily.

I'm going to do that next and then come back to the remaining courses in this specialization.

Feedback: it's really hard to visualize some of these matrices and their dimensions used in a large neural network with so many parameters, such as n_x features, m training examples, n iterations, and L layers with (n_L, n_(L-1)) weights and (n_L, 1) biases, etc. I understand it's hard to show these matrices in writing as they are very large. I wish someone would develop a more "animated" way of illustrating these matrices that would make the intuition stronger. For example, calculating the forward activation for all layers and all neurons across these layers by just passing X and the parameters is a massive operation, and the intuition stumbles purely because of the scale of such a matrix operation.
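To make the shapes a little more concrete, here is a tiny sketch of my own that just prints the dimensions for a made-up network (the sizes are hypothetical, not from the course):

    import numpy as np

    # hypothetical layer sizes: n_x input features, two hidden layers, one output unit
    layer_dims = [12288, 20, 7, 1]    # e.g. a 64x64x3 image flattened to 12288 features
    m = 209                           # number of training examples

    X = np.random.randn(layer_dims[0], m)
    print("X:", X.shape)
    for l in range(1, len(layer_dims)):
        W = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        b = np.zeros((layer_dims[l], 1))
        print(f"layer {l}: W{l} {W.shape}, b{l} {b.shape}")
    # weight matrices are (n_L, n_(L-1)) and biases (n_L, 1), exactly as described above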

By Eung P

•

Mar 9, 2021

This is a really excellent course. I am surprised to have learned the fundamentals of deep learning in only one month! I am very happy that I could try a real-world problem myself on my PC without a GPU. This was totally beyond my expectations.

Prof. Ng teaches all the details step by step without any logical jumps. Even those who do not know high-level math can catch up. He tried to relieve students' concerns and burdens any time I lost confidence. After completing the Programming Assignments, I felt comfortable. The assignments come with detailed explanations so that students can follow along without much trouble.

His fast handwriting was sometimes a little hard to read (e.g. "["), but the typed summaries cleared things up.

Having an interview with Prof. Geoffrey Hinton was another big addition to this course. Actually, this was when I started to gain more trust in this course. Frankly, I didn't know about Coursera or Prof. Ng before.

The mentors (esp. Paul) are excellent, with quick responses and sufficiently detailed guidance.

Without any hesitation, I give it 5+ stars.

By Nikos S

•

Dec 23, 2019

I am a CS student with an interest in AI; therefore, I was no stranger to Neural Networks even before I finished the first course. I decided to choose the DL specialization (my first on Coursera) to learn more about NNs and perhaps also to resolve some questions I had (e.g. "Why is a NN with an input layer, a hidden layer and an output layer called a 2-layer NN?"). Professor Andrew Ng answered those questions and also made me think of NNs more intuitively rather than just mathematically.

I found it really impressive that while students with no prior knowledge of calculus can take this course with no problem, others with a good knowledge of calculus, like myself, are not bored even when the most basic concepts are explained (such as derivatives). Also, I liked the cat image binary classification exercises a lot, especially since the classifier was able to classify the well-known cat meme image as a cat picture :P .

I am looking forward to the next courses of the specialization to also learn about other DL concepts I am not already familiar with, such as momentum and regularization.

By ANUJ K J

•

Jan 27, 2020

Things that I learned:
1) Introduction to deep learning
2) Logistic regression, and gradient descent on logistic regression
3) Forward and backward propagation
4) Computational graphs and how to use them to calculate forward and backward prop
5) Shallow neural networks, and how to work with them end to end
6) Deep neural networks and their end-to-end implementation on an application (classification of cat vs non-cat)

Pros: 1) The course is in Python. 2) The way Andrew Ng sir teaches is simple and memorable, as he starts from small concepts and relates the same concept to complex problems too. For example, he started teaching forward and backward prop using logistic regression, then carried the same over to a shallow neural net and finally to a deep neural net.

Cons: 1) The number of questions in the videos is smaller compared to the Machine Learning course by Andrew Ng. 2) There could have been more clarification on a few small topics. 3) The videos didn't provide any sort of notes or lecture slides. (A note that I found useful: https://lnkd.in/f3fZGCy )
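For anyone curious, items 2 and 3 under "Things that I learned" boil down to something like this minimal numpy sketch of logistic regression trained with gradient descent (my own illustration, not course code; the variable sizes are made up):

    import numpy as np

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    def propagate(w, b, X, Y):
        # one step of forward and backward propagation for logistic regression;
        # X has shape (n_x, m), Y has shape (1, m), w has shape (n_x, 1)
        m = X.shape[1]
        A = sigmoid(np.dot(w.T, X) + b)                            # forward: predictions
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))   # cross-entropy cost
        dw = np.dot(X, (A - Y).T) / m                              # backward: gradients
        db = np.sum(A - Y) / m
        return cost, dw, db

    # tiny made-up data, just to show the shapes
    X = np.random.randn(2, 10)
    Y = (np.random.rand(1, 10) > 0.5).astype(float)
    w, b = np.zeros((2, 1)), 0.0
    for _ in range(100):                                           # gradient descent updates
        cost, dw, db = propagate(w, b, X, Y)
        w -= 0.1 * dw
        b -= 0.1 * db
    print(cost)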

By Peter D

•

Dec 3, 2017

As usual, Prof. Andrew Ng knocks it out of the park!!! He would argue otherwise, but he's a natural born teacher whether he admits it or not. This was a challenging course, but I found the objectives to be achievable with a bit of hard work and cool-headed thought. Having taken Prof. Ng's Machine Learning course already, most of the material from the first two weeks of NN4DL was review. Unlike the broader ML course, DL was much more narrowly focused on concepts leading to mastery of deep neural nets. It also ditches the MATLAB/Octave used in ML for a more portable Python environment. I had basically no knowledge of Python when I started, so I guess I learned it in 4 weeks! :D My advice: take ML first, or you may be lost. I had the math and ML background for this stuff to make sense, so Python was the only thing entirely new to me. If you're fuzzy on calculus, or ML, or programming, I don't recommend starting with this course. But if you have a strong background on those things, you'll find this course is well worth your time! Good luck.