By Forest L
•Jun 19, 2020
Lectures are too short and the topics are overly simplified. Assignments are toy examples.
By Anand R
•Jun 19, 2020
This course seemed rushed and moved between depth and breadth very unsystematically. There were errors in the assignments and instructions, and the Python code in the assignments was also very unidiomatic (non-Pythonic) in many places.
By Frozhen
•Jun 21, 2020
I came to this specialization from Andrew's Twitter post and wanted to give it a try, since Andrew's DL specialization is very good. However, this course is not taught by Andrew, and the video lectures sound like the instructor is just reading a script; they did not really inspire me to follow along, since the videos are very dry.
By Juan d L
•Jun 25, 2020
The course is interesting, and it is built carefully in all aspects (videos and code), even though I followed it in its first days live.
The grading process is more about programming than about understanding NLP (classification and vector spaces). For instance, slight changes in the code (spaces, repeated question-identification code, not using the proposed functions...) cause a submission to fail. Error messages are frequently uninformative, and it is not possible to check part of the code (see W4).
The use of Slack is not appropriate; information is not easily accessible, it is unorganized, and it demands that students learn new tools and make additional payments.
I definitely spent more time on the code than on the NLP content.
Hopefully this comment is useful for students and teachers.
Thanks a lot
By Sarvesh K
•Jun 19, 2020
I would have liked it if Week 4's LSH and approximate hashing had been explained more clearly.
By Achkan S
•Aug 29, 2020
This course has too many problems as it stands:
1) They haven't chosen an audience: the concepts they explain are trivial for anyone with even basic machine learning (or even basic linear algebra) knowledge. However, that doesn't mean these explanations would be useful for beginners: they are too short and incomplete (the "videos" are on average 3 minutes long!!) and what they focus on is often not the most relevant part.
2) There is no reading material: no books, no papers, no theory. It wouldn't be a problem if the videos themselves were decent, but most of them are about 1 minute long. You can't explain machine learning in such a short time.
3) The code of the assignments, especially assignment 4, is unclean (e.g. unused variables) and contains minor bugs.
4) The script that grades the assignments has very strict requirements: as an example, very often, if you use x.dot(A) instead of np.dot(x, A), it complains and says you've failed. This happens for a lot of numpy functions, and it makes the process of submitting results tedious.
5) Again, regarding the course material itself, many of the key aspects are not discussed. For example, word embeddings are given that have some nice properties, but it's never explained how they were obtained.
Overall, it seems completely rushed.
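To illustrate the grader complaint raised in this and other reviews, here is a minimal sketch (the array values are made up, not from the course) showing that the spellings a string-matching grader distinguishes are numerically identical:

```python
import numpy as np

x = np.array([1.0, 2.0])
A = np.array([[3.0, 4.0],
              [5.0, 6.0]])

# Three equivalent spellings of the same vector-matrix product;
# a grader that scans the source for one spelling rejects the others,
# even though all three produce the same result.
r1 = np.dot(x, A)
r2 = x.dot(A)
r3 = x @ A

assert np.allclose(r1, r2) and np.allclose(r2, r3)
```

Since all three forms are idiomatic NumPy, rejecting two of them is a limitation of source-scanning graders rather than of the student's code.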
By Oleh S
•Jun 26, 2020
Quite a good starting course for those who have decided to study NLP. The materials are of good quality, but also too short. The course lacks depth and the lectures are too simple, so in order to deepen one's knowledge and understanding one has to read a lot of additional resources, which are not provided here. I have mixed impressions about this course. It seems the best DeepLearning.ai courses are those taught by Prof. Andrew Ng.
To sum up, I think the lecture duration should be increased and deeper intuition provided. The programming assignments are a piece of cake for an experienced programmer, but are OK for beginners. Also, there are many incomprehensible mistakes in the programming tasks, which I suppose will be fixed later. Nevertheless, I recommend this course for those who want to start a journey into the NLP world.
By Agrita G
•Jul 1, 2020
The course is interesting and useful; however, I have to admit that I was expecting more. More and deeper lectures, more tests, more coding. It sort of felt that currently it is too easy to pass all of the assignments and get the certificate without actually understanding the concepts taught in the course.
By ES
•Jun 25, 2020
The content is interesting. However, the assignments are too simple - the majority of the code is already written which defeats the purpose.
By Clement K
•Jun 30, 2020
A bit too easy. I wouldn't say no to more mathematical formalism, so that the course doesn't cover just the tip of the iceberg (especially for LSH).
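For context on the LSH comments in these reviews, here is a minimal sketch of random-hyperplane locality-sensitive hashing (the plane count and vectors are made up for illustration; this is not the course's assignment code):

```python
import numpy as np

def hash_vector(v, planes):
    # Random-hyperplane LSH: each plane contributes one bit,
    # set by which side of the plane the vector falls on.
    bits = (planes @ v) >= 0
    return int("".join("1" if b else "0" for b in bits), 2)

rng = np.random.default_rng(0)
planes = rng.standard_normal((4, 3))  # 4 hyperplanes in 3-D -> 16 buckets

v = np.array([1.0, 0.0, 1.0])
# Positive scaling does not change which side of each plane a vector
# falls on, so the bucket depends only on direction, not magnitude.
assert hash_vector(v, planes) == hash_vector(3.0 * v, planes)
```

Nearly parallel vectors tend to share a bucket, so a query only needs to be compared against the vectors in its own bucket; that is what makes the nearest-neighbor search approximate but fast.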
By Dmitry Z
•Jul 13, 2020
The autograder is ridiculous -- e.g., it insists that np.sum(X) is used instead of X.sum() [this is just one of many style examples].
By Zachary B
•Aug 7, 2020
I have to say I was pretty disappointed with this course. I think there are two main issues. 1) The choices about what to dive deep on were not helpful. I don't feel like I have a high level understanding of most of the topics covered. 2) The assignments were not helpful in furthering understanding. I hope the next courses in this sequence are better.
By sukanya n
•Jul 10, 2020
Pros: Good amount of subject coverage and many tips and useful demo notebooks.
Cons: Sometimes it feels like one has to guess what the grader wants in the exercises; at other times it feels like spoon-feeding. There could be a better balance, and better explanations in the error output. More test cases are needed in the assignments to verify intermediate function outputs. Also, sometimes the instructor mentions that you do not have to understand a concept, or that you can easily look it up online. This is not very encouraging because, of course, one can look things up on the internet. It would be better to explain things, however briefly. Taking a cue from Andrew's course "Neural Networks and Deep Learning": Andrew always explains things, even if briefly, and always gives you the intuition behind them.
All in all, recommended course to get started.
By Mohamed M
•Jul 25, 2020
Lacks depth and reading material
Still the same as all the recent watered-down MOOCs; I miss the deep courses that resemble university courses.
By Tanay G
•Jul 4, 2020
This is the first deeplearning.ai course that I found really boring, maybe because the material presented was quite superfluous.
By kk K
•Jul 15, 2020
Many of the assignments used code (functions and concepts) that was only explained in the lectures after the assignments. This was not good. Also, the videos were very short and unsatisfactory. They need a longer duration with more detailed explanation.
By Sharan N
•Jul 9, 2020
Not worth it. The content is not related to the latest deep learning methods.
By Sathvik J
•Jul 18, 2020
One of the best introductions to the fundamentals of NLP. It's not just deep learning; the fundamentals are really important for understanding how things evolved over time. Literally the best NLP introduction ever.
By Owais A
•Aug 17, 2020
Awesome. The lectures are very exciting and detailed, though a little hard and too straightforward sometimes, but YouTube helped with the regression models. Other than that, it was very informative and fun.
By mayur n
•May 25, 2021
Great course!
Very few courses implement algorithms like KNN, logistic regression, and Naive Bayes right from scratch, and it also gives you a thorough understanding of NumPy and Matplotlib.
By Harsh A
•Aug 9, 2020
One of the best courses I have attended on deeplearning.ai. The last week's assignment was really good to solve, covering everything we studied in the entire course. Eagerly waiting for Course 4 of the NLP specialization.
By John J M
•Aug 1, 2020
Video lectures are short and concise. The basic ideas are well presented. Some references for the details of vector subspaces and spanning vectors would have filled out the mathematical framework.
By Robert S
•Jul 24, 2020
General Comments on Course 1
All of the linguistic and semantic knowledge that we were mining in Course 1 was encoded in the vectors. The coding was just using statistical methods to draw linguistic inferences from these vectors. I find it unfortunate that we didn't have an opportunity to learn how the vectors themselves are made (or did I miss something?), but merely got them out of a can from Google.
It took me a while to figure out that the homework is graded by an AI, not a TA, and that one can submit the homework assignments numerous times until getting the grade you want. Given the large number of students enrolled, I can understand that hand grading would not be an option. It would probably be helpful to explain this to newcomers like me.
I liked the way the assignments are structured. The fill-in-the-blanks approach, followed by some sort of numerical unit test to let us know if our solution was correct, is good pedagogy (andragogy?). My only criticism is that sometimes the unit tests are not very sensitive to common coding errors. But now I know that we always have the option of running our assignment through the autograder for more complete feedback.
The autograder is often overly prescriptive. For example: "Function 'np.sign or np.heaviside' not found in Code Cell UNQ_C17." Python is a rich language, and there are many ways to code C17 without using those particular functions. The goal should be to get students to solve a problem creatively -- not to follow a particular path.
I found the Week 4 assignment a real bear -- too long (22 completion sections) in comparison with the others. I'm sure it was difficult for whoever codes the autograder as well. They need to do some more code checking. In UNQ_C8 we were asked to use the pre-coded cosine_similarity function. Initially this was returning the cosine difference, which is quite the opposite.
In the middle of last week, after I and others reported the error, it was corrected, but this seems to have created a cascade of other errors. For example, in cell UNQ_C9 we are told "Expected Output: [[9 9 9] [1 0 5] [2 0 1]]", which is wrong. A look at the vectors is enough to see that (2,0,1) is closest to (1,0,1) by cosine similarity. I believe the autograder makes the same confusion. I mention this not to criticize our hard-working programmer, who is otherwise doing an excellent job, but so that the errors get fixed ;-) (edited)
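The reviewer's claim is easy to check with the standard cosine-similarity formula (this is an independent sketch, not the course's grader code, and the query vector (1, 0, 1) is inferred from the review):

```python
import numpy as np

def cosine_similarity(a, b):
    # cos(theta) = a.b / (||a|| * ||b||)
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

query = np.array([1, 0, 1])
candidates = np.array([[9, 9, 9],
                       [1, 0, 5],
                       [2, 0, 1]])

sims = [cosine_similarity(query, row) for row in candidates]
best = candidates[int(np.argmax(sims))]  # -> array([2, 0, 1])
```

This agrees with the review: (2, 0, 1) has the highest cosine similarity to (1, 0, 1) of the three candidates, so if these are the vectors in question, the listed expected output does look inconsistent.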
By Фридман Р Г
•Jul 5, 2020
A pretty simple course with some basic concepts superficially explained. Good feedback and help system through Slack channels for all weeks of the course. However, it fails to provide a deep understanding of the concepts it is trying to present.
By Chengzhi L
•Jun 22, 2020
A fair level of difficulty for people with no background in NLP. The assignments are carefully designed to help the student to understand what he/she is doing. Looking forward to the next course!