MN
May 24, 2021
Great course.
Very few courses implement algorithms like KNN, logistic regression, and Naive Bayes right from scratch, and it also gives you a thorough understanding of NumPy and matplotlib.
SJ
Jul 17, 2020
One of the best introductions to the fundamentals of NLP. It's not just deep learning; the fundamentals are really important for understanding how things evolved over time. Literally the best NLP introduction ever.
By Sajal J
•Jul 23, 2020
Course is too easy.
By Asanka R
•May 11, 2021
Well explained!!
By Luis M A P
•Jul 4, 2020
really good
By samiha e m
•Oct 14, 2023
excellent
By Haoxiang Z
•Jul 7, 2020
decent
By MoChuxian
•Oct 19, 2020
nice!
By M n n
•Nov 1, 2020
Nice
By ramalingom
•Aug 6, 2020
Good
By Mark J O
•Dec 3, 2021
It's really hard to rate this course.
Pros:
- I think the coverage of the material in the lectures is excellent, and it does a good job of simply explaining some pretty complex topics.
- The instructors did a good job of pulling together real-world-relevant examples of applications, which made me feel more motivated to continue working on the material.
- The pacing is fast, but I didn't feel overwhelmed.
- There are nice visualizations.
- The instructors are really friendly and enthusiastic.
Cons (serious, nearly crippling cons):
- The autograding tests are broken on at least one lesson, meaning that even someone who meets the specifications may lose points. Nobody seems to be in a hurry to fix these problems.
- The quality of the code is frequently ATROCIOUS. Whoever wrote the code failed to understand basic things like the fact that lookup time in dictionaries is O(1) and that you don't need to use the keys() method to iterate through a dictionary (see the sketch after this review). It's not all bad: there is plenty of reasonably well-written code in the course, as well as code that looks wonky but isn't really that bad. But there's also a lot of code that is very poorly written and inefficient.
- There's also a lack of consistent style in the code, which isn't wrong per se, but it really makes the content harder to read. In particular, there should be no "'string1' + str(x) + 'string2'" syntax, which is a bad habit that I had to break a while back; f-strings are the way to go (also illustrated in the sketch below).
- To conclude, this is a SERIOUS PROBLEM, because some students will be learning "good coding practices" from this course and others like it, and if they learn about some of the relevant Python libraries here, they may pick up terrible habits.
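As a hedged illustration of the two coding points above (dictionary iteration and string formatting), here is a minimal sketch; the freqs dictionary and the word/count names are hypothetical examples, not code from the course notebooks:

```python
# Minimal sketch of the two points above; `freqs` is a hypothetical
# word-count dictionary, not taken from the course notebooks.
freqs = {"happy": 3, "sad": 1}

# Membership tests and lookups on a dict are O(1) on average;
# there is no need to go through keys() or scan the keys manually.
if "happy" in freqs:          # preferred
    count = freqs["happy"]

# Iterating a dict yields its keys directly, so .keys() is redundant,
# and .items() gives key and value in one pass.
for word, count in freqs.items():
    # f-strings are clearer than 'str1' + str(x) + 'str2' concatenation.
    print(f"{word} appears {count} time(s)")
```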
By Simon P
•Nov 14, 2020
It's clear that the creators of this course could not decide who it was going to be aimed at, or what level it would be. So, you end up with a course that is too light on the NLP but assumes anyone doing the assignments knows the little numpy and dictionary tricks that they do. Consequently, the assignments do not test your understanding of NLP, only your understanding of how the notebook creators code.
The videos are far too short, a common complaint I can see in other reviews. Additionally, they fell into a common trap that plagues script writing for education. What you absolutely must not do, and which is exactly what they do, is machine-gun through the information and terminology when presenting. If you watch good lecturers, they leave time for concepts to settle in and they reinforce key points by restating them in a different way. They know how to hit the beats because they know how people learn. An information dump, as we have here, is a poor didactic method.
The assignments are mostly okay and use notebooks where you have to 'fill in the blanks'. There are some flaws with this, the first being that you have to write the code in the format they want, so alternative methods are marked as wrong. Even more severe is that there are insufficient checks in some of the later notebooks. It is possible to get far into one and obtain the expected results, only to have one cell give the wrong result. This means the error is in an earlier cell and you have no way of knowing where it is without spending a long time exploring. This problem is especially bad in the final week's assignment, which is overly long and has an insufficient number of checks.
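A minimal sketch of the kind of intermediate check this review is asking for (my own illustration; the sigmoid helper is hypothetical, not from the notebooks): a small assertion right after each cell localizes an error immediately instead of letting it surface many cells later.

```python
import numpy as np

# Minimal sketch (hypothetical helper, not notebook code): a tiny sanity
# check right after the cell that defines the function, so an error is
# caught here rather than several cells later.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Known values: sigmoid(0) == 0.5, and large inputs saturate towards 1.
assert np.isclose(sigmoid(0.0), 0.5)
assert sigmoid(10.0) > 0.999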
By Maury S
•Feb 22, 2021
This course has a lot of promise as an introduction to NLP methods. It does a clear job of introducing logistic regression, Naive Bayes, and basic concepts of embeddings. However, I have some significant reservations about the current state of the course.
First, the course is introduced by Andrew Ng as being taught by Younes Bensouda Mourri and Lukasz Kaiser, and heavily promoted by Andrew's marketing through deeplearning.ai and The Batch. In reality, Andrew is barely involved (except for a couple of excellent optional interviews), Lukasz says a sentence or two at the beginning of each lecture, and Younes handles the lectures. Younes is just fine as a teacher, but it is clear he is reading from scripts and one feels as if the course was advertised as being taught by a more senior team. It does not have anything like the feel of authority of Andrew's classic Machine Learning course on Coursera.
Second, there are various small errors in the materials. For example, one slide set has numerous calculations wrong because a column of numbers is summed to 12 rather than 13, and the course has a small notice about the error rather than correcting the slides. There are various confusing instructions (and some small errors) in the programming assignments.
Third, some of the choices of content were odd. I did not understand why week 3 spent much of the programming assignment on the details of implementing PCA (which is a visualization technique, not an NLP technique), without really teaching the underlying math.
In sum, this is a good introduction to NLP concepts but as yet below the standard that one expects in the Andrew Ng universe.
By Sandie C
•Jan 10, 2022
There is room for improvement in this module.
Please find below some comments that may help improve the content:
1. I found that there is a lack of perspective on the place of these methods in NLP processing chains and how they are combined.
2. Errors are present in the slides and should be corrected in the content as well (especially in week 2: Vclass vs V, plus an incorrect addition of frequencies, which made several slides inaccurate).
3. Using the term "dot products" limits the understanding of the transformations performed in vector space. In my opinion, it would be worthwhile to say/write "projection" instead on the slides (see the sketch after this review).
4. Make a clear plan within each week that recalls the place of each concept in NLP processing. It will give better insight into the importance and place of each concept by keeping in view the ultimate goal of producing an NLP processing chain.
5. Make a better distinction between sentences (tweets) and documents.
6. The content is, in my opinion, too fragmented.
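To make the dot-product/projection remark in point 3 concrete, here is a minimal sketch (my own illustration, not a slide from the course): the dot product with a unit vector is exactly the scalar projection onto that direction.

```python
import numpy as np

# Minimal sketch (not course material): the dot product with a unit
# vector equals the scalar projection onto that direction.
a = np.array([3.0, 4.0])
b = np.array([1.0, 1.0])
b_unit = b / np.linalg.norm(b)

scalar_projection = np.dot(a, b_unit)           # length of a along b
vector_projection = scalar_projection * b_unit  # component of a along b
print(scalar_projection, vector_projection)
```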
By Eliecer A M V
•Jun 23, 2021
The notebooks are average. Some cells are marked as "hint" and contain information about the functions required to pass the tests when I submit the notebook. On the other hand, some functions are equivalent; for example, np.dot is equivalent to the dot method of a NumPy array instance (see the sketch after this review). However, some tests check whether cells contain np.dot even when the output is correct. This is confusing, and I spent a lot of time checking my cells even though the output was correct. In conclusion, the notebooks could be better: instead of saying "print the variable x to learn about its structure", you could use visual aids to show what we are doing; I'm tired of checking my variables, checking the structure, and checking every cell over and over. Another piece of advice: please stop using lists, sets, and NumPy arrays together for no reason; just use dictionaries and NumPy arrays, and that's all! I'll rate this as 3 stars because of the notebooks. I learned a lot, and thanks to the teachers.
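For the equivalence mentioned above, here is a minimal sketch (my own example, not notebook code): all three spellings below compute the same matrix product, which is why grading on the literal text np.dot rather than on outputs feels arbitrary.

```python
import numpy as np

# Minimal sketch (not notebook code): three equivalent spellings of the
# same matrix product; a keyword-based grader would accept only the first.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])

r1 = np.dot(A, B)   # module-level function
r2 = A.dot(B)       # ndarray method
r3 = A @ B          # matmul operator

assert np.array_equal(r1, r2) and np.array_equal(r2, r3)
```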
By Santiago H
•Sep 5, 2020
The video explanations are too mathematical, and maybe a bit short, as they cover complex information that could be explained further.
I think the final test after the week's videos was a good idea, and I missed it here.
The "intermediate" code "assignments" should also ask you to code stuff, not just read what other person has coded, in my opinion.
Finally, I think the final programming assignments are not very well explained (especially compared to those of the Deep Learning Specialization); I was very lost most of the time and didn't really know what I was doing beyond applying some math. Actually, in the last assignment, which was supposed to be a translation from English to French, I didn't see the translation at any point at all. I basically don't know what I coded there.
By Wenzhe X
•Sep 14, 2021
The course is clear and understandable for people who have a basic ML background. It's very practical too. I never got bored.
I'd give it 3 stars as I think the grading system for the assignments should be better. For example, when it grades KNN, it requires the output cluster elements to follow an exact order, so you won't get the point if the output cluster (as a list) is in a different order, even though it contains the same elements, which makes little sense (see the sketch after this review). The funny thing is that the grading standard for KNN requires the cluster elements to be ordered by distance from largest to smallest, not the other way around. Meanwhile, I did suspect that some of the assignment answers were incorrect. However, no one has replied to my post in the forum.
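To illustrate the ordering issue described above, here is a minimal sketch (my own example, not the actual grader code): the same set of nearest neighbours can be returned in ascending or descending distance order, and a strict grader would accept only one of the two.

```python
import numpy as np

# Minimal sketch (not the course grader): the same k nearest neighbours,
# ordered two different ways; a strict grader accepts only one ordering.
query = np.array([0.0, 0.0])
candidates = np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 0.0], [0.0, 4.0]])

distances = np.linalg.norm(candidates - query, axis=1)
k = 3
ascending = np.argsort(distances)[:k]   # nearest first
descending = ascending[::-1]            # farthest of the k first

print(ascending, descending)  # same neighbours, different order
```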
By Kiril P
•Jul 20, 2022
It looks like a first sketch of the course. All programming assignments have "try your own examples, check what you just implemented". Week 4 translation: after performing gradient descent I didn't have the opportunity to test the translation. Or the buggy example with country-capital matching: "if you use Euclidean distance, Ankara is the capital of Japan". After the Deep Learning Specialization, where Andrew Ng so passionately explains everything to you, guys who are reading from a teleprompter look boring and unprofessional. As the saying goes, "if you can't explain it in YOUR OWN words, you don't understand it." But the labs were useful, and sometimes the hints too. That is good. I guess most of the topics were covered; thanks for that.
By Matt R
•Jan 28, 2022
I had a problem with the grading for assignment 4 in course 1, and after quite a bit of troubleshooting and posting a forum question, I found a similar problem with a suggestion that helped me fix it. I would have liked to get some feedback from an instructor directly to save 2-3 days of struggling to resolve the grading glitch myself.
As a first course in NLP it is okay, but as others have said, most of the code is already written. If you want to understand the intuition of NLP this can be good, but it is then difficult to apply yourself in practice.
By Reza D U
•Mar 11, 2021
If you compare this course to those taught by Andrew directly, this course is somewhat lacking. I love how Andrew teaches his students (as he did in the ML course and the DL Specialization), writing directly on the screen and speaking naturally rather than like a robot (yeah, you just read text when explaining something).
There are many mathematical concepts but the explanations are lacking.
You placed too much coding assistance in the programming assignments, making them just like fill-in-the-blank questions. Not challenging.
By Alberto S
•Aug 4, 2020
Some videos could be better presented. For instance, start by explaining that k-NN will not be implemented the usual way, but with a fast approach.
Also, the validation of the submissions could be better. np.array(list) works just like np.vstack(list), del(foo[bar]) works just like foo.remove(bar), and matrix.squeeze() works just like np.squeeze(matrix) (see the sketch after this review).
I know it is difficult to cover all the possible code combinations. But the code could probably be tested more as input/output than by grepping it for keywords.
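As a hedged illustration of the equivalences above (my own example, not submission code): the snippets below produce identical arrays, so an input/output check would accept either spelling, while a keyword grep would not.

```python
import numpy as np

# Minimal sketch (not submission code): equivalent spellings that an
# input/output check would treat the same, but a keyword grep would not.
rows = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]

m1 = np.array(rows)    # builds a (2, 2) array from the list of rows
m2 = np.vstack(rows)   # stacks the rows vertically into the same array
assert np.array_equal(m1, m2)

col = m1[:, :1]                                         # shape (2, 1)
assert np.array_equal(col.squeeze(), np.squeeze(col))   # method vs function
```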
By Moustafa S
•Aug 31, 2020
The material is super basic and from scratch, like something from the stone age. There are many, many ways we could have covered KNN and other ML methods using sklearn; we wasted so much time implementing those. Also, please work on your communication skills, as I was not comfortable with the instructors being so uncomfortable in front of the camera. Not a huge deal, it's just a tip. Hopefully the rest of the specialization is better; looking forward to it.
By Laurence G
•Mar 10, 2021
The course is good, not great, mainly let down by a few presentation issues, mistakes in slides and lab code, and a rather picky grader. This sounds worse than it is: there are only a few of each issue, and in some ways it's a good test to see if you're paying attention :) Overall it's a decent course to brush up on some NLP foundations, but you'd better have a good background in linear algebra if you don't want to take a large tangent.
By Jakub S
•Dec 6, 2021
I must admit that I am quite disappointed with this course. The explained material was interesting, but there were many errors in the assignments. It happens that people get 0 points for some exercises because of internal problems with the tests. In such cases, it is not even clear what exactly the problem is. Hence you must do the exercises exactly as the professors wanted, even if there are other approaches.
By Michal
•Jun 27, 2023
The course touches on important concepts, but only on the surface, not really explaining things in depth.
Three-minute videos explaining PCA or LSH just seem too short.
Also, a big minus for the lack of support/forum for the course: one assignment contained a typo, and unless it was fixed, you'd always get a negative grade. But there was nobody to talk to...
I'm sure Deeplearning.ai can do better.
By Dmitriy D
•Apr 21, 2021
Not bad for beginners. The assignments are quite easy as they are almost done for you; however, it is useful to look carefully at them to understand the idea behind them.
I rated it 3 stars as this course is not really about NLP, but more about other stuff. For example, the week 4 hashing tables exercise is interesting but not NLP-related; it's more about efficient KNN search (see the sketch after this review).
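For context on the "hashing tables for efficient KNN" exercise mentioned above, here is a minimal sketch of random-hyperplane locality-sensitive hashing, the general idea behind approximate nearest-neighbour search; it is my own simplified illustration, not the assignment code.

```python
import numpy as np

# Minimal sketch of random-hyperplane LSH (not the assignment code):
# vectors falling on the same side of every random plane share a bucket,
# so a nearest-neighbour query only needs to scan its own bucket.
rng = np.random.default_rng(0)
dim, n_planes = 4, 3
planes = rng.normal(size=(n_planes, dim))

def hash_vector(v):
    # One bit per plane: which side of the plane the vector lies on.
    bits = (planes @ v >= 0).astype(int)
    return int("".join(map(str, bits)), 2)

vectors = rng.normal(size=(10, dim))
buckets = {}
for i, v in enumerate(vectors):
    buckets.setdefault(hash_vector(v), []).append(i)

query = vectors[0]
candidates = buckets[hash_vector(query)]   # search only this bucket
print(candidates)
```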
By Andrew D
•Dec 28, 2021
I found too many issues, along with a lack of clarity, in the programming assignments. It definitely needs to be refined.
The lectures are decent, and the reading sections, which display the previous presentation contents, are nice.
The additional labs are just OK, not great. Some contain a lot of code that was not exactly clear, so much of it was skipped.