
Learner Reviews & Feedback for Natural Language Processing with Probabilistic Models by DeepLearning.AI

4.7 stars (1,714 ratings)

About the Course

In Course 2 of the Natural Language Processing Specialization, you will: a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming, b) Apply the Viterbi algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics, c) Write a better auto-complete algorithm using an N-gram language model, and d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
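
As a taste of the first technique listed above, here is a minimal sketch of minimum edit distance computed with dynamic programming. It is illustrative only; the function name, the cost convention (1 for insert/delete, 2 for replace), and the example are assumptions, not the course's actual notebook code.

# Illustrative sketch: minimum edit distance (Levenshtein) via dynamic programming.
# Costs of 1 for insert/delete and 2 for replace are one common convention.
def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
    m, n = len(source), len(target)
    # D[i][j] = minimum cost of turning source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            r_cost = 0 if source[i - 1] == target[j - 1] else rep_cost
            D[i][j] = min(D[i - 1][j] + del_cost,
                          D[i][j - 1] + ins_cost,
                          D[i - 1][j - 1] + r_cost)
    return D[m][n]

print(min_edit_distance("play", "stay"))  # 4 with these costs (two replacements)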

Top reviews

NM

Dec 12, 2020

A truly great course, focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).

HS

Dec 2, 2020

A neatly organized course introducing students to the basics of processing text data, learning word embeddings and, most importantly, how to interpret the word embeddings. Great job!!


26 - 50 of 294 Reviews for Natural Language Processing with Probabilistic Models

By Andrei N

Jul 11, 2020

A great course in the very spirit of Andrew Ng's original ML course, with lots of details and explanations of fundamental approaches and techniques.

By Minh T H L

Jul 31, 2020

Thanks for sharing your knowledge. I was happy during the course, and I also left a couple of pieces of feedback for minor improvements. All the best.

By Ajay D

Aug 17, 2020

The course was very insightful about the latest enhancements in the field of NLP. The exercises were designed to be very hands-on, and I loved that. However, I felt a bit incomplete as I didn't see any large dataset in action; maybe my expectation was wrong. I was also wondering if I could get to see some more applications of these language models and word embeddings in the course.

By Kravchenko D

Aug 21, 2020

Nice course, but the assignments in this course are less practical than in the first course of this specialization. The last assignment in this course was implementing word embedding generation using your own neural network. The whole process of writing your own neural network is nice, except that the resulting word embeddings look very bad and ungrouped on the plot, and the text in the notebook says: "You can see that woman and queen are next to each other. However, we have to be careful with the interpretation of this projected word vectors" without an explanation of what's wrong with the results. So I think the last assignment should be reworked by the reviewers to have illustrative results at the end, not just "the word embeddings at the end are bad, bye-bye, see you in the next course."

By Vitalii S

Jan 9, 2021

Good information and presentation, but the grading of the functions needs work.

The problem is when you do something wrong in function C7, but from a code and results point of view the grader thinks it is OK. Only at C11 do you figure out that something went wrong, and then you waste time searching for what is incorrect, all the way down to C7.

I think you should add some kind of unit testing of the functions to make sure that each one is really correct.

By Kartik C

May 8, 2021

The content and lectures are very good, but the assignments are overly restrictive, forcing you to do things in exactly one way and giving you no room to try (and maybe fail sometimes) while exploring different ways of doing something. It feels like during the assignments you are not learning anything, just doing what you are told to do.

By Amlan C

Sep 17, 2020

Too many gaps in the course. Many concepts are not covered in the mathematical sense; basic gradient descent math would have been helpful. Also, if you want to omit it totally, you should have at least one lab on how one would do it in real life, and using which library: PyTorch? Keras? What? The rest of the course is okay. Younes is great at explaining.

By Sina M

May 14, 2023

Compared to prior DeepLearning.AI courses, the lecturers were very robotic and unnatural. The explanations were much less clear, and less effort was made to explain the intuitions behind the formulas.

By Benjamin W

Jul 14, 2023

I cannot give a good rating for this specialization. The teaching style seems antiquated to me. The instructors read text like an AI from a teleprompter (you can even see their eyes moving). Sometimes the audio quality is not good (a popping noise because a bad microphone or headset was used). The Jupyter Notebooks for the assignments often hang. Moreover, I think the quality of the code could be better. I'm working through the Specialization because my employer paid for it; I can't say it's very motivating. Despite my negative review, I learned something from it.

By Rajaseharan R

Jan 28, 2022

All quizzes and assignments need to be revised and tested again. There seem to be multiple errors in the provided functions. There are also some quiz questions that don't make sense (there is no actual question), and quizzes that don't follow the week's lectures. Some of the exercises need more help in the code comments, as they are not covered in the lecture, e.g. the Week 4 back_prop assignment.

By SERIN A

Mar 22, 2021

Do not waste your time, these very basic explanations of concepts barely teach you anything.

By Rahul P

Apr 11, 2022

Poor theory, and the instructor speaks fast. No lectures from Łukasz.

By Dave J

Jan 25, 2021

This course gradually ramps up the sophistication and interest from the first course in the NLP specialization.

Week 1: Autocorrect and Minimum Edit Distance is OK, nothing to write home about but gives you a sense of how a basic autocorrect mechanism works and introduces dynamic programming.

Week 2: Part of Speech Tagging explains Hidden Markov Models and the Viterbi algorithm pretty well. More of a sense here of learning something that will be a useful foundation.

Week 3: Autocomplete and Language Models explains what a language model is and builds a basic N-gram language model for autocompleting a sentence. Again, good foundations.

Week 4: Word embeddings with neural networks was for me the most interesting part of the specialization so far. The amount of lecture & lab content is considerably higher than in the previous weeks (which is a good thing in my view). The pros and cons of different ways of representing words as vectors are discussed, then different ways of generating word embeddings, from research papers dating from 2013 to 2018. The rest of the week focuses on implementing the continuous bag-of-words (CBOW) model for learning word embeddings with a shallow neural network. The whole process, from data preparation to building & training the network and extracting the embeddings, is explained & implemented in Python with NumPy, which is quite satisfying.

I found that the labs and assignments worked flawlessly. They are largely paint-by-numbers, though; I would have liked to be challenged and made to think more. The teaching is pretty good, though there's room for improvement. It tends to focus a little narrowly on the specific topic being covered and has the feel of reading a script. What I would like to see is more stepping back, thinking about and explaining the larger context of how the topic fits into current NLP and the student's learning journey, then engaging with the learner on this basis. I did feel this course was a little better than course 1 in that regard. Overall 4.5 stars, but as there are no half stars, I'm going to let week 4 tip it up to 5.
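
To make the Week 4 CBOW pipeline described in the review above a bit more concrete, here is a minimal NumPy sketch of the forward pass of such a model. The dimensions, initialization, and variable names are assumptions for illustration, not the course's assignment code.

import numpy as np

# Illustrative CBOW sketch: predict a center word from the average of its
# one-hot context vectors, using a single hidden layer. Sizes are arbitrary.
V, N = 5000, 50                     # vocabulary size, embedding size (assumed)
rng = np.random.default_rng(0)
W1 = rng.normal(0, 0.1, (N, V))     # input -> hidden weights
W2 = rng.normal(0, 0.1, (V, N))     # hidden -> output weights
b1 = np.zeros((N, 1))
b2 = np.zeros((V, 1))

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))
    return e / e.sum(axis=0, keepdims=True)

def forward(x):
    """x: (V, 1) average of the one-hot vectors of the context words."""
    h = np.maximum(0, W1 @ x + b1)  # hidden layer with ReLU
    y_hat = softmax(W2 @ h + b2)    # predicted distribution over the center word
    return h, y_hat

# After training (not shown), one common choice is to take the embeddings as
# the average of the two weight matrices:
# embeddings = (W1.T + W2) / 2      # shape (V, N), one row per word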

By Yuri C

Dec 29, 2020

I enjoyed this second course in the NLP specialization very much! I must say, once again the balance between mathematical formalism and hands-on coding is just on point! This is also not easy to achieve. I also quite enjoyed the infographics about the word embedding model developed during the course. I have been reading blog posts and papers about the technique for some time now, and I have not seen a better explanation than the one in this course, chapeau! Nevertheless, there are also points of improvement to consider. One of my main concerns is that at the end of some assignments, there is very little discussion about the validity and usefulness of what we get at the end, although a lot is put forward in the motivation. For example, while building the autocomplete, a lot of time was dedicated to motivating why this is useful and why one should learn it, but at the very end of the week, when we finally build one with Twitter data, there is very little analysis of the results. This is a bit frustrating. Of course, one cannot build very useful models within an assignment in a Jupyter notebook; nevertheless, I am positive that you can also find a good balance here between analyzing the model's outputs and asking whether we indeed achieved the goal we set at the beginning, and if not, why not, etc. Clearly, assignments are not research papers, but a bit more careful treatment on that end would let this course achieve its full potential. Keep up the good work!

By John Y

Dec 8, 2021

I really enjoyed the first two courses so far. I finished the DL Specialization before I took this NLP sequence. I'm glad that they tended to focus on the basic and essential concepts and not too much on details like data cleaning, although they do show you how these things are done too. But there is really a lot to digest. These courses could potentially fill semesters in school, but I think they covered the important material well. I especially enjoyed learning new things like hashing, dynamic programming, and Markov models. I found the labs very helpful because they divided the amount of material to digest into smaller portions. I also found them very helpful for the homework. Although some people didn't like the short videos, I actually liked them. They were mostly to the point and easier for me to review. People comment that they miss Andrew Ng's lectures. Unfortunately, I don't think Andrew can teach many more classes because he's busy with many things, although that would be awesome. However, I think that Younes did a great job of teaching. If I understood what was said and am able to do the quizzes and homework, then I'm good. Eventually, we're gonna have to work and think independently anyway. I think the NLP courses tend to steer us towards that kind of work habit. Thanks Younes, Łukasz, and the rest of the team for these awesome and wonderful classes! On to courses 3 and 4!

By Jonathan D

May 1, 2022

Really good course. It covers many topics thoroughly and succinctly, with very detailed implementations.

As feedback, I'd say it's often more difficult to follow and satisfy the "rigidity" of the implementations than the concepts or notions being taught. It can feel more like a course on how well I understand the way this was designed and implemented than on the ideas about natural language processing that underpin it. In the same breath, I'd say that these implementation methods are of course now in my toolbox and they have shaped my way of thinking.

By Leena P

Oct 5, 2020

I enjoyed Younes's teaching style and the specialization's course structure of asking quiz questions in between the lectures. Also, the ungraded programming notebooks give grounding and hints while allowing the graded work to be challenging and not completely obvious. Thanks to the whole Coursera team for sharing such deep knowledge so universally and easily. This sharing of knowledge with all who seek it is, I think, the hope for AI to stay relevant and not get lost in hype.

By Shrikrishna P

Aug 15, 2023

Thanks to the instructors Łukasz Kaiser, Eddy Shyu, and Younes Bensouda Mourri for their lucid way of explaining mathematics and NLP concepts. I would definitely use the learnings from this course.

This course has given me the confidence to work on NLP challenges and understand the mathematics behind the algorithms. Looking forward to reading research papers and implementing them.

Thank you once again. Long live Coursera. Long live DeepLearning.AI.

By SHASHI S M

Dec 25, 2020

I learned auto-correction using the minimum edit distance algorithm, part-of-speech tagging with the Viterbi algorithm, autocomplete using an n-gram model, and word embeddings by applying the continuous bag-of-words model. This week was a little tough, and I got great hands-on experience in NLP. It changed my thoughts about NLP. This week was amazing. I worked with the nltk library and created a neural network to train a model for word embeddings.

By Kshitiz D

Dec 7, 2021

Thank you Coursera for the financial aid! I was able to dive deeper into NLP, learn Autocorrect, Markov chains, Autocomplete, and word embeddings. The course was amazing throughout and practical assignments were nicely prepared to give one a complete overview of the things taught in a particular week. The thing I didn't like about the course was the repetition of problems in quizzes.

By Nishant M K

Mar 31, 2021

Great course! As in course 1 of this specialization, the REAL value lies in the Python/Jupyter notebooks, which have a great mix of filling out key steps along with very detailed and pointed descriptions. The lecture material is also very helpful in 'orienting' students, and the coding assignments are where the actual learning happens. I would very much recommend this course!

By Nilesh G

Jul 29, 2020

Great course, with nice content from basic to advanced NLP. The coverage of topics such as word embeddings, POS tagging, and auto-completion was very good. The assignments are challenging, but you learn a lot of things through hands-on practice, and the hints are useful. Looking forward to completing the remaining courses of this NLP specialization. Thanks to all the instructors.

By Cristopher F

May 8, 2021

This is an exciting course. This course will not make you 100% ready for the real world, but it will give you directions that you can follow by yourself. I think the purpose of learning is not to be stuck somewhere while losing your mind. It's to build a foundation where you can find your own path.

By Jessica

Jan 8, 2021

Most of this knowledge is new to me, but I really enjoyed all the course content. I hope the autocomplete model could also show me how to predict the next 2-5 or even more words based on N-gram models; the autocomplete assignment only includes cases that predict the next single word.
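
One simple way to get the multi-word completion this reviewer asks about is to apply a next-word model repeatedly, feeding each predicted word back in as the new context. Below is a toy bigram-count sketch with a made-up corpus and greedy selection; it is an illustration of the idea, not the assignment's code or data.

from collections import Counter, defaultdict

# Toy illustration: greedy multi-word autocomplete by chaining a bigram
# next-word model. The corpus and the lack of smoothing are simplifications.
corpus = "i like to learn nlp and i like to build language models".split()

bigram_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigram_counts[prev][nxt] += 1

def complete(word, n_words=4):
    out = []
    for _ in range(n_words):
        candidates = bigram_counts.get(word)
        if not candidates:
            break
        word = candidates.most_common(1)[0][0]  # greedy: most frequent next word
        out.append(word)
    return out

print(complete("i"))  # ['like', 'to', 'learn', 'nlp']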

By Soopramanien V

Sep 30, 2020

Great course to learn word embeddings. The instructors are excellent at explaining key concepts in a very clear and concise way, and the accompanying assignments and labs serve their purpose in getting hands-on experience with implementing many of these NLP models.