
Learner Reviews & Feedback for Natural Language Processing with Probabilistic Models by DeepLearning.AI

4.7 stars (1,714 ratings)

About the Course

In Course 2 of the Natural Language Processing Specialization, you will: a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming, b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics, c) Write a better auto-complete algorithm using an N-gram language model, and d) Write your own Word2Vec model that uses a neural network to compute word embeddings using a continuous bag-of-words model. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper...
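To give a concrete flavor of item a), here is a minimal sketch of minimum edit distance computed with dynamic programming. This is an illustrative example only, not the course's assignment code; the function name min_edit_distance and the insert/delete/replace costs of 1/1/2 are assumptions chosen for the demo.

    # Minimal sketch of minimum edit distance via dynamic programming.
    # Illustrative only; the 1/1/2 insert/delete/replace costs are assumptions.
    def min_edit_distance(source, target, ins_cost=1, del_cost=1, rep_cost=2):
        m, n = len(source), len(target)
        # D[i][j] = minimum cost of converting source[:i] into target[:j]
        D = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            D[i][0] = D[i - 1][0] + del_cost
        for j in range(1, n + 1):
            D[0][j] = D[0][j - 1] + ins_cost
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                r = 0 if source[i - 1] == target[j - 1] else rep_cost
                D[i][j] = min(D[i - 1][j] + del_cost,   # delete a source character
                              D[i][j - 1] + ins_cost,   # insert a target character
                              D[i - 1][j - 1] + r)      # replace (or keep if equal)
        return D[m][n]

    print(min_edit_distance("play", "stay"))  # 4: two replacements at cost 2 each

An auto-correct system built on this idea would compute the distance from a misspelled word to candidate words in a vocabulary and suggest the closest ones.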

Top reviews

NM

Dec 12, 2020

A truly great course; it focuses on the details you need, at a good pace, building up the foundations needed before relying more heavily on libraries and abstractions (which I assume will follow).

HS

Dec 2, 2020

A neatly organized course introducing students to the basics of processing text data, learning word embeddings and, most importantly, how to interpret those embeddings. Great job!!


294 Reviews for Natural Language Processing with Probabilistic Models

By Kabakov B

Sep 6, 2020

It is the worst course on deeplearning.ai ever. It is too simple for those who have already taken the DL specialization and too difficult for newcomers. The 'lectures' are too superficial and you will barely understand a thing. But the tasks are huge -- a lot of spaghetti code with several levels of nested IFs, with constructions like A[i][j:k][l+1]. You will spend your time writing a bad implementation of things that have been implemented 100K times already, and it will not give you any insight into how they work because of the lack of theory. And nobody will teach you to use standard tools on simple and understandable examples.

By sukanya n

Jul 21, 2020

The Viterbi algorithm could be explained better, and Week 4 seemed very rushed, with lots of details just glossed over. The Week 4 assignment seemed pretty easy compared to previous weeks.

By Gabriel T P C

Aug 3, 2020

The lessons are shallow, the exercises too repetitive.

By Dan C

Jul 8, 2020

Lots of quality-control issues. Using paid customers as proofreaders is tacky.

By Oleh S

Aug 3, 2020

This course is more mature than the first one. The materials are very interesting and provide nice intuition for the probabilistic models. One can study the basics of auto-correction, Markov and Hidden Markov Models, as well as N-gram models and a very important approach, Word2Vec, which is an essential part of modern deep learning algorithms. I really enjoyed this part.

However, there are some minor suggestions:

1. Lectures could be longer -- it would help provide more depth in the materials on both the math and code sides. I know that this is a simplified version of a real academic course, but in order to increase the quality you should consider increasing the duration;

2. Programming assignments are not balanced and there are still some minor ambiguities. For instance, the first and HMM assignments are tough, whereas the last one is a piece of cake.

3. The course could be enhanced with an additional part dedicated to probability theory, maybe a few more lectures.

I recommend this course to everyone interested in NLP. Note that you should read and study the additional resources to reinforce your knowledge; this is just the starting point for a good researcher. Keep going, guys!

By Manik S

Aug 13, 2020

Although the content is great, the way of teaching is lacking relative to how Andrew teaches in the Deep Learning Specialization. More interactive teaching with a pen tablet would be more engaging. The whole course feels like a recital of the slides, and the instructor's voice is a little irritating to listen to over longer durations. Otherwise the course provides a lot of learning if you can bear it.

By Greg D

Dec 27, 2020

The lecture videos are slow and shallow, with little focus on building intuition. It's similar with the assignments: instead of relying on existing libraries (which are popular for a reason), they painfully go through implementing things in detail, which doesn't really help you in any way later on.

I 100% recommend saving your time and money (and the sanity wasted on meticulously hand-rolling things) and doing something else instead.

By ES

Jul 7, 2020

Homework is too easy. The answers are pretty much given to us.

By Dimitry I

Apr 14, 2021

Very superficial course, just like the rest of the specialization. The quizzes and assignments are a joke. I didn't want to give negative feedback at first, but now that I am doing course #4 in the specialization, which covers material I don't know much about (Attention), I've realized how bad these courses are. Very sad.

By Zhendong W

Jul 10, 2020

A great course indeed! However, it would be even nicer to have the lecture videos at a slower pace, maybe going through the examples in more detail. Sometimes the jump from theory to examples felt too quick.

By Saurabh K

Jul 14, 2020

I had a wonderful experience. Try not to look at the hints and solve things yourself; it is an excellent course for getting in-depth knowledge of how the black boxes work. Happy learning.

By Mark M

Jul 19, 2020

This second course, like the first, feels like a first- or second-year university course. Sometimes the explanations are weak or missing. There was no explanation of why the Viterbi algorithm works, and no explanation of how to decide which embedding extraction method (W1 columns, W2 rows, or the average of the two) to use. There seemed to be little or no TA support; many people were posting questions and not receiving answers from TAs. I posted the mistakes I identified in the course content, but I don't think anyone is going to act on them. It would have been good if the last exercise were repeated in TensorFlow, and also good to actually use the embeddings for something in the last exercise. From the PCA graph, the embeddings looked pretty poor.

By Laurence G

Mar 16, 2021

The material covered provides a good tour of probabilistic language models; however, the course needs work. Some issues: excessive reading off of mathematical formulas without providing the intuition behind them; the section on Viterbi was awful; and a large chunk of week 4 could be replaced with a single block of PyTorch/TensorFlow with a note saying: "For more detail, go take the deep learning course."

By Slava S

Jan 11, 2022

I was actually unpleasantly surprised by this course. Having completed the whole DL specialization, I was used to a certain quality of courses there. This one, however, has a lot of bugs (for example, when I finished week 1, the quiz was, or maybe still is, from week 4), and the quizzes are just repetitions of questions from the lectures (even the answers are the same).

Also, every week except week 4 is more about programming in Python than about NLP. Even the last week's assignment is more about writing basic backprop for a simple shallow network than about working with embeddings.

The assignments are too easy. Splitting every topic into 5-minute videos makes them easier to watch, but I think this format does not allow for much detail on a topic, so in the end I feel this course is too shallow for a 4-week course.

By John A J

Sep 25, 2020

It was a good course for an introduction to AutoCorrect, AutoComplete, and creating your own word embeddings. However, I feel that the instructor focused too much on implementation details. The concepts of why the pioneers were able to formulate the solutions, and the train of thought behind the different algorithms, are lost. Although it taught me a little bit of implementation, for me the implementation is just the cherry on top, as these things can easily be googled. It would have had a better impact if it also taught the concepts/thinking behind these algorithms so that I could reuse the underlying ideas. Overall, it is a good course to get started with.

By Andreas B

Oct 4, 2020

Too many autograder issues. For instance, in week 4, even if all the code is correct, you get spurious error messages claiming the results are of a completely incorrect type. There are also some minor maths errors, and deeper insights concerning the maths and motivations are missing.

By Simon P

Nov 27, 2020

Simply, it's not great.

The assignments are long and complex, with insufficient checks to debug when there's an error. The theory is poorly explained in both the videos and the labs. They clearly do not know who this course is aimed at. Is it software engineers who want to better understand NLP? In that case they may find the assignments easy but the content lacking. Is it people with a basic understanding of NLP who want to take it further? In that case they will not get that, given that the concepts are only briefly discussed. Is it a general introduction to NLP? In that case the coding aspect is pitched too high; you have to be familiar with all the little Python tricks the authors know and think the same way they do. This leads to a frustrating experience.

Hovering the cursor over the names of contributors in the discussion forums makes it clear that most of the people who start this course never finish it. This level of attrition reflects poorly on the course creators.

By P G

Oct 26, 2021

This course is unfortunately a waste of time. The lectures could be compressed into a 60-minute video on the basics of NLP with probabilistic models and uploaded to YouTube. You will initially feel like you learned a lot of things, yes, but you will quickly forget that knowledge, as everything is rushed and touched on only superficially and you won't develop a solid understanding.

Also, the videos should be revamped and recorded by someone a bit better at delivering lectures. The lecturer reads a script in a monotone voice and doesn't engage the learner. It feels like sitting through a boring slideshow at work rather than learning SOTA AI stuff from the world's leading tech institution...

This is just my personal opinion, perhaps it will work for you.

By François D

Jul 17, 2020

Great teacher, good pace in lectures and assignments. There are of course some redundancies with respect to the previous specializations, but it's nice to feel that you understand the content a bit better every time. I didn't find the forums (internal & Slack) very useful; they could be better structured. Can't wait for the next 2 courses.

By Manzoor A

Aug 20, 2020

Excellent! I know this course is just the beginning of my NLP journey, but I couldn't expect more than this. The ungraded labs are very useful for practicing and then applying what you learn to the assignments. I am giving 5 stars because there are only 5.

By Sohail Z

Aug 19, 2020

Brilliant course!!!! I love every aspect of it. I am really grateful to the deeplearning.ai team for such amazing courses. They are easy to digest and provide sufficient math knowledge to understand the models.

By Alan K F G

Aug 21, 2020

Professor Younes really makes it easy for me to follow the lectures and stay focused. The structure of the course helped me a lot, constantly reviewing the same concepts as I went further in order to learn new things.

By Sazzadur R

Aug 4, 2021

Another great course, introducing probabilistic modelling concepts and slowly moving in the direction of neural networks. One must learn in detail how embeddings work.

By Aniruddha S H

Sep 29, 2020

Very good course! It helped me clearly learn about autocorrect, edit distance, Markov chains, n-grams, perplexity, backoff, interpolation, word embeddings, and CBOW. This was very helpful!

By Kritika M

Aug 10, 2020

This course is great. Actually the NLP specialization so far has been really good. The lectures are short and interesting and you get a good grasp on the concepts.