
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4
stars
1,024 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into Portuguese using an encoder-decoder attention model, b) build a Transformer model to summarize text, and c) use T5 and BERT models to perform question-answering. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text. Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course. This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
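For context on what the reviews below are discussing: the encoder-decoder and Transformer models covered in the course are built around scaled dot-product attention. The following NumPy sketch is not from the course materials (which use Trax); it is a minimal, illustrative implementation with made-up array shapes:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Illustrative sketch of softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V                            # weighted sum of value vectors

# Toy example: 2 queries attending over 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

When all keys are identical, every query attends uniformly and the output is simply the mean of the value vectors, which is a quick sanity check on the softmax weighting.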

Top reviews

JH

Oct 4, 2020

Could the instructors perhaps make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

LL

Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the complete projects can be downloaded easily.

51 - 75 of 251 Reviews for Natural Language Processing with Attention Models

By Rukang X

Sep 24, 2021

It would be better to use TensorFlow to implement these cutting-edge algorithms, given its popularity in both academia and industry.

By Amit J

Jan 2, 2021

Though the content is extremely good and cutting edge, the course presentation/instructor hasn't done justice to the material. [1] Teaching concepts through assignments (and not covering them in detail in lectures) is an absolutely bad idea. [2] Lecture instructions are ambiguous and immature at times. [3] That the instructor is an excellent engineer but a poor teacher is evident from the style of presentation. [4] If input/output dimensions had been mentioned at every boundary in the network illustrations, it would have made a big difference to the speed of understanding, without having to hunt through offline material and papers. [5] Using Trax is, I think, not a good idea for this course. The documentation is close to non-existent, a lot of details of the functions are hidden, and the only way to understand them is to read the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.

Overall a disappointment, given the quality of other courses available on Coursera.

By Laurence G

Apr 11, 2021

Pros: Good choice of content coverage. Provides a historical overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.

Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics, then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful - find other resources if this is a problem.

Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.

By Christine D

Jan 22, 2021

Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.

The practical stuff, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to make your own stuff; you just fill in things like "temperature=temperature" or "counter += 1".

I preferred and recommend the first two courses in this NLP-specialization.

By Thomas H

May 21, 2021

While the course succeeds in getting the most important points across, the quality of both the video lectures and the assignments is rather disappointing. The more detailed intricacies of attention and transformer models are explained poorly without providing any intuition on why these models are structured the way they are. Especially the lectures on current state-of-the-art models like BERT, GPT and T5 were all over the place and didn't explain these models well at all.

By Eymard P

Jul 21, 2022

The course is okay but, to be fair, nothing compared to what Andrew Ng has done. The explanations are too vague; I feel a lot of details are missing. I now have a basic understanding of transformers, but it is pretty shallow. The assignments are too mechanical: I understood things locally, but not much globally.

The bottom line is that this course is an okay introduction to Attention and Transformers, but you'll need to work on the side to refine the knowledge...

By Junhui H

Nov 15, 2022

Course four is way more advanced than the previous three courses. If you are not very familiar with TensorFlow or the architecture of deep learning models, it will be a bit hard to keep up with the content. Also, the videos do not cover enough detail, and sometimes it is difficult to understand the ungraded notebooks.

That said, I can see the course is well prepared, and if you have enough knowledge in deep learning, it will still be quite useful for you.

By Zhuo Q L

Jul 4, 2021

It is exciting to learn about the state-of-the-art approaches for NLP, but as the last course of the specialization, one can feel that the quality and level of detail of the explanations dropped significantly. I like how the course introduces useful things like SentencePiece, BPE, and interesting applications, but some of them felt abrupt and weren't elaborated.

By Dan H

Apr 5, 2021

Pros: Good selection of state of the art models (as of 2020). Also great lab exercises.

Cons: The video lectures and readings are not very helpful. Explanations of the trickier parts of the models and training processes are vague and ambiguous (and sometimes kind of wrong?). You can find more detailed and easier-to-understand lectures on YouTube.

By dmin d

Jan 7, 2021

Have to say, the instructor didn't explain the concepts well. A lot of the explanations don't make sense, or just give the final logic and skip all the details. I needed to search YouTube or Google to understand the details and concepts.

But it covers state-of-the-art models for NLP. It's a good starting point and helped save time.

By Oleksandr P

Apr 4, 2021

Although this course gives you an understanding of cutting-edge NLP models, it lacks detail. It is hard to understand the structure of a complex NLP model from a few-minute video. This course should give step-by-step explanations, either in a larger number of lectures or in longer ones.

By martin k

Apr 26, 2021

Low-quality programming assignments, but considering the price it's good overall.

By Randall K

Jun 14, 2021

In the previous three courses, the HW was a natural extension of the lectures and provided solid reinforcement of the course material. In this course, however, I found the lectures did not prepare me for the HW. Furthermore, the lectures were too terse, often incoherent, and the homework introduced new concepts that were not discussed in the lectures. Also, the code in the labs was poorly organized and lacked a consistent, coherent style across assignments and even previous courses, which made it difficult to follow the logic. I often spent a lot of time sorting out tensor indexing issues, which is very difficult in Jupyter without a debugger.

By Greg D

Dec 31, 2020

Even though this is better than the other three courses in the specialization, it's not really any different from reading a few posts on popular machine learning blogs about the technologies presented here. I would understand if the instructors brought some insights, but it's largely just repeating what is in the slides, which in turn is just the bare minimum about how to make these concepts work (which, again, can be found through papers and free resources).

Overall, I would recommend against taking this course since there are better or equal materials available.

By DAVIDE M

Mar 9, 2022

This course is good if you want to be theoretically solid with Transformer models; I mean, now I can explain those concepts to my colleagues or peers. It is lacking in the practical parts: a lot of the exercises are too guided, and there is no project that you can show off. The Hugging Face part is the most interesting for practicing, but there are only a few lessons. In the end, do not expect to make a chatbot in week four; it is "just" a model that generates dialogue between two people.

By Tianpei X

Nov 1, 2020

The homework is way too simplified, especially in weeks 3 and 4. My impression is that the ungraded labs were actually the real homework but were set aside to allow more people to pass. That is not a good compromise.

By George G

Dec 6, 2020

Week 1 jumps into material that is better explained in Week 2. Attention deserves a more gradual and deeper explanation. Weeks 3 and 4 cover a lot of ground without going into depth.

By Gary L

Oct 20, 2020

Disappointed. Course 4 is much more difficult to follow than the other courses in this NLP Specialization, as well as other deeplearning.ai courses.

By Omar H

Jan 16, 2021

The course topics are great, but it could be much better by explaining the topics in much more detail and providing more examples.

By Sharad C

Apr 25, 2021

Probably one of the worst courses I have ever taken. By week 3 I am completely lost and can't make head or tail of the content. Fine-tuning the model? My foot! The videos seem mugged up, and the Colab material doesn't work; I can't see any fine-tuning at all.

All in all, a horrible course which fails on all fronts of learning.

This has been a glorious level of time sink. I will issue a chargeback from my credit card.

By Rajeev R

Apr 26, 2021

Course content was not educational and the assignments only evaluated python skills rather than DL knowledge.

By Vadim P

Dec 26, 2021

Andrew's AI course is as good as this one is bad. Poorly presented, useless material.

By Weizhi D

May 31, 2021

This course sucks. The instructor cannot express concepts clearly. Don't take this course.

By Rabin A

Apr 19, 2021

The course was pretty good. It introduced me to the state-of-the-art algorithms and techniques needed to have a sound understanding of NLP. One thing I didn't like about the teaching method in the whole specialization is that Younes was the one teaching the course content to us, but Łukasz spoke as if it were he giving some of the lectures, although we could clearly tell it was Younes from the voice. Thanks especially to Younes for doing all the hard work for the specialization. You deserve five stars.

By Dustin Z

Dec 17, 2020

A very good and detailed course. Definitely the most challenging course I have taken from DL.ai. It gives a good overview of Transformers, the current cutting edge of NLP models. It also provides great insight into Trax, Google Brain's ML framework, which was helpful in understanding how deep learning frameworks are built. One of the teachers is one of the authors of Trax!