
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars · 1,024 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) Translate complete English sentences into Portuguese using an encoder-decoder attention model, b) Build a Transformer model to summarize text, c) Use T5 and BERT models to perform question-answering. By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, and created tools to translate languages and summarize text!

Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top reviews

JH

Oct 4, 2020

Can the instructors maybe make a video explaining the ungraded lab? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

LL

Jun 22, 2021

This course is brilliant; it talks about SOTA models such as the Transformer and BERT. It would be better to have a Capstone Project. Also, the entire projects can be downloaded easily.


201 - 225 of 251 Reviews for Natural Language Processing with Attention Models

By Audrey B

Jan 4, 2022

Great content, although the focus is definitely more on the attention mechanisms and on the Transformer architecture than on the applications themselves. Still really enjoyed it and I now feel like I have a better grasp of Transfer Learning and its associated methods. Content is very clear and well explained

By Ankit K S

Nov 30, 2020

This is really an interesting specialization with lots of things to learn in the domain of NLP, ranging from basic to advanced concepts. It covers the state-of-the-art Transformer architecture in great detail. The only thing I felt uncomfortable with was the use of the Trax library in the assignments.

By Vijay A

Nov 20, 2020

Covers the state of the art in NLP! We get an overview and a basic understanding of designing and using attention models. Each week deserves to be a course in itself - they could actually have designed a specialization on attention-based models so that we get to learn and understand better.

By Alexandre B

May 20, 2023

This course is quite complete, as it presents the major hot NLP tasks with transformers, but unfortunately it presents only one framework, Trax, and not Hugging Face's, which is also really useful and widely used in the field. I would have liked a lesson about ChatGPT-like models.

By Naman B

Apr 28, 2021

It would have been better if we used standard frameworks like PyTorch instead of Trax. Also, the course videos are a bit confusing at times. It would have been great if the math had been taught the way Andrew Ng taught it in the Deep Learning course.

By Tom W

Oct 9, 2024

A good introduction to the transformer model. Some of the exercises felt a bit more led (more code written by the instructor) than in earlier courses - this may be to do with the volume of material necessary to cover transformers.

By Cees R

Nov 29, 2020

Not new to NLP, I enjoyed this course and learned things I didn't know before. From an educational perspective, I didn't like that the two "optional" exercises were way harder than the too easy "fill in x here" assignment.

By Zicong M

Dec 14, 2020

Overall good quality, but it seems a bit short and the content is squeezed.

I don't like the push toward Trax either; it has yet to become mainstream, and personally I don't find it helpful for my professional career.

By Jonas B

Apr 11, 2023

A good course that I can recommend without a doubt. I would strongly recommend complementing it by reading the additional resources (see week 4 -> "References") as well as the Hugging Face tutorial for NLP.

By Gonzalo A M

Jan 21, 2021

I think we could have gone deeper in this last course, because you taught a lot of complex concepts but I did not feel confident enough to replicate them. It would have been better to explain transformers in more detail.

By Cornel M

Jun 18, 2023

The lectures need more insight, conveying not only the 'how' but a reasonable amount of the 'why', too. Andrew is very good at doing this in his lectures, providing his intuitions and insights.

By Vishwam G

Mar 10, 2024

It could have been better if the Transformers library from Hugging Face were explored more, and if topics like Vision Transformers and the use of Transformers for computer vision were covered.

By CLAUDIA R R

Sep 7, 2021

It's a great course, more difficult than I thought, but very well structured and explained. That said, more didactic free videos from other websites can complement the lessons.

By Anonymous T

Oct 15, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.

By Qiao D

Nov 4, 2022

The content is great, but it would be even better if we gained a more in-depth understanding of the material rather than a very quick crash course.

By Moustafa S

Oct 3, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would've preferred if it was on TF maybe, but still, great job.

By Mohan N

Mar 28, 2021

The course covers cutting edge content and the exercises are well paced. Found the transformer lessons a bit difficult to understand.

By Rahul J

Sep 29, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand; examples might have helped!

By veera s

Mar 18, 2022

Need more detailed explanations in the last course of this specialization, especially of the attention and BERT models.

By Chip G

Nov 7, 2024

This is the toughest AI course I have taken. I hope your Python coding skills are better than mine.

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn about and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience throughout, covering state-of-the-art NLP models.

By Roger K

May 17, 2022

Labs required a bit more context to understand.

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

Great explanation of the topic, I guess!