Learner Reviews & Feedback for Sequence Models by DeepLearning.AI

4.8 stars · 30,411 ratings

About the Course

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications, such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings; and use HuggingFace tokenizers and transformer models to solve NLP tasks such as NER and Question Answering.

The Deep Learning Specialization is a foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway to take the definitive step into the world of AI by helping you gain the knowledge and skills to level up your career.
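The description above mentions using HuggingFace tokenizers and transformer models for tasks such as NER and question answering. As a rough, minimal sketch of what that workflow looks like (this is not course material; it assumes the transformers library is installed along with a backend such as PyTorch, and uses the library's default pretrained pipeline models, which are downloaded on first use):

# Hedged sketch: HuggingFace pipelines for NER and question answering.
# Assumes `pip install transformers` plus a backend such as PyTorch;
# the default pretrained models and their exact outputs are not specified
# by the course description.
from transformers import pipeline

# Named entity recognition: groups word pieces into whole entities.
ner = pipeline("ner", aggregation_strategy="simple")
print(ner("Andrew Ng founded DeepLearning.AI in Palo Alto."))

# Extractive question answering: finds an answer span inside the context.
qa = pipeline("question-answering")
print(qa(question="What does the fifth course cover?",
         context="The fifth course of the Deep Learning Specialization covers sequence models."))

Both pipelines wrap a pretrained tokenizer and transformer model behind a single call.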

Top reviews

JY

Oct 29, 2018

The lectures cover lots of state-of-the-art deep learning algorithms, and they are well designed and easy to understand. The programming assignments really help reinforce the understanding of the lectures.

WK

Mar 13, 2018

I was really happy because I could learn deep learning from Andrew Ng.

The lectures were fantastic and amazing.

I was able to grasp the really important concepts of sequence models.

Thanks a lot!


Reviews 3,626 - 3,650 of 3,697 for Sequence Models

By Rahul T

Aug 9, 2020

The programming exercises were very confusing.

By Ritesh R A

Feb 2, 2020

The course should have been more descriptive.

By Moha F T

Aug 28, 2021

It's not bad, but it's also not perfect.

By Alfonso C

Mar 27, 2023

I found the coding exercises not useful.

By Liang Y

Feb 10, 2019

Too many errors in the assignments

By guzhenghong

Nov 17, 2020

The mathematical content is very light.

By Julien R

May 25, 2020

The second week was hard to follow.

By stdo

Sep 27, 2019

So many errors need to be fixed.

By ARUN M

Feb 6, 2019

Very tough for beginners.

By Wynne E

Mar 14, 2018

Keras is a ball-ache.

By Ehsan G

Sep 10, 2023

Amazing experience

By Monhanmod K

Mar 17, 2019

Too hard.

By CARLOS G G

Jul 26, 2018

Good.

By Debayan C

Aug 23, 2019

As a course, I think this was way too fast and way too assumptive. I wish the instruction were a bit slower, that we broke down more of how BiLSTMs are designed and how they work, and that there were simpler programming exercises. As a whole, I think one full week of material is missing from this course, one that would concentrate on basic RNN building for GRUs and LSTMs before moving on to applications. I usually do not review these courses because they are pretty standard, but this course left me wanting, and I will consult YouTube and free repos to learn the material better. I did not gain confidence in my understanding. I barely scraped through the assignments after group study and consulting people who know this stuff, which I believe defeats the purpose of the course: it is supposed to give me concrete understanding and the ability to build these models, not lead me to consult others to clear up my doubts.

By Ian B

Oct 15, 2023

The lectures are pretty good up to Week 4, when the Transformer architecture is thrown at us way too fast; the increase in complexity is abrupt and huge. The programming assignments are far less helpful. I didn't really have to engage with the logic of the various deep learning models involved or understand the bigger picture of how the code works, because the starter code takes care of all that. All I really needed to do was type the obvious functions into the fill-in-the-blank spots and then fiddle with their arguments by trial and error until the error messages went away.

By 象道

Sep 16, 2019

I really learned some ideas about recurrent neural nets from this course, but the assignments are not completely ready for learners and are full of mistakes that have existed for more than a year. Those mistakes mislead learners quite a bit unless they study some discussion threads on the forum. This course has the lowest quality among all of Dr. Andrew Ng's. Until an updated version arrives, a learner had better have a look at the assignment discussion forum before starting the assignments.

By Luke J

Mar 31, 2021

The material really is great, but work needs to be done to improve the assignments, specifically submission and grading. On the last assignment I spent far more time troubleshooting the grader than working on the content of the assignment itself. It can be very frustrating to have to do this on a MOOC where no human support is available. Based on the discussion forum, it appears that for this assignment in particular this has been a problem for a very long time.

By Pakpoom S

Dec 29, 2023

Weeks 1 and 2 are good. I don't understand Week 3, in the sense that most of its content is about beam search, yet we get no exercise about it; instead, we get speech data processing. I don't like this. Week 4 is worse. I don't even know how to put it into words. I think you simplify things too much. The instructor should put more detail into showing the shapes of the matrices. I still don't know the output shape of MultiHeadAttention.
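On the reviewer's question about the output shape: if the assignment uses tf.keras.layers.MultiHeadAttention (a common choice in TensorFlow Transformer implementations; which layer the course actually uses is an assumption here), the output keeps the query's sequence length and, by default, the query's feature dimension. A minimal sketch with arbitrary sizes:

# Hedged sketch (not from the course): the default output shape of
# tf.keras.layers.MultiHeadAttention is (batch, query_len, query_feature_dim).
import tensorflow as tf

batch, q_len, kv_len, d_model = 2, 10, 16, 64   # arbitrary illustrative sizes
query = tf.random.normal((batch, q_len, d_model))
value = tf.random.normal((batch, kv_len, d_model))

mha = tf.keras.layers.MultiHeadAttention(num_heads=8, key_dim=d_model // 8)
output = mha(query=query, value=value)          # key defaults to value

print(output.shape)  # (2, 10, 64): query length and feature dim are preserved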

By daniele r

Jul 15, 2019

The subject is fascinating and the instructor is undoubtedly competent, but there is a strong feeling of lower quality with respect to the other four courses in the specialization (in particular the first three). Many things in this course are only hinted at, without much detail; many things are just stated but not really explained. There are many recording errors as well. Maybe another week could have helped give a little more depth to the subject.

By Amir M

Sep 2, 2018

Although the course lectures are great, as are all the lectures in this specialization, some of the assignments have rough edges that need to be smoothed out. It is particularly frustrating for those trying to work on the optional/ungraded programming assignment sections that have some incorrect comparison values, as much time will be wasted trying to figure out the source of the error.

By David S

Dec 19, 2020

Excellent lectures, terrible exercise material. E.g. "You're implementing how to train a model! But we've done the actual training for you already! Your exercise is to add numbers A and B! Number A is 4. Number B is 11! Enter A + B in the box below!" Also, someone did a search-and-replace and converted every sentence into an individual bullet point to reduce readability.

By Sergio F

May 16, 2019

Unfortunately, this course is the least valuable in the specialization. The programming assignments are very interesting, but there is no introduction to Keras, and forum support has been vital to passing them. I also found the lectures unclear, to the point that you have to google around for more resources to catch some concepts. Unfortunately, I could not recommend this course.

By Guruprasad K

Mar 9, 2022

Compared to the other courses in the specialization, this one appears to lack the depth and clarity one could expect. LSTMs and GRUs are somewhat outdated now, given the speed of innovation in the field, and Transformers are here to stay (for now). Unfortunately, Transformers are very poorly covered.

By Peter B

Feb 20, 2018

Getting the input parameters correct for the Keras assignments is on par with the satisfaction of dropping a ring, a contact lens, or some other expensive object into the sink and spending an hour looking for it inside the disassembled pipes, through built-up hair debris and moldy dirt.

By SARAVANAN N

Mar 19, 2018

Overall a great course; thanks to Andrew Ng for his great explanations. But the support is very bad: I faced many issues submitting the assignments due to technical problems (the notebook not saving), and there was no dedicated resource to help me. I spent a lot of time resolving them myself.