
Learner Reviews & Feedback for Sequence Models by DeepLearning.AI

4.8
stars
30,203 ratings

About the Course

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more. By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly-used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and Question Answering. The Deep Learning Specialization is a foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to take the definitive step in the world of AI by helping you gain the knowledge and skills to level up your career....

Top reviews

AM


Excellent course! This course extensively covers all of the relevant areas of NLP, with a strong practical element allowing you to apply deep learning for sequence models in real-world scenarios.

GS


So many possibilities will open up in front of you after this course. The only limit is the boundary of your imagination and creativity; that is how I feel now upon completing this course.


2801 - 2825 of 3,671 Reviews for Sequence Models

By wangdawei

•

Mar 30, 2018

Great!

By Mathias S

•

Apr 22, 2018

The Sequence Models course was the one I sought out in the Deep Learning Specialization. Very interesting assignments, e.g. neural machine translation, music composition, etc. - much more interesting than the convolutional network models, in my opinion. However, it is also much more difficult to follow; probably the most difficult of the five courses.

Prof. Ng did a wonderful job delivering the materials, as always. However, I expected a lot more detail about sequence models and recurrent networks, as much as was given in the previous courses. I was looking forward to learning about these models in more depth, but I didn't feel I got all that I wanted. For example, I wish there were a step-by-step walkthrough of the backpropagation through time (BPTT) algorithm, especially for the LSTM and GRU models.
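As a rough illustration of the kind of walkthrough this reviewer is asking for, here is a minimal NumPy sketch of BPTT for a vanilla RNN (not the LSTM/GRU case, and not from the course materials); the toy loss, shapes, and function names are all illustrative, and the analytic gradient is checked against a finite-difference estimate:

```python
import numpy as np

def rnn_forward(xs, Wx, Wh, b, h0):
    """Run a vanilla RNN over the sequence xs, returning all hidden states."""
    hs = [h0]
    for x in xs:
        hs.append(np.tanh(Wx @ x + Wh @ hs[-1] + b))
    return hs

def rnn_bptt(xs, Wx, Wh, b, h0):
    """Gradient of the toy loss L = h_T.sum() w.r.t. Wh, via BPTT."""
    hs = rnn_forward(xs, Wx, Wh, b, h0)
    dWh = np.zeros_like(Wh)
    dh = np.ones_like(h0)                 # dL/dh_T for L = h_T.sum()
    for t in reversed(range(len(xs))):    # walk backwards through time
        dz = dh * (1.0 - hs[t + 1] ** 2)  # backprop through tanh
        dWh += np.outer(dz, hs[t])        # each timestep contributes to dWh
        dh = Wh.T @ dz                    # pass gradient on to h_{t-1}
    return dWh

# Sanity-check the analytic gradient against a finite difference.
rng = np.random.default_rng(0)
H, D, T = 3, 2, 4
Wx = 0.5 * rng.standard_normal((H, D))
Wh = 0.5 * rng.standard_normal((H, H))
b = 0.1 * rng.standard_normal(H)
h0 = np.zeros(H)
xs = [rng.standard_normal(D) for _ in range(T)]

dWh = rnn_bptt(xs, Wx, Wh, b, h0)
eps = 1e-6
Wh_pert = Wh.copy()
Wh_pert[0, 1] += eps
numeric = (rnn_forward(xs, Wx, Wh_pert, b, h0)[-1].sum()
           - rnn_forward(xs, Wx, Wh, b, h0)[-1].sum()) / eps
print(abs(numeric - dWh[0, 1]) < 1e-4)  # the two gradients agree
```

The key point the lectures state but never walk through line by line is the backward loop: the gradient flowing into each hidden state is multiplied by the tanh derivative, accumulated into the shared weight matrix, and then propagated one step further back through `Wh.T`.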

The assignments were a little more difficult to follow, I think. The instructions were not as clear as in the previous courses (in my opinion), especially when using Keras objects/layers - "use this *object/layer*" - but it wasn't clear whether or not to fiddle with the arguments. Usually when a specific value is required for an argument (e.g. axis=x), it is mentioned either in the text or in code comments. I guess it's a good challenge, but I found myself doing more trial and error with the coding to get it to work instead of having some guidance on how to use those Keras objects/layers. The discussion forums do help, however. Lastly, for some of the assignments that involved building a recurrent model using Keras layers, I felt there was not enough explanation of why a particular architecture, set of layers, or hyperparameter values were chosen.

Overall, I liked the course, I learned a lot from it, and I enjoyed the models we got to play with in the assignments. I think I will still run into problems trying to devise my own sequence models and fumble with Keras. I wish there were a more in-depth course on sequence models. Prof. Ng's delivery was excellent; I enjoyed listening to every one of his lectures (even at 2x speed) :)

Thank you to Prof. Ng, and all the people who worked hard to develop the course.

By D. R

•

Oct 1, 2019

(09/2019)

Overall the courses in the specialization are great and provide a great introduction to these topics, as well as practical experience. Many topics are explained clearly, with valuable insight from field practitioners, and you are given quizzes and code exercises that help deepen your understanding of how to implement the concepts from the videos. I would recommend taking them after the initial Andrew Ng ML course by Stanford, unless you have prior background in this topic.

There are a few shortcomings:

1 - the video editing is poor and sloppy. It's not too bad, but it can sometimes be a bit annoying.

2 - most of the exercises are too easy and are almost copy-paste. I need to go over them and create variations of them in order to strengthen my practical skills. Some exercises are quite challenging though (especially in courses 4 and 5), and I need to go over them just to really nail them down, as things scale up quickly. Course 3 has no exercises, as it's more theoretical. Some exercises have bugs - so make sure to look at the discussion board for tips (the final exercise has a huge bug that was super annoying).

3 - there are no summary readings - you have to (re)watch the videos in order to check something, which is annoying. This is partially solved because the exercises themselves usually hold a lot of (textual) summary, with equations.

4 - the 3rd course was a bit less interesting in my opinion, but I did learn some stuff from it. So in the end it’s worth it.

5 - Slide graphics and Andrew's handwriting could be improved.

6 - the online Coursera Jupyter notebook environment was a bit slow, and sometimes got stuck.

Again overall - highly recommended

By Kayne A D

•

Mar 6, 2020

Note: this review also applies to the specialization as a whole. I thoroughly enjoyed the courses and learnt so much. The content delivery was excellent. I am not sure how feasible it would have been to re-produce the videos, though it would have been nice to see fewer corrections. In regards to the programming assignments, I think they are great overall. The way the code was mostly pre-filled to ensure a logical workflow helped me to stay on track rather than trying to skip straight to the final model (I assume beginner programmers do this a lot). It provided a great template for thought and I will definitely be referring back regularly. However, I also think that descriptions and exercises could be integrated better, especially in the later assignments where students are compiling more complex models and prior programming skills (from earlier courses) are expected. Specifically, there are certain exercise descriptions with multiple parts (1a, b, c; 2a, b, c; etc.) and hints before you even get to the start of the actual implementation. I think this is inefficient and was a bit frustrating at times. It felt a bit disjointed and overwhelming to read the instructions without context (i.e. reading about step 2c before implementing step 1a). On the whole, I know that the knowledge, understanding and skills I obtained through this specialization will serve me extremely well throughout my PhD. Thank you very much.

By Gary G

•

Mar 5, 2018

I enjoyed Prof Ng's excellent lectures, but felt the material moved too quickly. This 3-week course could easily have been extended to (say) 5 weeks to allow for more depth in covering the various RNNs, applications and model details.

The homework/programming assignments were more difficult and time-consuming than in prior courses, particularly for implementing models with Keras. The structure of these programs was hard to understand (a bit of spaghetti code, in my opinion). Some experience with Keras and TensorFlow is essential. I spent a lot of time just trying to construct the programs with correct syntax, etc. While this is useful to know, I'd rather focus more on the fundamentals of the learning algorithms.

However, it's clear that a lot of effort went into constructing the programming exercises for this course, and they covered a lot of ground, with a bit more sophistication than the exercises from most of the earlier courses.

By Sidharth W

•

Dec 16, 2018

While I loved listening to Andrew Ng's lectures, and I find him very lucid in his presentation and pedagogy, I feel that the practical aspect has suffered - by giving so many hints on how to solve the programming exercises, the challenge is reduced. There were also quite a few issues I faced when connecting to the server, which resulted in rewriting the code a couple of times (in hindsight, I should have always made a local copy and tested it before submitting). I would rather that each of these courses became a two-month course (much like Stanford's convolutional networks course) so that the practical aspect is given equal weight. While presenting the lectures, Prof. Andrew Ng could also lay out how you would implement things in a particular framework like TensorFlow, and there should be enough exercises that walk a person through them before attempting the programming exercises.

By Ken K

•

Oct 6, 2019

This course has great material on sequence models, presented with the usual energy and enthusiasm that Andrew Ng brings to every course. The model diagrams are great for visualizing what's actually going on in the complex assignments, and the assignments are generally designed with 1) additional code and commentary to make the examples informative and 2) "guard rails" (e.g., insert code here, with related hints) to clarify the specific lines to edit. Having said that, I feel that the assignments were a little less polished/refined relative to earlier courses in the concentration, and I spent significantly more time in the discussion forums than I had in prior courses. I also recommend investing in additional training in Keras and TensorFlow as a prerequisite or in parallel to this course to help get the most out of the practical applications of the material.

By Damianos L G

•

Apr 12, 2021

As I progressed through the specialization I liked the content less, for the following three reasons. 1) With the exception of the first course there are no in-video questions, which would help you learn and exercise on the spot. There could easily have been one question per 3-4 minutes of video content. 2) The programming exercises, although useful, became more disconnected from the lectures as the specialization progressed - i.e. they focused on Keras documentation, which could have been accompanied by a lecture video dedicated to that goal (as in the second course with TensorFlow). The result was that I just tried to get through the programming exercises in courses 4 and especially 5 without understanding much of what I was doing. 3) There were too many errors in the lectures. Overall a good specialization, but at least points 1 and 2 should be fixed.

By Victoria D

•

Jan 10, 2020

This was a really interesting course. Too bad the details of the math are not really there.

That being said, its greatest redeeming factor is that Andrew cites the research papers, and with his overviews of the various models, I can read those papers, and build up my own library of relevant material.

I don't really care for Jupyter notebooks after all... I much prefer the Spyder IDE, as it has IntelliSense, true debugging, and is not prone to crashing the kernel.

I came across some of Andrew's lecture notes for his CS courses at Stanford - now those have much more mathematical detail - perhaps Coursera can provide links to that online material? (Unless, of course, that is problematic due to copyright?)

All in all, I did enjoy the entire deeplearning.ai material as it is... the rest I can dig into myself.

By Miguel P F A F

•

Aug 25, 2019

It is a good course and a very important one. However, I needed to mark it a bit lower than most other courses in this specialization because I sometimes felt confused by Keras. Navigating at such a high level of abstraction would require stronger support for the Keras part. I believe we could have explored sequence models a bit further, and yet I was sometimes struggling to understand some basic Keras instructions. Perhaps an extra programming assignment tutorial (for Keras) could be included, or the existing Keras tutorial extended.

This being the last course of the specialization, I believe not only that this course is worthwhile, but that the whole specialization is of great value. Congrats to the whole DeepLearning.AI team. Keep going.

By Sourav M

•

May 18, 2020

First of all, I would like to convey my thanks to Andrew Sir, not only for this course but for the whole specialization. You are a fantastic teacher, and I will try to pay you back by solving real-world problems with the help of the knowledge you have imparted.

The only shortcoming I can think of is the disconnect between your theory videos and the real code in Python. It would be very helpful if you could include some code snippets in your theory videos. I think this would help learners better bridge the gap between the theoretical concepts and real-life coding. Maybe some optional hands-on coding videos summarizing the week's course could be included.

Once again, thank you very much, and I will be ever grateful to you.

By Peter H

•

Mar 8, 2018

Nice course as always! I really need to thank Andrew and the team for this. It is very well structured and informative, and provides good intuition and a solid base for future self-learning in this area.

However, to earn a full 5 stars there are some things to improve (video cuts with repeated sections, occasional mistakes, long pauses). Some of the courses were also harder to get through: from the descriptions and templates it was not always certain what to do. On one hand it's good that you need to think more and reread things several times, but sometimes the grader and the 'official' output were not aligned, which resulted in hours of wasted time. I guess most of this was because it was rushed out too soon, but even so, a very good one!

By Francois T

•

Aug 9, 2020

Overall, I liked the Machine Learning Stanford class's programming assignments better than the ones in the Deep Learning Specialization. For me, coming up with a full implementation of a function (and then having it unit tested by the grader) is more conducive to learning and more entertaining than the step-by-step, line-by-line guidance we get in the Jupyter notebooks. That said, the notebooks themselves are incredibly well designed and put together. I love how Andrew Ng, beyond his stature, unmatched knowledge, and outstanding teaching skills, puts his whole heart into the work. That makes a world of difference to me and helps me do the same with others. Thank you for everything!

By Nicholas P

•

Nov 28, 2020

This was a VERY thorough overview of the machine learning architectures required to tackle a wide range of natural language processing problems. It's quite dense, and I had to watch each lecture several times and break it down into chunks to avoid getting lost, but now that I'm finished, I feel like a lot of technology has been demystified. The assignments really hold your hand and mostly just test your ability to follow instructions with even a hazy understanding of the weekly concepts, so you shouldn't expect to graduate and then immediately build a machine translation system from the ground up, but I do feel very ready to dive into technical interviews.

By David T

•

Oct 18, 2022

This was a good course, mainly on Recurrent Neural Networks (including LSTMs), Encoder/Decoder networks, and the Attention model. In the last week there is a brief overview of Transformer networks and a long exercise fleshing that out. It is an interesting combination, merging a recurrent network and a convolution-style network.

The one thing that has me leaving off the last star is that there were no lecture-style classes teaching users how to use TensorFlow, or how TensorFlow works. That was all learned 'on the programming exercises', so it is helpful to have deep experience with Python, and with using different libraries and styles of coding.

By karan

•

Apr 24, 2020

Review of the 5 courses:

Good:

Well-summarized lectures that are easy to understand. Everything is broken down into small problems, making most of the content accessible.

Interesting programming assignments, which are well structured.

Bad:

The Jupyter notebooks, where the programming assignments are done, crash often.

On rare occasions I did require extra material from YouTube or Medium to understand what was going on.

On the quizzes, formulas are not correctly rendered and I can still see the Markdown code, making it hard to read the formulas correctly.

Some technical issues in the course, but I would highly recommend it overall.

By Brad M

•

Aug 22, 2019

A very helpful and enlightening course, though it felt a bit "hand-wavy" at times. It never really felt like we were getting the full story, like I was missing something the whole time. Word embeddings cleared up a lot, but the entire course was a lot of information to digest at once. Coming from an image processing background, most of the terminology was unfamiliar, and the programming assignments weren't quite as guided as previous ones.

In the end, I think it was a great course, and I'd recommend it highly to anyone interested in the field. If you can't apply it to your work, it probably isn't as beneficial.

By cricel472

•

Oct 18, 2023

It's a lot of great content for becoming aware of all the various concepts in deep learning, and it does a great job of pointing at the original papers explaining all of those things. The homework is often rather pointless: a lot of learning exactly what they want you to copy-paste, but minimal understanding of the actual algorithms, especially for the later, more complicated ones. (Transformers remain a complete mystery to me; this course did not explain them sufficiently or connect them to CNNs or anything else clearly, and the homework was especially meaningless in this regard.)

By Marc A

•

Mar 26, 2019

I'm a fan of Andrew Ng's machine learning classes on Coursera. This was my least favorite. I'm not sure if it's because of the complexity of the material or that so much material is presented in a short time, but I feel that I'm not as confident about my knowledge of the material in this course compared to the earlier courses. In the last few assignments, I felt like I was mechanically plugging stuff in without really understanding the thought process. His teaching style seems much the same as the other courses though, so it's possible this could be due to me rather than the course.

By Deleted A

•

Jul 11, 2018

I am grateful for the opportunity to have learned from an exceptional instructor, and one of the luminaries, in artificial intelligence. Insofar as this particular course is concerned, theory was well explained, as always. I feel like there was a bit of a disconnect in the implementations, though. Some of this was just the sheer challenge of using a still-unfamiliar platform (Keras). And, in concert with this latter point, some was due to a sort of "fill in the blank" approach to using the platform. Nonetheless, that I have learned, and learned a lot, is undeniable!

By Emil H

•

May 28, 2023

The last exercise is a bit hard to understand, especially Exercise 4 - EncoderLayer (https://ntwjrryqcvtz.labs.coursera.org/notebooks/W4A1/C5_W4_A1_Transformer_Subclass_v1.ipynb#Exercise-4---EncoderLayer), which says: "You will pass the Q, V, K matrices and a boolean mask to a multi-head attention layer. Remember that to compute self-attention Q, V and K should be the same. Let the default values for return_attention_scores and training. You will also perform Dropout in this multi-head attention layer during training." - although Q, V, K do not come in as function inputs.
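For anyone stuck on the same point: the exercise text means that in self-attention the queries, keys, and values are all the same tensor - the layer's input - which is why Q, V, K never appear as separate function arguments. A minimal NumPy sketch of single-head scaled dot-product self-attention (no dropout, no mask, no learned projections; the function name and shapes are illustrative, not from the notebook) shows the idea:

```python
import numpy as np

def self_attention(x):
    """Single-head scaled dot-product self-attention over x of shape (seq_len, d_model)."""
    q = k = v = x                                   # self-attention: Q = K = V = the layer input
    d_k = x.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                 # (seq_len, seq_len) attention logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the key positions
    return weights @ v                              # weighted sum of value vectors

x = np.random.default_rng(0).standard_normal((4, 8))
out = self_attention(x)
print(out.shape)  # output has the same shape as the input
```

In the notebook's Keras version, the multi-head attention layer additionally applies learned Q/K/V projections per head, but the same identity holds: the encoder layer passes its single input as query, value, and key.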

By Raja K

•

Nov 30, 2020

More intuitive materials used while teaching would help one grasp the concepts more efficiently and enjoyably. What I mean is that the descriptions and summaries of the lessons taught in a week appear in the corresponding week's assignments, and those summaries were clearer and more visually pleasing than the in-class presentation. For example, the use of pens for drawing networks and the like could be migrated to better animations, etc. The crux is that the content of the course is great, but it feels like there is good scope for improvement in presentation.

By Deni

•

Apr 21, 2018

Firstly, thanks to the course instructors and Dr. Ng for teaching us deep learning. You are all gems. I enjoyed this course, and how simple it made coding RNNs. However, I believe the concepts could be simplified some more, even in the form of pseudocode or a conceptual outline. This is my 3rd course from Andrew Ng, so I know he's skilled at distilling deep learning concepts with ease. Week 1 was the best for me, as the operation of the LSTM and GRU RNNs was succinctly outlined and set a solid foundation; Week 2 could be presented in a less abstract way, though.

By Nkululeko N

•

May 2, 2020

I think that with sequence models, the course details were very challenging. I strongly believe that to take a course in the Deep Learning Specialization, one must at least learn Python from basic to advanced level. However, Andrew Ng has made it easy for a first-time student with a programming background to understand most of the concepts in this specialization. Thank you, DeepLearning.AI, for this course. I have learned some cutting-edge skills that can't easily be found anywhere else. I have learned a skill that will set me apart from the crowd.

By John O

•

Jan 30, 2021

I really enjoyed this course. I'm not crazy about the fill-in-the-blank style of the programming assignments. I think I'd learn the material better if it just gave me the arguments and returns of the functions and forced me to write everything in between. I think it makes sense to emphasize Keras in the later parts of this sequence, but I feel like I never got a basic introduction to how models in Keras are supposed to be structured. Maybe there should be an assigned reading on this, if not a video or an optional programming assignment.