By JH
•Oct 4, 2020
Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks
By LL
•Jun 22, 2021
This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. Also, the complete projects can be downloaded easily.
By LK Z
•Oct 20, 2020
Very good
By अनुभव त
•Sep 27, 2020
Very good
By Hoang Q T
•Jul 25, 2022
Awesome!
By Justin H
•Jul 12, 2023
Brutal.
By M n n
•Nov 22, 2020
Awesome
By Alphin G I
•Sep 14, 2023
Awesome
By Jeff D
•Nov 15, 2020
Thanks
By Ayush S
•May 25, 2023
Good
By Md P
•Apr 19, 2023
Nice
By Pema W
•Nov 11, 2022
Good
By Rifat R
•Sep 30, 2020
Best
By Thierry H
•Oct 25, 2020
I was a bit disappointed by the fact that although 3 instructors are mentioned, in practice we only see Younes. Lukasz just gives a short intro and conclusion for each lesson; I would have liked to see him really teach. And we don't see Eddy at all.
There are also some typos in the text of some notebooks and slides, but they don't hurt the quality of the course.
Overall the course is well made. I like the fact that it teaches recent architectures like the Reformer. I was surprised that Trax is used, at a time when the community is only starting to tame TensorFlow 2. It would have been nice to have a few words about where we stand in the framework landscape: why Trax, how it compares to TensorFlow 2, and what the trends and priorities are when comparing TF vs. Trax (feature support, flexibility, target audience, production readiness, etc.).
Some notebooks are only about filling in the blanks with a big hint a couple of lines before, but I don't know how to make them more complex without leaving many people stuck, especially with a new framework like Trax. I also liked the diagrams very much, especially in the last week with the complex transformations for LSH.
Quite a good course overall. Thanks!
By Brian G
•Oct 31, 2020
I really enjoyed the course; thank you Younes, Lukasz, Eddy, and all the staff of Deeplearning.ai and Coursera. I applaud your effort in trying to teach what seem to me cutting-edge NLP techniques like Transformers, even though it is a very new and complex topic. The reason I didn't give five stars is that the final course on Transformers and Reformers does not seem self-contained; it feels incomplete and a bit too haphazard for me, unlike the first three courses in the specialization. I don't feel enough foundation was covered for students to appreciate the topic being discussed or the choices that led to the current design, e.g. why Q, K, V and not just Q, V? Why not feed NER output as context instead of just the source input? I had to search for supplementary content on the internet to round out my understanding.
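On the Q/K/V question above, a minimal NumPy sketch of scaled dot-product attention (my own illustration, not course code) may help show why keys and values play distinct roles: queries are matched against keys to produce attention weights, while values are the content that actually gets mixed. Collapsing K and V into one matrix would force matching and retrieval to share a single representation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention sketch.

    Q: (n_q, d)  queries -- what each position is looking for.
    K: (n_k, d)  keys    -- what each position advertises for matching.
    V: (n_k, dv) values  -- the content that actually gets mixed.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted mixture of values

# Toy usage: 3 query positions attending over 4 key/value positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```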
By David M
•Apr 2, 2023
I think the labs could either be a bit less cookbook (though it was satisfying to work through them successfully) or go into greater depth about the mechanisms at work. The ungraded labs in the last course of the specialization were *great*, and I think they struck the right balance. More labs like that, please.
I come away with a decent understanding of what attention models are, but I'm not really sure how they do what they do, with the end result that they seem a bit magical. The same is true of Locality-Sensitive Hashing: *why* should similar items hash to the same bucket?
Well, I'll spend some time investigating these on my own.
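On that LSH question, a short random-hyperplane LSH sketch (my own illustration, not from the course) captures the intuition: each hash bit records which side of a random hyperplane a vector falls on, so vectors separated by a small angle agree on most bits and usually land in the same bucket.

```python
import numpy as np

def lsh_bucket(x, planes):
    """Random-hyperplane LSH: one hash bit per hyperplane.

    Nearby vectors (small angle between them) fall on the same side
    of most random hyperplanes, so their bit patterns -- and hence
    their buckets -- usually coincide.
    """
    bits = (planes @ x > 0).astype(int)
    return int("".join(map(str, bits)), 2)  # bucket id from the bit pattern

rng = np.random.default_rng(42)
d, n_planes = 16, 8
planes = rng.normal(size=(n_planes, d))  # random hyperplane normals

v = rng.normal(size=d)
v_near = v + 0.05 * rng.normal(size=d)   # slight perturbation of v
v_far = rng.normal(size=d)               # unrelated vector

# v and v_near usually share a bucket; v_far usually does not.
print(lsh_bucket(v, planes), lsh_bucket(v_near, planes), lsh_bucket(v_far, planes))
```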
By Laurenz E
•Jan 19, 2023
The course explores important concepts. It felt, however, that this week was less polished than the previous weeks. I was missing the summaries after the videos, and the reasoning behind some concepts (what Query/Key/Value are, why BERT is only the encoder and GPT only the decoder, and why the two are not combined).
It is of course still a great course and I learned a lot from it. Thanks for taking the time and effort to create it. It is really helpful to have this kind of material explained in the professional way you do.
By Woosung Y
•Nov 7, 2020
Great course for understanding the basic concept of the attention module in NLP. What I learned in this course is mainly based on text data processing. (I feel that applying it to voice or sound data will be a little different.) I was able to build a solid understanding through the practical examples.
One thing that I felt was lacking: there is no theoretical background on convergence. I don't understand why such an NLP model can converge to an optimal solution. It may work, but why? I need to search the literature further.
By Anna Z
•Jul 12, 2022
I loved the instructor's clear explanation! Thanks for extracting the essence of these cutting-edge models and teaching that to me.
Towards the end of the course (e.g. Course 4), I felt that the assignments were not designed to let me really understand what's going on inside the neural network. Perhaps that's not one of the course's goals -- in that case, it'd be nice to get an idea of the extent to which people working in different positions in industry master the mechanisms inside each network.
By Fan P
•Nov 18, 2020
The materials in Week 3 are not sufficiently clear in explaining the BERT model. The instructors sometimes repeated themselves and only explained the surface without ever going deeper. The Week 3 assignment was not designed with comprehension in mind. One very interesting thing that wasn't mentioned is how the BERT model manages to be trained with a self-supervised method on almost any dataset. It would be good to show how BERT prepares its training dataset, and whether this can be generalized to other types of datasets.
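On that question, a rough sketch of BERT-style masked-language-model data preparation (my own illustration, not course code) shows why almost any unlabeled text works: the text labels itself, since the targets are simply the original tokens that were hidden from the input.

```python
import random

MASK, MASK_PROB = "[MASK]", 0.15

def make_mlm_example(tokens, vocab, rng):
    """BERT-style masked-LM pair: the text is its own label.

    Roughly 15% of tokens are selected for prediction; of those,
    80% are replaced by [MASK], 10% by a random token, and 10%
    are left unchanged. No human annotation is needed.
    """
    inputs, targets = [], []
    for tok in tokens:
        if rng.random() < MASK_PROB:
            targets.append(tok)            # predict the original token here
            r = rng.random()
            if r < 0.8:
                inputs.append(MASK)
            elif r < 0.9:
                inputs.append(rng.choice(vocab))
            else:
                inputs.append(tok)
        else:
            inputs.append(tok)
            targets.append("-")            # position not predicted
    return inputs, targets

vocab = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
tokens = "the cat sat on the mat".split()
print(make_mlm_example(tokens, vocab, random.Random(0)))
```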
By Luis F C d L
•Apr 3, 2022
The course is one of a kind in the sense that very few courses try to tackle the transformers/attention subject.
I really enjoyed the content, but there were times when I would have liked more depth on the implementation side of things...
I know the subject is very complex, so in general I appreciate the effort of putting this kind of content together for us. Thanks to it, I hope I can evolve the models we use at my company toward these state-of-the-art transformers.
By Bilgehan E
•Mar 31, 2024
The syllabus and lab exercise outlines are commendable. However, the quality of the lectures and lab descriptions falls short. For a reference to the expected standard, consider Andrew Ng's previous machine learning courses. At a minimum, please use ChatGPT to enhance the language quality of the Jupyter descriptive text. Despite the highly enjoyable topic, following this course was as frustrating as Andrew Ng’s courses were enjoyable.
By Jerry C
•Nov 3, 2020
Overall the course is a nice introduction to new and cutting-edge NLP techniques using deep learning, with good explanations and diagrams. The course is a bit too easy in terms of hand-holding; a large part of the assignments can be completed easily from the hints without deeply understanding what is going on. Also, there are occasional typos and incoherent wording that detract from the overall experience.
By Lucas B P
•Mar 10, 2023
The course has really rich content, and it is very well explained for the most part.
However, there are errors in some tasks that the team has not fixed for many months now. The labs run on low-end machines, which can really get in the way of finishing some of the tasks. The third week is a bit rushed as well.
Overall, it is a really good course and totally worth it.
By Stephen S
•Jun 21, 2021
Content-wise it's excellent as always. I am not giving 5 stars for two reasons: a) the audio, including the transcript, is sometimes not of the best quality (in English), as if it were generated by a machine; b) the readings are very brief and just quickly summarize what was taught in the video (they could go into more depth). I would give 4.5 stars if that were possible.
By Keith B
•Aug 4, 2022
I don't think I got much from the lectures or the assignments in the last two weeks of the course (weeks 3 and 4). However, the ungraded labs in week 4 (Reformer LSH and Revnet) were brilliant and really helped me to better understand much of the material from weeks 3 and 4. If I were doing it again, I would probably skip the lectures and just do those labs.
By Amey N
•Oct 4, 2020
The course gives an encompassing overview of the latest tools and technologies that are driving the NLP domain. Thus, the focus gradually shifts away from implementation and towards design.
Since the models require specialized hardware, they go beyond the scope of a personal computer and create a need for high-performance computing.