
Learner Reviews & Feedback for Machine Learning: Classification by University of Washington

4.7 stars (3,732 ratings)

About the Course

Case Studies: Analyzing Sentiment & Loan Default Prediction

In our case study on analyzing sentiment, you will create models that predict a class (positive/negative sentiment) from input features (text of the reviews, user profile information,...). In our second case study for this course, loan default prediction, you will tackle financial data and predict when a loan is likely to be risky or safe for the bank. These tasks are examples of classification, one of the most widely used areas of machine learning, with a broad array of applications including ad targeting, spam detection, medical diagnosis, and image classification.

In this course, you will create classifiers that provide state-of-the-art performance on a variety of tasks. You will become familiar with the most successful techniques, which are most widely used in practice, including logistic regression, decision trees, and boosting. In addition, you will be able to design and implement the underlying algorithms that can learn these models at scale, using stochastic gradient ascent. You will implement these techniques on real-world, large-scale machine learning tasks. You will also address significant tasks you will face in real-world applications of ML, including handling missing data and measuring precision and recall to evaluate a classifier. This course is hands-on, action-packed, and full of visualizations and illustrations of how these techniques will behave on real data. We've also included optional content in every module, covering advanced topics for those who want to go even deeper!

Learning Objectives: By the end of this course, you will be able to:

-Describe the input and output of a classification model.
-Tackle both binary and multiclass classification problems.
-Implement a logistic regression model for large-scale classification.
-Create a non-linear model using decision trees.
-Improve the performance of any model using boosting.
-Scale your methods with stochastic gradient ascent.
-Describe the underlying decision boundaries.
-Build a classification model to predict sentiment in a product review dataset.
-Analyze financial data to predict loan defaults.
-Use techniques for handling missing data.
-Evaluate your models using precision-recall metrics.
-Implement these techniques in Python (or in the language of your choice, though Python is highly recommended).
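As a taste of what the course covers, here is a minimal sketch (not course material; all function names are my own) of logistic regression trained with stochastic gradient ascent on the log-likelihood, evaluated with precision and recall on a tiny toy dataset:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic_sgd(X, y, lr=0.1, epochs=200, seed=0):
    """Learn weights by stochastic gradient ASCENT on the log-likelihood."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            # Per-example gradient of the log-likelihood: (y - p) * x
            p = sigmoid(X[i] @ w)
            w += lr * (y[i] - p) * X[i]
    return w

def precision_recall(y_true, y_pred):
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return tp / (tp + fp), tp / (tp + fn)

# Toy data: an intercept column plus one feature; classes separable around 0
X = np.array([[1, -2.0], [1, -1.0], [1, -0.5], [1, 0.5], [1, 1.0], [1, 2.0]])
y = np.array([0, 0, 0, 1, 1, 1])

w = fit_logistic_sgd(X, y)
pred = (sigmoid(X @ w) >= 0.5).astype(int)
prec, rec = precision_recall(y, pred)
```

Note the sign of the update: because we maximize the log-likelihood rather than minimize a loss, the weights move in the direction of the gradient (ascent), one example at a time.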

Top reviews

SM • Jun 14, 2020

A very deep and comprehensive course for learning some of the core fundamentals of Machine Learning. Can get a bit frustrating at times because of numerous assignments :P but a fun thing overall :)

SS • Oct 15, 2016

Hats off to the team who put the course together! Prof Guestrin is a great teacher. The course gave me in-depth knowledge regarding classification and the math and intuition behind it. It was fun!


426 - 450 of 589 Reviews for Machine Learning: Classification

By Navinkumar • Feb 23, 2017

g

By MARIANA L J • Aug 12, 2016

The good:

-Good examples to learn the concepts

-Good organization of the material

-The assignments were well-explained and easy to follow-up

-The good humor and attitude of the professor makes the lectures very engaging

-All video lectures are short, and this makes them easy to digest and follow (the optional videos were long compared with the rest of the lectures, but the material covered in those was pretty advanced and their length is justifiable)

Things that can be improved:

-In some of the videos the professor seemed to cruise through some of the concepts. I understand that it is recommended to take the series of courses in a certain order, but sometimes I felt we were rushing through the material covered

-I may be nitpicking here but I wish the professor used a different color to write on the slides (the red he used clashed horribly with some of the slides' backgrounds and made it difficult to read his observations)

Overall, a good course to take and very easy to follow if taken together with the other courses in the series.

By Hanif S • Jun 2, 2016

Highly recommended course, looking under the hood to examine how popular ML algorithms like decision trees and boosting are actually implemented. I'm surprised at how intuitive the idea of boosting really is. Also interesting that random forests are dismissed as not as powerful as boosting, but I would love to know why! Both methods appear to expose more data to the learner, and a heuristic comparison between RF and boosting would have been greatly appreciated.

One can immediately notice the difference between statistician Emily, who took us through the mathematical derivation of the derivative (ha.ha.) function for linear regression (much appreciated Emily!), and computer scientist Carlos, who skipped this bit for logistic regression but provided lots of verbose code to track the running of algorithms during assignments (helps to see what is actually happening under the hood). Excellent lecturers both, thank you!

By Amilkar A H M • Nov 27, 2017

It's a great course, but the programming assignments are a little too guided. That is good, to some extent, as it allows you to focus on the concepts, but at the same time, it leaves little room for actually practicing your coding skills. I know they said from the beginning that this course was not focused on the implementation of the algorithms; however, how are you going to be able to use what you've learned without knowing how to implement the algorithms on your own?

When it comes to coding, nothing replaces implementing the algorithms yourself. That is my only complaint. Other than that, it's great. I loved it. The concepts were well explained and they covered a lot of material. I wish they had spent more time on certain topics, but I guess this is just an introduction. Anyway, take this course by all means if you have some programming experience and little to no machine learning knowledge.

By Daniel C • Apr 24, 2016

This series is taught by Emily and Carlos. Course 2 was Emily's, and this course 3 is Carlos's. Carlos takes a more practical approach by showing how things are related using pictures, trial and error, and what happens when we do this vs. that. Emily, on the other hand, dives down into the math and actual facts. I feel Emily's course is more difficult overall, but once I got through it, I had a better foundation and intuition as to how things work and a better overall understanding. So I'm giving this class 4 stars, compared to Emily's class, which gets 5 stars. I feel it would be best if they mixed the two, with Emily doing the math immediately followed by Carlos's explanations. Finally, I don't feel this course on classification had as much content. We could've done more.

By Jaiyam S • Apr 24, 2016

Thank you Prof. Carlos for this amazing course. You covered the topics in a very easy to understand way and the course was full of cool applications and humor! The only downside that I felt was that the programming assignments sometimes felt too easy. Even as a complete Python novice (I started learning Python with the first course), I felt the programming assignments could have been made more interesting. But in the larger scheme of things it doesn't matter because the course was really well taught and easy to understand. I'm really looking forward to the next course! :)

By Lech G • Apr 26, 2016

Not as good as the Regression Course, but still very good.

While I appreciate Prof. Guestrin's enthusiasm, I missed a little of the rigor and mathematical depth of the Regression course by Prof. Fox.

I learned a lot, but I feel that regression clicked with me a little better than classification.

But that's probably me.

In either case, the whole series is awesome so far; better, in my opinion, than Andrew Ng's ML course on Coursera.

A small suggestion would be to switch the main toolset from GraphLab to something more common, like scikit-learn and pandas.

By Alessio D M • Apr 17, 2016

The course is definitely high-quality and the topics are covered in a good way. I'm not giving 5 stars because I would have expected SVMs and neural networks. Mentioning the many different algorithms for learning decision trees would have been nice, without necessarily focusing on each of them in depth. An entire week spent on precision/recall seems a little bit too much, without touching other metrics like F-score. Overall though a very nice course for beginners, and it definitely gives a good sense of classification challenges and approaches.

By Subikesh P S • Jun 11, 2020

This course was very useful for learning machine learning, as it describes classification models deeply and also covers other important ML techniques like online learning, handling missing data, precision-recall, etc. The weekly programming assignments were elaborate and explained all the topics nicely. The classes were also made interesting by Mr. Carlos cracking puns in between.

The only problem I faced was the use of Turi Create over sklearn. Since Turi Create is deprecated on Windows, it's hard to complete the programming assignments.

By Anjan P • Apr 29, 2016

Excellent course that details important concepts in supervised classification. The programming assignments can be a little easy to complete (and consequently easy to forget later), but I believe it's a well-paced course, and the lecture material is presented at an incredibly accessible pace, with options for more advanced material.

One suggestion would be to include more papers for additional technical details in the lectures or programming assignments, as was done for dealing with unbalanced data.

By George P • Oct 23, 2017

It explains nicely a lot of useful topics and gives you the tools to build real-world applications. It even explains precision-recall and boosting, which can be confusing, in an easy-to-digest way.

4/5 stars because the course could include multiple levels of difficulty for the programming assignment tasks. The tasks by default were very guided, and a keen student would like to explore and build them from scratch, or at least in a less guided way.

Positive experience overall.

By Kamil Z • Aug 31, 2017

Carlos (the teacher) is a fantastic guy, but for me the content of this particular course was too easy compared to the other courses in the specialization (where Emily was mainly in charge). If you only look at the tutorial video durations, you will see that they are two times shorter than in the remaining courses. And some of them are "very optional". But, that being said, it is still a well-taught course.

I wish it had more advanced content; then I could give a full 5-star review.

By Karen B • Jul 29, 2016

The course covers many aspects of classification, with each section building on the one before. The lectures cover the theory, with a little bit of practical information, fairly well. The instructor tries to make the lectures interesting, and they are.

The quizzes seem designed both to reinforce what the lectures taught and to expand on them. The quizzes, particularly those based on programming, could use proofreading by someone newer to the subject.

By Sacha v W • Nov 10, 2018

The course is well structured and very well explained. The structure increases the complexity step by step. The programming exercises are excellent. I really appreciate Carlos's humor and passion in teaching the material and his ability to explain complex matters with simple examples. The only drawback is that the course uses Python packages that are less familiar. That is why I audited the course and worked with pandas and sklearn.

By Michael C • Apr 7, 2016

The course provides an overview on classification methods in machine learning.

The lectures are clear and easy to understand due to the quality of the slides and of the explanations.

The limit of this course lies in the assignments: they are too easy if done with the provided notebooks and tools, and sometimes impossible to do with different tools (the suggested machine learning package is free for educational purposes, but otherwise it needs a license).

By Shahin S • Sep 15, 2016

The lectures are very well prepared and clear. With regard to the assignments: I think it would be nice to design them in a way that allows people to use the language and libraries they prefer as much as possible. I would also prefer to write more of the coding assignments myself, instead of trying to fill in the blanks in some pre-written code. That would help students learn a lot more.

By MITUL T • Sep 2, 2020

This course is well paced. The assignments and quizzes are moderately tough and very conceptual. The only thing this course lacks is that it only teaches the basics, and you need to refer to other sources if you are interested in studying more advanced techniques. This course builds a strong foundation in the math and statistics of the ML field. If you are struggling to understand the math behind all the algorithms, I do recommend this course.

By Di C L • Oct 21, 2022

Very good course on classification. The theoretical lectures are very detailed and thorough, although the hands-on lab part is quite tricky, as it uses non-standard libraries (e.g. no pandas and no scikit-learn) and the programming assignments are pretty challenging and long. I recommend the course for the theoretical part, although I would suggest updating and shortening the lab assignments.

By Jordy J C R • Nov 22, 2022

At some point the assignments become so repetitive that they are boring to complete, but the course content is good. Also, just one note on the ensemble learning module: the introduction of ensembles, AdaBoost, gradient boosting, and so on comes "all in one hit", so it is confusing to understand each term independently. Either way, what a great job breaking down all these topics.

By Naveendhar • Aug 9, 2019

In the last portion it was a little difficult to relate to why we made this move for large datasets in the first place. I had to keep coming back to the fact that I am going to be handling large datasets. I liked the use cases: simple and effective. The quizzes were simple, and the graph questions were really helpful in gauging my understanding of the math behind these models.

By Stefano T • Mar 15, 2016

The contents are very interesting and well explained. Nevertheless, unlike the Regression module, the current one suffers from some technical problems, like slides that are not well formatted, noisy audio in some videos, and a weekly workload that is not perfectly calibrated. Despite all this, if you are interested in the subject, you will definitely love this course!

By Marku v d s • Dec 23, 2017

I loved the course. Carlos Guestrin is an excellent and engaging professor who really motivated me to work hard to complete the assignments.

I would just suggest that the assignments be divided into smaller pieces to be taken as the week progresses. I felt bad in some weeks that had a lot of videos to watch before the first assignment.

By Lorenzo L • Aug 31, 2018

Good, funny, and super-clear professors introduce you to the main classification techniques out there (except for neural networks). Great if you are approaching this field and want to know more before deciding if you really want to invest a lot in it. 4 stars because it would have been better with more popular Python packages than GraphLab.

By Craig B • Dec 19, 2016

Not as evenly paced as the first two courses. Also some material was covered at a very high level, whilst I found that some explanations did not immediately build on my understanding gained through the foundation course, but rather confused it. Still a worthwhile course nonetheless. I look forward to the rest in the specialisation.

By Nitin K M • Nov 6, 2019

The course is perfect for people who want to gain in-depth knowledge of classification algorithms, but the exercise descriptions are vague. I had trouble understanding the flow of the assignments. Also, bagging and gradient boosting techniques were not covered under ensembles. Overall, the course is awesome.