This course is part of multiple programs.
Instructors: Joseph Santarcangelo
Instructor ratings
We asked all learners to give feedback on our instructors based on the quality of their teaching style.
5,753 already enrolled
(51 reviews)
Recommended experience
Intermediate level
Basic knowledge of Python and PyTorch. You should also be familiar with machine learning and neural network concepts.
Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.
Describe language modeling with the decoder-based GPT and encoder-based BERT.
Implement positional encoding, masking, attention mechanisms, and document classification, and create LLMs like GPT and BERT.
Use transformer-based models and PyTorch functions for text classification, language translation, and modeling.
6 assignments
This course provides you with an overview of how to use transformer-based models for natural language processing (NLP).
In this course, you will learn to apply transformer-based models for text classification, focusing on the encoder component. You will learn about positional encoding, word embedding, and attention mechanisms in language transformers, and their role in capturing contextual information and dependencies. You will also be introduced to multi-head attention and gain insight into decoder-based language modeling with generative pre-trained transformers (GPT) for language translation, including training the models and implementing them in PyTorch.

Further, you will explore encoder-based models with bidirectional encoder representations from transformers (BERT) and train them using masked language modeling (MLM) and next sentence prediction (NSP). Finally, you will apply transformers to translation by studying the transformer architecture and performing its PyTorch implementation. The course offers practical exposure through hands-on activities that enable you to apply your knowledge in real-world scenarios.

This course is part of a specialized program tailored for individuals interested in generative AI engineering, and it requires a working knowledge of Python, PyTorch, and machine learning.
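To give a taste of the attention mechanism described above, here is a minimal, framework-free sketch of scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V, on plain Python lists. In the course itself the same idea is implemented with PyTorch tensors; the function names and list-based layout here are illustrative only.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention on nested lists.
    Q, K: seq_len x d_k row vectors; V: seq_len x d_v row vectors."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # similarity of this query with every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # each output row is a weight-averaged mix of the value vectors
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

When a query aligns strongly with one key, the softmax weights concentrate there and the output is essentially that key's value vector, which is how attention selects contextually relevant tokens.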
In this module, you will learn techniques for positional encoding and how to implement positional encoding in PyTorch. You will learn how the attention mechanism works, how to apply it to word embeddings and sequences, and how self-attention supports simple language modeling to predict the next token. In addition, you will study the scaled dot-product attention mechanism with multiple heads and see how the transformer architecture improves the efficiency of attention. You will also learn how to implement a stack of encoder layer instances in PyTorch. Finally, you will learn how to use transformer-based models for text classification, including creating the text pipeline, building the model, and training it.
6 videos, 4 readings, 2 assignments, 2 app items, 1 plugin
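One technique from this module, sinusoidal positional encoding, can be sketched in a few lines of plain Python; a PyTorch implementation would fill a tensor with the same values. The helper name below is illustrative, not from the course materials.

```python
import math

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings:
    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    Each position gets a unique pattern of phases the model can
    combine with word embeddings to recover token order."""
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe
```

Because each dimension pair oscillates at a different wavelength, nearby positions get similar encodings while distant ones diverge, which lets attention layers reason about relative position.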
In this module, you will learn about decoders and GPT-like models for language translation, train the models, and implement them using PyTorch. You will also learn about encoder models with bidirectional encoder representations from transformers (BERT), pretrain them using masked language modeling (MLM) and next sentence prediction (NSP), and perform data preparation for BERT using PyTorch. Finally, you will learn about applying transformers to translation by studying the transformer architecture and performing its PyTorch implementation. The hands-on labs in this module give you practice using decoder models, encoder models, and transformers for real-world applications.
10 videos, 6 readings, 4 assignments, 4 app items, 2 plugins
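Two ideas from this module can be sketched in miniature: the causal ("look-ahead") mask that lets a GPT-style decoder attend only to earlier positions, and the token corruption used in BERT's masked language modeling. Both sketches below are simplified, framework-free illustrations; real BERT also replaces some selected tokens with random tokens or leaves them unchanged, and PyTorch builds the same triangular mask with `torch.triu`.

```python
import random

MASK = "[MASK]"

def causal_mask(seq_len):
    """Lower-triangular attention mask: entry [i][j] is True when
    position i may attend to position j, i.e. j <= i (no peeking
    at future tokens during decoder training)."""
    return [[j <= i for j in range(seq_len)] for i in range(seq_len)]

def mlm_mask(tokens, rate=0.15, seed=0):
    """Simplified BERT-style MLM corruption: replace roughly `rate`
    of the tokens with [MASK] and record which positions the model
    must predict during pretraining."""
    rng = random.Random(seed)
    corrupted, targets = list(tokens), []
    for i in range(len(tokens)):
        if rng.random() < rate:
            corrupted[i] = MASK
            targets.append(i)
    return corrupted, targets
```

The causal mask is what makes GPT-style models autoregressive, while MLM lets BERT condition on context from both directions, the key architectural contrast this module draws.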
At IBM, we know how rapidly tech evolves and recognize the crucial need for businesses and professionals to build job-ready, hands-on skills quickly. As a market-leading tech innovator, we’re committed to helping you thrive in this dynamic landscape. Through IBM Skills Network, our expertly designed training programs in AI, software development, cybersecurity, data science, business management, and more, provide the essential skills you need to secure your first job, advance your career, or drive business success. Whether you’re upskilling yourself or your team, our courses, Specializations, and Professional Certificates build the technical expertise that ensures you, and your organization, excel in a competitive world.
51 reviews
5 stars: 77.77%
4 stars: 14.81%
3 stars: 1.85%
2 stars: 0%
1 star: 5.55%
Showing 3 of 51 reviews
Reviewed on Nov 16, 2024
Learners need assistance from humans, which seems lacking; a coach can give guidance, but not to the extent of a human touch.
Reviewed on Dec 29, 2024
This course gives me a wide picture of what transformers can be.
Reviewed on Jan 17, 2025
Exceptional course and all the labs are industry related
It will take only two weeks to complete this course if you spend 3–5 hours of study time per week.
It would be good if you had basic knowledge of Python and familiarity with machine learning and neural network concepts. It would also be beneficial to know text preprocessing steps and N-gram, Word2Vec, and sequence-to-sequence models. Knowledge of evaluation metrics such as bilingual evaluation understudy (BLEU) will be advantageous.
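As a quick refresher on BLEU, its core ingredient is modified n-gram precision. The simplified unigram version below (plain Python, names illustrative) shows the clipping idea; full BLEU combines n-gram precisions up to 4-grams with a brevity penalty.

```python
from collections import Counter

def unigram_precision(candidate, reference):
    """Modified unigram precision, the building block of BLEU:
    each candidate word is credited at most as many times as it
    occurs in the reference (the 'clipping' step), so repeating a
    correct word cannot inflate the score."""
    cand, ref = Counter(candidate), Counter(reference)
    clipped = sum(min(count, ref[word]) for word, count in cand.items())
    return clipped / max(1, sum(cand.values()))
```

For example, the degenerate candidate "the the the" scores only 1/3 against the reference "the cat", because "the" is clipped to its single occurrence in the reference.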
This course is part of the Generative AI Engineering Essentials with LLMs PC specialization. When you complete the specialization, you will have the skills and confidence to take on roles such as AI Engineer, NLP Engineer, Machine Learning Engineer, Deep Learning Engineer, and Data Scientist.
Only a modern web browser is required to complete this course and all hands-on labs. You will be provided access to cloud-based environments to complete the labs at no charge.
Access to lectures and assignments depends on your type of enrollment. If you take a course in audit mode, you will be able to see most course materials for free. To access graded assignments and to earn a Certificate, you will need to purchase the Certificate experience, during or after your audit. If you don't see the audit option:
The course may not offer an audit option. You can try a Free Trial instead, or apply for Financial Aid.
The course may offer 'Full Course, No Certificate' instead. This option lets you see all course materials, submit required assessments, and get a final grade. This also means that you will not be able to purchase a Certificate experience.
When you enroll in the course, you get access to all of the courses in the Certificate, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.
If you subscribed, you get a 7-day free trial during which you can cancel at no penalty. After that, we don’t give refunds, but you can cancel your subscription at any time. See our full refund policy.