Skills you will gain
- Deep Learning
- Artificial Neural Networks
- PyTorch (Machine Learning Library)
- Generative AI
- Natural Language Processing
- Applied Machine Learning
- Text Mining
- Large Language Modeling
Generative AI Language Modeling with Transformers
Completed by Ahmed Elnagar
December 14, 2024
9 hours (approximately)
Ahmed Elnagar's account is verified. Coursera certifies their successful completion of Generative AI Language Modeling with Transformers.
What you will learn
- Explain the role of attention mechanisms in transformer models for capturing contextual relationships in text
- Describe the differences in language modeling approaches between decoder-based models like GPT and encoder-based models like BERT
- Implement key components of transformer models, including positional encoding, attention mechanisms, and masking, using PyTorch
- Apply transformer-based models for real-world NLP tasks, such as text classification and language translation, using PyTorch and Hugging Face tools
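As a taste of the components listed above, here is a minimal, illustrative PyTorch sketch (not course-provided code) of sinusoidal positional encoding, a causal mask, and scaled dot-product attention; all function names are my own for illustration.

```python
# Illustrative sketch, not the course's code: positional encoding,
# causal masking, and scaled dot-product attention in PyTorch.
import math
import torch


def positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Sinusoidal positional encodings (sin on even dims, cos on odd dims)."""
    position = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)
    div_term = torch.exp(
        torch.arange(0, d_model, 2, dtype=torch.float32)
        * (-math.log(10000.0) / d_model)
    )
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe


def causal_mask(seq_len: int) -> torch.Tensor:
    """True above the diagonal: position i may not attend to positions > i."""
    return torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)


def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(Q K^T / sqrt(d_k)) V, with masked positions set to -inf."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(mask, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    return weights @ v, weights


seq_len, d_model = 4, 8
x = torch.randn(seq_len, d_model) + positional_encoding(seq_len, d_model)
out, attn = scaled_dot_product_attention(x, x, x, mask=causal_mask(seq_len))
print(out.shape)  # torch.Size([4, 8])
```

With the causal mask, each attention row sums to 1 but places zero weight on future positions, which is the masking behavior decoder-style models such as GPT rely on.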