- Generative AI
- Large Language Modeling
- Performance Tuning
- PyTorch (Machine Learning Library)
- Applied Machine Learning
- Embeddings
- Transfer Learning
- Natural Language Processing
- Text Mining
Generative AI Language Modeling with Transformers
Completed by Phi Tai Huy Cung
November 7, 2024
9 hours (approximately)
Phi Tai Huy Cung's account is verified. Coursera certifies their successful completion of Generative AI Language Modeling with Transformers.
What you will learn
Explain the role of attention mechanisms in transformer models for capturing contextual relationships in text
Describe the differences in language modeling approaches between decoder-based models like GPT and encoder-based models like BERT
Implement key components of transformer models, including positional encoding, attention mechanisms, and masking, using PyTorch
Apply transformer-based models for real-world NLP tasks, such as text classification and language translation, using PyTorch and Hugging Face tools
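The components named above (sinusoidal positional encoding, scaled dot-product attention, and causal masking) can be sketched minimally as follows. This is an illustrative sketch in NumPy for brevity, not the course's own PyTorch implementation; function names here are my own.

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: even dims use sin, odd dims use cos."""
    pos = np.arange(seq_len)[:, None]               # (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]           # (1, d_model / 2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def causal_mask(seq_len):
    """Look-ahead mask: position t may attend only to positions <= t."""
    return np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

def scaled_dot_product_attention(q, k, v, mask=None):
    """softmax(Q K^T / sqrt(d_k)) V, with an optional additive mask."""
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)
    if mask is not None:
        scores = scores + mask                      # masked entries become -inf
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ v

# With a causal mask, the first position can attend only to itself,
# so its attention output equals its own value vector.
x = positional_encoding(4, 8)
out = scaled_dot_product_attention(x, x, x, causal_mask(4))
```

Decoder-style models such as GPT apply the causal mask so each token sees only earlier context, while encoder-style models such as BERT omit it and attend bidirectionally.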