- Large Language Modeling
- Natural Language Processing
- ChatGPT
- Artificial Intelligence
- Prompt Engineering
- Deep Learning
- Generative Model Architectures
- Artificial Neural Networks
- LLM Application
- Generative AI
- Data Ethics
- Risk Management Framework
Generative Pre-trained Transformers (GPT)
Completed by Peter Seidenberg
January 30, 2024
12 hours (approximately)
Peter Seidenberg's account is verified. Coursera certifies their successful completion of Generative Pre-trained Transformers (GPT).
What you will learn
Explore Language Models – Learn how computers process text, examine Transformer structures, and compare masked vs causal language models.
Build & Assess Models – Create an n-gram model, analyse text generation, evaluate model quality, and understand hallucinations and their impact.
Use LLMs Responsibly – Discuss ethics, compare short- vs long-term risks, and explore LLM use in chatbots, real-world apps, and academia.
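The n-gram modeling outcome above can be sketched in a few lines. This is a minimal illustration, not the course's actual exercise: it trains a bigram (n = 2) model on a hypothetical toy corpus by counting adjacent word pairs, then generates text by sampling each next word in proportion to those counts.

```python
# Minimal bigram language model sketch (illustrative toy example).
import random
from collections import defaultdict, Counter

# Hypothetical toy corpus, purely for demonstration.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count bigram frequencies: counts[w1][w2] = number of times w2 followed w1.
counts = defaultdict(Counter)
for w1, w2 in zip(corpus, corpus[1:]):
    counts[w1][w2] += 1

def generate(start, length=5, seed=0):
    """Generate up to `length` words from `start`, sampling successors
    in proportion to their bigram counts."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = counts.get(out[-1])
        if not successors:  # dead end: no observed continuation
            break
        words, freqs = zip(*successors.items())
        out.append(rng.choices(words, weights=freqs)[0])
    return " ".join(out)

print(generate("the"))
```

The same counting scheme extends to larger n by keying on tuples of the previous n−1 words; the course's masked and causal Transformer models replace these raw counts with learned probability distributions.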