Gen AI Foundational Models for NLP & Language Understanding
Completed by Shashank Naik
July 23, 2024
9 hours (approximately)
Shashank Naik's account is verified. Coursera certifies their successful completion of Gen AI Foundational Models for NLP & Language Understanding.
What you will learn
- Explain how one-hot encoding, bag-of-words, embeddings, and embedding bags transform text into numerical features for NLP models (see the first sketch after this list)
- Implement Word2Vec models using CBOW and Skip-gram architectures to generate dense word embeddings from context (second sketch below)
- Develop and train neural network-based language models using statistical N-grams and feedforward architectures (third sketch below)
- Build sequence-to-sequence models with encoder–decoder RNNs for tasks such as machine translation and sequence transformation (fourth sketch below)
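
The first outcome covers turning text into numerical features. The following is a minimal sketch, not course material: the toy vocabulary, sentence, and embedding dimension are illustrative assumptions, and it only shows how the standard PyTorch primitives relate to one another.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical toy vocabulary and sentence; values are illustrative only.
vocab = {"i": 0, "like": 1, "nlp": 2, "and": 3, "pytorch": 4}
tokens = torch.tensor([vocab[w] for w in "i like nlp and pytorch".split()])

# One-hot encoding: each token becomes a sparse indicator vector.
one_hot = F.one_hot(tokens, num_classes=len(vocab)).float()   # (5, 5)

# Bag-of-words: sum the one-hot vectors, discarding word order.
bow = one_hot.sum(dim=0)                                      # (5,)

# Embeddings: each token index maps to a dense, learnable vector.
embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
dense = embedding(tokens)                                     # (5, 8)

# EmbeddingBag: looks up and pools (here, averages) embeddings in one step,
# giving a single fixed-size feature vector for the whole sentence.
embedding_bag = nn.EmbeddingBag(len(vocab), 8, mode="mean")
pooled = embedding_bag(tokens.unsqueeze(0))                   # (1, 8)

print(one_hot.shape, bow.shape, dense.shape, pooled.shape)
```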
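
For the Word2Vec outcome, the sketch below shows the two architectures in their simplest form: CBOW averages context-word embeddings to predict the centre word, while Skip-gram predicts a context word from the centre word. Class names, vocabulary size, and the random training pair are assumptions made for illustration, not the course's reference implementation.

```python
import torch
import torch.nn as nn

class CBOW(nn.Module):
    """Predicts a centre word from the average of its context-word embeddings."""
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.linear = nn.Linear(embed_dim, vocab_size)

    def forward(self, context):               # context: (batch, window)
        averaged = self.embeddings(context).mean(dim=1)
        return self.linear(averaged)          # logits over the vocabulary

class SkipGram(nn.Module):
    """Predicts one context word from the centre word (one pair per sample)."""
    def __init__(self, vocab_size, embed_dim):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.linear = nn.Linear(embed_dim, vocab_size)

    def forward(self, centre):                # centre: (batch,)
        return self.linear(self.embeddings(centre))

# Toy usage with made-up sizes: 4 examples, 4 context words each.
model = CBOW(vocab_size=5000, embed_dim=100)
logits = model(torch.randint(0, 5000, (4, 4)))
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5000, (4,)))
```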
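
For the language-modelling outcome, a feedforward N-gram model concatenates the embeddings of a fixed number of previous words and predicts the next word. The sketch below assumes a trigram setup with arbitrary layer sizes and random indices standing in for real training data.

```python
import torch
import torch.nn as nn

CONTEXT = 2      # trigram model: two previous words predict the next

class NGramLM(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, hidden=128):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embed_dim)
        self.ff = nn.Sequential(
            nn.Linear(CONTEXT * embed_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, vocab_size),
        )

    def forward(self, context):                     # context: (batch, CONTEXT)
        flat = self.embeddings(context).flatten(1)  # concatenate context embeddings
        return self.ff(flat)                        # logits for the next word

model = NGramLM(vocab_size=5000)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random indices.
context = torch.randint(0, 5000, (8, CONTEXT))
target = torch.randint(0, 5000, (8,))
loss = criterion(model(context), target)
loss.backward()
optimizer.step()
```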
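
Finally, for the sequence-to-sequence outcome, a GRU encoder compresses the source sentence into a hidden state that initialises a GRU decoder, which emits the target sequence (here with teacher forcing). Vocabulary sizes, dimensions, and the random batch are assumptions; attention and decoding loops are omitted to keep the sketch small.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, src_vocab, embed_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden, batch_first=True)

    def forward(self, src):                 # src: (batch, src_len)
        _, hidden = self.rnn(self.embed(src))
        return hidden                       # final state summarises the source

class Decoder(nn.Module):
    def __init__(self, tgt_vocab, embed_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, tgt, hidden):         # tgt: (batch, tgt_len), teacher forcing
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden     # per-step logits over the target vocab

encoder, decoder = Encoder(src_vocab=3000), Decoder(tgt_vocab=3200)
src = torch.randint(0, 3000, (2, 7))        # two source sentences of length 7
tgt = torch.randint(0, 3200, (2, 9))        # shifted target sentences of length 9
logits, _ = decoder(tgt, encoder(src))      # (2, 9, 3200)
```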
Skills you will gain
- Model Evaluation
- Generative AI
- Natural Language Processing
- Embeddings
- Large Language Modeling
- Recurrent Neural Networks (RNNs)
- Data Preprocessing
- Data Ethics
- Feature Engineering
- PyTorch (Machine Learning Library)
- Transfer Learning
- Artificial Neural Networks