IBM

Gen AI Foundational Models for NLP & Language Understanding

This course is part of multiple programs.

Instructors: Joseph Santarcangelo, Fateme Akbari

2,371 already enrolled

Included with Coursera Plus

Gain insight into a topic and learn the fundamentals.
4.5 (28 reviews)

Intermediate level

7 hours to complete
3 weeks at 2 hours a week
Flexible schedule
Learn at your own pace

What you'll learn

  • Explain how to use one-hot encoding, bag-of-words, embedding, and embedding bags to convert words to features (illustrated in the sketch after this list).

  • Build and use word2vec models for contextual embedding.

  • Build and train a simple language model with a neural network.

  • Utilize N-gram and sequence-to-sequence models for document classification, text analysis, and sequence transformation.
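
The sketch below is a small illustrative example, not taken from the course materials, of how one-hot encoding, bag-of-words, embeddings, and embedding bags convert words to features in PyTorch; the toy vocabulary and dimensions are assumptions made purely for illustration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Toy vocabulary and a short "document" of token indices (illustrative only).
    vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3}
    ids = torch.tensor([vocab[t] for t in ["the", "cat", "sat"]])

    # One-hot encoding: each word becomes a sparse indicator vector.
    one_hot = F.one_hot(ids, num_classes=len(vocab)).float()

    # Bag-of-words: sum the one-hot vectors into one count vector per document.
    bow = one_hot.sum(dim=0)

    # Embedding: each index maps to a dense, learnable vector.
    embedding = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
    dense = embedding(ids)                              # shape (3, 8)

    # EmbeddingBag: looks up and pools the embeddings in one step, giving a
    # single fixed-size feature vector per document.
    embedding_bag = nn.EmbeddingBag(len(vocab), 8, mode="mean")
    doc_feature = embedding_bag(ids.unsqueeze(0))       # shape (1, 8)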

Skills you'll gain

  • N-Gram
  • PyTorch torchtext
  • Generative AI for NLP
  • Word2Vec Model
  • Sequence-to-Sequence Model

Details to know

  • Shareable certificate: Add to your LinkedIn profile
  • Assessments: 5 assignments
  • Taught in English

Build your subject-matter expertise

This course is available as part of multiple programs. When you enroll in this course, you'll also be asked to select a specific program.
  • Learn new concepts from industry experts
  • Gain a foundational understanding of a subject or tool
  • Develop job-relevant skills with hands-on projects
  • Earn a shareable career certificate

Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV

Share it on social media and in your performance review


There are 2 modules in this course

Module 1

In this module, you will learn about one-hot encoding, bag-of-words, embeddings, and embedding bags. You will also learn about neural networks and their hyperparameters, cross-entropy loss, and optimization, and then explore language modeling with n-grams. The module includes hands-on labs on document classification with PyTorch and on building a simple language model with a neural network.

What's included

7 videos, 4 readings, 3 assignments, 2 app items, 1 plugin
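
As a taste of this first module's topics, here is a minimal sketch of an n-gram (trigram) neural language model trained with cross-entropy loss and an optimizer in PyTorch. It is not the course's lab code; the toy corpus and hyperparameters are assumptions made for illustration.

    import torch
    import torch.nn as nn

    # Toy corpus and vocabulary (illustrative only, not from the course labs).
    corpus = "the cat sat on the mat the cat lay on the mat".split()
    vocab = {w: i for i, w in enumerate(sorted(set(corpus)))}
    context_size, embed_dim, hidden_dim = 2, 16, 64

    # Build (context, next word) training pairs for the trigram language model.
    contexts = [[vocab[corpus[i]], vocab[corpus[i + 1]]] for i in range(len(corpus) - 2)]
    targets = [vocab[corpus[i + 2]] for i in range(len(corpus) - 2)]
    X, y = torch.tensor(contexts), torch.tensor(targets)

    class NGramLM(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(len(vocab), embed_dim)
            self.fc1 = nn.Linear(context_size * embed_dim, hidden_dim)
            self.fc2 = nn.Linear(hidden_dim, len(vocab))

        def forward(self, x):
            e = self.embed(x).view(x.size(0), -1)       # concatenate context embeddings
            return self.fc2(torch.relu(self.fc1(e)))    # logits over the vocabulary

    model = NGramLM()
    loss_fn = nn.CrossEntropyLoss()                     # cross-entropy loss
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

    for epoch in range(100):                            # simple optimization loop
        optimizer.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        optimizer.step()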

Module 2

In this module, you will learn about the word2vec embedding model and its variants. You will also be introduced to sequence-to-sequence models and how they use recurrent neural networks (RNNs) to process variable-length input sequences and generate variable-length output sequences. You will learn about encoder-decoder RNN models, their architecture, and how to build them using PyTorch, as well as how to evaluate generated text using perplexity, precision, and recall. In hands-on labs, you will integrate pre-trained embedding models for text analysis or classification and develop a sequence-to-sequence model for sequence transformation tasks.

What's included

6 videos, 4 readings, 2 assignments, 2 app items, 3 plugins
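
To ground this second module's topics, the following is a minimal sketch of an encoder-decoder (sequence-to-sequence) RNN in PyTorch, with perplexity computed from the cross-entropy loss. It is not the course's lab code; the vocabulary sizes, dimensions, and random toy batches are assumptions made for illustration.

    import torch
    import torch.nn as nn

    # Illustrative sizes; real values depend on the dataset and task.
    src_vocab, tgt_vocab, embed_dim, hidden_dim = 50, 60, 32, 64

    class Encoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(src_vocab, embed_dim)
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)

        def forward(self, src):                         # src: (batch, src_len)
            _, hidden = self.rnn(self.embed(src))       # final hidden state summarizes the input
            return hidden

    class Decoder(nn.Module):
        def __init__(self):
            super().__init__()
            self.embed = nn.Embedding(tgt_vocab, embed_dim)
            self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, tgt_vocab)

        def forward(self, tgt_in, hidden):              # teacher forcing: feed gold target tokens
            output, _ = self.rnn(self.embed(tgt_in), hidden)
            return self.out(output)                     # logits: (batch, tgt_len, tgt_vocab)

    encoder, decoder = Encoder(), Decoder()
    src = torch.randint(0, src_vocab, (4, 7))           # toy batch of source sequences
    tgt = torch.randint(0, tgt_vocab, (4, 5))           # toy batch of target sequences

    logits = decoder(tgt[:, :-1], encoder(src))         # predict each next target token
    loss = nn.CrossEntropyLoss()(logits.reshape(-1, tgt_vocab), tgt[:, 1:].reshape(-1))
    perplexity = torch.exp(loss)                        # perplexity = exp(mean cross-entropy)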

Instructors

Joseph Santarcangelo
IBM
33 Courses • 1,670,573 learners

Offered by

IBM

Why people choose Coursera for their career

Felipe M.
Learner since 2018
"To be able to take courses at my own pace and rhythm has been an amazing experience. I can learn whenever it fits my schedule and mood."
Jennifer J.
Learner since 2020
"I directly applied the concepts and skills I learned from my courses to an exciting new project at work."
Larry W.
Learner since 2021
"When I need courses on topics that my university doesn't offer, Coursera is one of the best places to go."
Chaitanya A.
"Learning isn't just about being better at your job: it's so much more than that. Coursera allows me to learn without limits."

Frequently asked questions