IBM
Generative AI Engineering and Fine-Tuning Transformers

This course is part of multiple programs.

Instructors: Joseph Santarcangelo, Ashutosh Sagar, Fateme Akbari


Gain insight into a topic and learn the fundamentals.
Intermediate level

7 hours to complete
3 weeks at 2 hours a week
Flexible schedule
Learn at your own pace

What you'll learn

  • Sought-after, job-ready skills businesses need for working with transformer-based LLMs in generative AI engineering, in just 1 week.

  • How to perform parameter-efficient fine-tuning (PEFT) using LoRA and QLoRA

  • How to use pretrained transformers for language tasks and fine-tune them for specific tasks.

  • How to load models, run inference, and train models with Hugging Face (see the sketch after this list).
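
As a rough illustration of the last two bullets (a sketch under assumptions, not taken from the course materials; the checkpoint name and task below are placeholder choices), loading a pretrained transformer and running inference with the Hugging Face pipeline API can look like this:

# Minimal sketch: load a pretrained transformer from the Hugging Face Hub
# and run inference through the pipeline API. The checkpoint below is an
# illustrative choice, not one prescribed by the course.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Fine-tuning transformers with Hugging Face is straightforward."))
# Expected output shape: [{'label': 'POSITIVE', 'score': 0.99...}]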

Details to know

Earn a career certificate
Add to your LinkedIn profile

Assessments: 4 assignments

Taught in English

Recently updated in September 2024

See how employees at top companies are mastering in-demand skills


Build your subject-matter expertise

This course is available as part of multiple programs.
When you enroll in this course, you'll also be asked to select a specific program.
  • Learn new concepts from industry experts
  • Gain a foundational understanding of a subject or tool
  • Develop job-relevant skills with hands-on projects
  • Earn a shareable career certificate

Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV

Share it on social media and in your performance review


There are 2 modules in this course

In this module, you will be introduced to fine-tuning. You’ll get an overview of generative models and compare the Hugging Face and PyTorch frameworks. You’ll also gain insights into model quantization and learn to use pre-trained transformers and then fine-tune them using Hugging Face and PyTorch.
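
As a rough sketch of the workflow this module covers (the dataset, checkpoint, and hyperparameters below are assumptions, not the course's lab code), fine-tuning a pretrained transformer with the Hugging Face Trainer API looks something like this:

# Illustrative sketch: full fine-tuning of a pretrained encoder for text
# classification with Hugging Face Transformers and Datasets.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "distilbert-base-uncased"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a public dataset (IMDB is only an example choice).
dataset = load_dataset("imdb")
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")
tokenized = dataset.map(tokenize, batched=True)

args = TrainingArguments(output_dir="finetuned-model",
                         num_train_epochs=1,
                         per_device_train_batch_size=8)
trainer = Trainer(model=model,
                  args=args,
                  train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=tokenized["test"].select(range(500)))
trainer.train()  # updates all model weights (full fine-tuning)

An equivalent PyTorch-only version would replace Trainer with an explicit loop over a DataLoader, an optimizer such as AdamW, and manual forward/backward steps.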

What's included

5 videos, 4 readings, 2 assignments, 4 app items

In this module, you will learn about parameter-efficient fine-tuning (PEFT) and adapters such as LoRA (Low-Rank Adaptation) and QLoRA (Quantized Low-Rank Adaptation). In hands-on labs, you will train a base model and pre-train LLMs with Hugging Face.
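
The sketch below (assuming the Hugging Face peft library; the base checkpoint and adapter hyperparameters are illustrative assumptions) shows the core idea behind LoRA: freeze the base model and train only small low-rank adapter matrices. QLoRA follows the same pattern but first loads the frozen base model in 4-bit precision to cut memory use.

# Minimal sketch: attach a LoRA adapter to a frozen base model with peft,
# so only the low-rank matrices are updated during fine-tuning.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained("facebook/opt-350m")  # assumed base model

lora_config = LoraConfig(task_type=TaskType.CAUSAL_LM,
                         r=8,            # rank of the low-rank update matrices
                         lora_alpha=16,  # scaling factor for the update
                         lora_dropout=0.05)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the weights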

What's included

4 videos, 4 readings, 2 assignments, 2 app items, 4 plugins

Instructors

Joseph Santarcangelo
IBM
33 Courses • 1,659,080 learners

Offered by

IBM

Why people choose Coursera for their career

Felipe M.
Learner since 2018
"To be able to take courses at my own pace and rhythm has been an amazing experience. I can learn whenever it fits my schedule and mood."
Jennifer J.
Learner since 2020
"I directly applied the concepts and skills I learned from my courses to an exciting new project at work."
Larry W.
Learner since 2021
"When I need courses on topics that my university doesn't offer, Coursera is one of the best places to go."
Chaitanya A.
"Learning isn't just about being better at your job: it's so much more than that. Coursera allows me to learn without limits."

Open new doors with Coursera Plus

Unlimited access to 7,000+ world-class courses, hands-on projects, and job-ready certificate programs, all included in your subscription

Advance your career with an online degree

Earn a degree from world-class universities, 100% online

Join over 3,400 global companies that choose Coursera for Business

Upskill your employees to excel in the digital economy