As AI moves beyond the cloud, on-device inference is rapidly expanding to smartphones, IoT devices, robots, AR/VR headsets, and more. Billions of mobile and other edge devices are ready to run optimized AI models.
What you'll learn
- Learn to deploy AI models on edge devices like smartphones, using their local compute power for faster and more secure inference.
- Explore converting PyTorch/TensorFlow models for device compatibility and quantizing them for better performance and reduced model size (see the sketch after this list).
- Learn about device integration, including runtime dependencies, and how GPU, NPU, and CPU compute-unit utilization affects performance.
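As a rough illustration of the conversion-and-quantization workflow mentioned above, here is a minimal sketch using PyTorch's public dynamic-quantization and TorchScript APIs. It is not taken from the project materials; the `TinyClassifier` model, tensor shapes, and output file name are placeholders chosen only to make the example self-contained.

```python
# Minimal sketch: prepare a PyTorch model for on-device deployment by
# (1) applying post-training dynamic quantization and (2) exporting to
# TorchScript, a format that mobile runtimes can load.
# The tiny model below is a placeholder, not from the course materials.

import torch
import torch.nn as nn

class TinyClassifier(nn.Module):
    """Stand-in for a real vision or audio model."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 32 * 32, 128),
            nn.ReLU(),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)

model = TinyClassifier().eval()

# Post-training dynamic quantization: weights of Linear layers are stored
# as int8, shrinking the model and often speeding up CPU inference.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Trace to TorchScript and save; the resulting file can be bundled with a
# mobile app and executed by an on-device runtime.
example_input = torch.randn(1, 3, 32, 32)
scripted = torch.jit.trace(quantized, example_input)
scripted.save("tiny_classifier_int8.pt")

# Quick sanity check that the quantized, scripted model still runs.
with torch.no_grad():
    print(scripted(example_input).shape)  # torch.Size([1, 10])
```

In practice, the target runtime and compute unit (CPU, GPU, or NPU) determine which export format and quantization scheme are appropriate; the project walks through those trade-offs in more detail.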
Learn, practice, and apply job-ready skills in less than 2 hours
- Receive training from industry experts
- Gain hands-on experience solving real-world job tasks
How you'll learn
Hands-on, project-based learning
Practice new skills by completing job-related tasks with step-by-step instructions.
No downloads or installation required
Access the tools and resources you need in a cloud environment.
Available only on desktop
This project is designed for laptops or desktop computers with a reliable Internet connection, not mobile devices.