This Professional Certificate is intended to help you develop the job-ready skills and portfolio for an entry-level Business Intelligence (BI) or Data Warehousing Engineering position. Throughout the online courses in this program, you will immerse yourself in the in-demand role of a Data Warehouse Engineer and acquire the essential skills you need to work with a range of tools and databases to design, deploy, operationalize and manage Enterprise Data Warehouses (EDW).
By the end of this Professional Certificate, you will be able to perform the key tasks required in a data warehouse engineering role. You will work with Relational Database Management Systems (RDBMS) and query data using SQL statements.
You will use Linux/UNIX shell scripts to automate repetitive tasks and build data pipelines with tools like Apache Airflow and Kafka to Extract, Transform and Load (ETL) data. You will gain experience managing databases and data warehouses.
Finally, you will design and populate data warehouse systems and utilize Business Intelligence tools to analyze and extract insights using reports and dashboards.
This program is suitable for anyone with a passion for learning, whether or not you have a college degree, and requires no prior data engineering or programming experience.
Applied Learning Project
Each course includes numerous hands-on labs and a project to hone and apply the concepts and skills you learn. By the end of the program, you will have designed, implemented, configured, queried, and maintained numerous databases and created data pipelines using real-world tools and data repositories to build a portfolio of job-ready skills.
You will start by provisioning a database instance in the cloud. Next, you will design databases using Entity-Relationship Diagrams (ERD) and create database objects like tables and keys using MySQL, PostgreSQL and IBM Db2.
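As a small sketch of what turning an ERD into database objects looks like, the snippet below creates two hypothetical tables with primary and foreign keys. It uses Python's built-in sqlite3 module purely for illustration; in the courses you would run similar DDL statements against MySQL, PostgreSQL or IBM Db2.

```python
import sqlite3

# An in-memory database stands in for a cloud-hosted MySQL/PostgreSQL/Db2 instance.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Tables derived from a simple ERD: each order references a customer.
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        total       REAL NOT NULL
    )
""")
conn.commit()
```

The `REFERENCES` clause enforces the one-to-many relationship drawn in the ERD: an order cannot exist without a matching customer.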
You will then become proficient in querying databases with SQL using SELECT, INSERT, UPDATE and DELETE statements, and learn to filter, sort, and aggregate result sets. Next, you will become familiar with common Linux/Unix shell commands and use them to build Bash scripts.
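Those four statements, plus filtering, sorting, and aggregation, can be tried out in a few lines. This sketch uses a hypothetical products table via Python's sqlite3 module; the SQL itself is what matters and carries over to the RDBMSs used in the program.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, category TEXT, price REAL)")

# INSERT some rows.
conn.executemany(
    "INSERT INTO products (category, price) VALUES (?, ?)",
    [("book", 12.0), ("book", 30.0), ("toy", 8.0)],
)

# UPDATE a subset of rows, then DELETE another.
conn.execute("UPDATE products SET price = price * 1.1 WHERE category = 'toy'")
conn.execute("DELETE FROM products WHERE price > 25")

# SELECT with filtering (WHERE), aggregation (GROUP BY), and sorting (ORDER BY).
rows = conn.execute(
    """
    SELECT category, COUNT(*), AVG(price)
    FROM products
    WHERE price > 0
    GROUP BY category
    ORDER BY category
    """
).fetchall()
```

After the UPDATE and DELETE run, the query returns one aggregated row per remaining category.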
You will create data pipelines for batch and streaming ETL jobs using Apache Airflow and Kafka. Finally, you will implement data warehouses and create BI dashboards.