Coursera

Building Automated Data Pipelines with Spark, dbt, and Airflow



Included with Coursera Plus

Gain insight into a topic and learn the fundamentals.
Beginner level

Recommended experience

9 hours to complete
Flexible schedule
Learn at your own pace

What you'll learn

  • Build end-to-end data pipelines that automatically ingest from databases, APIs, and streams using Spark, dbt, and Airflow.

  • Design data models with historical tracking using SCD Type 2 patterns to preserve complete change history for analytics.

  • Create automated workflows with intelligent retry logic, SLA monitoring, and parameterization for production reliability.

  • Optimize Spark job performance using partitioning and caching strategies to achieve 30%+ runtime improvements.
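The partitioning strategy in the last bullet rests on a simple idea: co-locate records that share a key, so per-key work never has to cross partitions. A plain-Python sketch of that idea follows (the function name, record data, and four-partition count are illustrative, not from the course; in Spark itself this is what key-based repartitioning does under the hood):

```python
# Plain-Python sketch of hash partitioning: records with the same key
# land in the same partition, so a later per-key aggregation needs no
# cross-partition shuffle. All names and values here are illustrative.

def partition_by_key(records, num_partitions=4):
    """Assign each (key, value) record to a partition by hashing its key."""
    partitions = [[] for _ in range(num_partitions)]
    for key, value in records:
        partitions[hash(key) % num_partitions].append((key, value))
    return partitions

records = [("a", 1), ("b", 2), ("a", 3), ("c", 4)]
parts = partition_by_key(records)

# every record for key "a" sits in exactly one partition
home = [i for i, p in enumerate(parts) if any(k == "a" for k, _ in p)]
assert len(home) == 1
```

Caching is the complementary lever: once a dataset is partitioned and reused across several computations, keeping it in memory avoids recomputing it each time.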

Details to know

Shareable certificate

Add to your LinkedIn profile

Recently updated!

March 2026

Assessments

19 assignments¹

AI graded (see disclaimer)
Taught in English

See how employees at top companies are mastering in-demand skills

[Logos of Petrobras, TATA, Danone, Capgemini, P&G, and L'Oréal]

Build your Data Analysis expertise

This course is part of the Open source Data Engineering with Spark, dbt & Airflow Professional Certificate
When you enroll in this course, you'll also be enrolled in this Professional Certificate.
  • Learn new concepts from industry experts
  • Gain a foundational understanding of a subject or tool
  • Develop job-relevant skills with hands-on projects
  • Earn a shareable career certificate from Coursera

There are 11 modules in this course

You will learn the foundational concepts and tools needed to create systematic visual documentation of data pipeline architectures.

What's included

3 videos, 2 readings, 1 assignment

You will apply advanced techniques to create professional-quality data flow diagrams that accurately represent complex enterprise data systems and support stakeholder collaboration.

What's included

2 videos, 2 readings, 3 assignments

You will establish the foundational understanding and core skills for creating modular data pipeline stages, focusing on the principles of separation of concerns and tool integration fundamentals.

What's included

1 video, 1 reading, 1 assignment

You will implement complete end-to-end data pipelines by integrating modular components with industry-standard tools, culminating in a comprehensive assessment of your pipeline development capabilities.

What's included

2 readings, 3 assignments

You will establish foundational knowledge of connector architecture and complete your first database connector configuration using Airbyte.

What's included

2 videos, 2 readings, 1 assignment

You will implement complete multi-source data integration by configuring streaming and API connectors, applying enterprise security patterns, and demonstrating mastery through comprehensive connector configuration.

What's included

2 videos, 2 readings, 2 assignments
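The enterprise security pattern this module applies to connector configuration boils down to one rule: credentials are supplied at runtime (for example, from environment variables or a secrets manager), never hard-coded into the config. A minimal Python sketch of that rule (the variable names, defaults, and config shape are assumptions for illustration, not Airbyte's actual schema):

```python
import os

# Illustrative sketch of secure connector configuration: non-secret
# settings may carry defaults, but the secret is read from the
# environment at runtime and a missing value fails fast.
# Field names and defaults are hypothetical, not Airbyte's schema.

def build_source_config():
    """Assemble a database-source config without hard-coded secrets."""
    return {
        "host": os.environ.get("DB_HOST", "localhost"),
        "port": int(os.environ.get("DB_PORT", "5432")),
        "database": os.environ.get("DB_NAME", "analytics"),
        "username": os.environ.get("DB_USER", "reader"),
        # no default for the secret: raises KeyError if it is unset
        "password": os.environ["DB_PASSWORD"],
    }
```

Keeping the secret lookup strict (indexing rather than `.get`) means a misconfigured deployment fails immediately instead of silently connecting with a blank password.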

You will understand the fundamental concepts of SCD2 logic and begin applying these principles to create data models that preserve historical context in enterprise data warehouses.

What's included

3 videos, 1 reading, 1 assignment

You will implement production-ready SCD2 models using dbt, creating automated historical tracking systems with proper change detection, validity periods, and current status management.

What's included

2 videos, 2 readings, 3 assignments
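The SCD Type 2 mechanics these two modules build up to (change detection, validity periods, current-status flags) can be sketched outside dbt in a few lines of plain Python. Column names and the single-record helper below are illustrative; in the course the same logic is expressed as a dbt model over warehouse tables:

```python
from datetime import date

# Plain-Python sketch of the SCD Type 2 pattern: when a tracked
# attribute changes, close the current row (set valid_to, clear
# is_current) and append a new current row. Field names are illustrative.

def apply_scd2(history, key, new_attrs, as_of):
    """Apply SCD2 change tracking for one record; mutates and returns history."""
    current = next((r for r in history
                    if r["key"] == key and r["is_current"]), None)
    if current and current["attrs"] == new_attrs:
        return history                       # no change: history untouched
    if current:
        current["valid_to"] = as_of          # close out the old version
        current["is_current"] = False
    history.append({"key": key, "attrs": new_attrs,
                    "valid_from": as_of, "valid_to": None,
                    "is_current": True})
    return history

history = []
apply_scd2(history, "cust-1", {"city": "Lisbon"}, date(2026, 1, 1))
apply_scd2(history, "cust-1", {"city": "Porto"}, date(2026, 3, 1))
assert len(history) == 2 and history[1]["is_current"]
```

The payoff is that analytics queries can ask either "what is true now" (filter on `is_current`) or "what was true on a given date" (filter on the validity period), with the full change history preserved.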

You will understand the foundational concepts and design principles for creating robust data workflows with Apache Airflow.

What's included

3 videos, 1 reading, 1 assignment

You will implement production-grade Airflow workflows with retry mechanisms, SLA monitoring, and parameterization for enterprise-ready data pipeline resilience.

What's included

2 videos, 1 reading, 2 assignments, 1 ungraded lab
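The retry mechanism at the heart of this module is the pattern Airflow exposes through task settings such as `retries` and `retry_delay`: rerun a failing task a bounded number of times, waiting longer between attempts. A plain-Python sketch of that pattern (function and parameter names here are illustrative, not Airflow's API):

```python
import time

# Plain-Python sketch of bounded retries with exponential backoff,
# the resilience pattern Airflow applies to task execution.
# The `sleep` parameter is injectable so the behavior is testable.

def run_with_retries(task, max_retries=3, base_delay=1.0, sleep=time.sleep):
    """Run `task`, retrying on failure with exponentially growing waits."""
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > max_retries:
                raise                        # retries exhausted: surface the failure
            sleep(base_delay * 2 ** (attempt - 1))   # 1s, 2s, 4s, ...
```

SLA monitoring is the complementary control: rather than reacting to failures, it alerts when a task has not finished within an expected window, catching jobs that are slow rather than broken.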

You will integrate data engineering skills to build a complete automated data pipeline that processes diverse data sources, applies historical tracking, and orchestrates workflows. This project synthesizes mapping, transformation, integration, modeling, and automation capabilities into a production-ready data system.

What's included

4 readings, 1 assignment

Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.

Instructor

Professionals from the Industry
307 courses · 44,329 learners

Offered by

Coursera

Why people choose Coursera for their career

Felipe M.

Learner since 2018
"To be able to take courses at my own pace and rhythm has been an amazing experience. I can learn whenever it fits my schedule and mood."

Jennifer J.

Learner since 2020
"I directly applied the concepts and skills I learned from my courses to an exciting new project at work."

Larry W.

Learner since 2021
"When I need courses on topics that my university doesn't offer, Coursera is one of the best places to go."

Chaitanya A.

"Learning isn't just about being better at your job: it's so much more than that. Coursera allows me to learn without limits."
Coursera Plus

Open new doors with Coursera Plus

Unlimited access to 10,000+ world-class courses, hands-on projects, and job-ready certificate programs - all included in your subscription

Advance your career with an online degree

Earn a degree from world-class universities - 100% online

Join over 3,400 global companies that choose Coursera for Business

Upskill your employees to excel in the digital economy

Frequently asked questions

¹ Some assignments in this course are AI-graded. For these assignments, your data will be used in accordance with Coursera's Privacy Notice.