2 results for Orchestration
  • Project Overview (Kestra)

    This repository demonstrates workflow orchestration for data engineering pipelines using Kestra. It guides users through building, running, and scheduling pipelines that extract, transform, and load (ETL) data both locally (with PostgreSQL) and in the cloud (with Google Cloud Platform); a minimal sketch of one such local ETL step follows the key concepts below. The project is hands-on, combining conceptual explanations, infrastructure setup, and several example pipeline flows.


    Key Concepts

    • Workflow Orchestration: Automating and managing complex workflows with dependencies, retries, logging, and monitoring.
    • Kestra: An orchestration platform with a user-friendly UI and YAML-based workflow definitions (called “flows”).
    • Data Lake & Data Warehouse: Moving data from raw object storage (GCS) into a structured analytics warehouse (BigQuery).
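
    Here is a minimal Python sketch of one such local ETL step (extract a CSV, filter it, load it into PostgreSQL). It is illustrative only: the file path, column names, table name, and connection settings are hypothetical, psycopg2 is assumed as the driver, and the repository's actual flows are defined in Kestra's YAML syntax rather than in Python.

    ```python
    """Sketch of the kind of ETL step a Kestra flow would orchestrate locally."""
    import csv

    import psycopg2  # assumed PostgreSQL driver; not prescribed by the repository


    def run_etl(csv_path: str = "data/trips.csv") -> None:
        # Extract: read rows from a source CSV (hypothetical file and columns).
        with open(csv_path, newline="") as f:
            rows = list(csv.DictReader(f))

        # Transform: keep only rows with a positive trip distance (example rule).
        cleaned = [r for r in rows if float(r.get("trip_distance") or 0) > 0]

        # Load: write the cleaned rows into a local PostgreSQL table.
        conn = psycopg2.connect(
            host="localhost", dbname="postgres", user="postgres", password="postgres"
        )
        with conn, conn.cursor() as cur:
            cur.execute("CREATE TABLE IF NOT EXISTS trips (id TEXT, trip_distance REAL)")
            cur.executemany(
                "INSERT INTO trips (id, trip_distance) VALUES (%s, %s)",
                [(r.get("id"), float(r.get("trip_distance") or 0)) for r in cleaned],
            )
        conn.close()


    if __name__ == "__main__":
        run_etl()
    ```
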
  • Project Overview (Apache Airflow)

    This repository serves as a practical guide to building and orchestrating robust data pipelines using Apache Airflow. It covers essential concepts from basic workflow management to advanced deployments with Google Cloud Platform (GCP) and Kubernetes.


    Key Concepts

    • Workflow Orchestration: Automating and managing complex data workflows with dependencies, scheduling, retries, and monitoring using Apache Airflow.
    • DAGs (Directed Acyclic Graphs): The core abstraction in Airflow for defining task dependencies, execution order, and workflow logic (a minimal DAG sketch follows this list).
    • Extensible Operators & Integrations: Leveraging Airflow’s wide range of built-in operators and custom plugins to interact with databases, cloud services (GCP), Kubernetes, and other external systems.
    • Scalable Deployments: Running Airflow locally for prototyping, or deploying on cloud and Kubernetes for production-scale, resilient, and distributed data pipeline execution.
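
    To make the DAG abstraction concrete, here is a minimal Python sketch of a daily ETL DAG with three dependent tasks. It is not taken from the repository: the DAG id, task names, and callables are hypothetical, and it assumes a recent Airflow 2.x release (for the `schedule` argument and the `airflow.operators.python` import path).

    ```python
    """Minimal Airflow 2.x DAG sketch: three dependent ETL tasks on a daily schedule."""
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract() -> None:
        print("extract: pull raw data from the source system")


    def transform() -> None:
        print("transform: clean and reshape the extracted data")


    def load() -> None:
        print("load: write the transformed data to the warehouse")


    with DAG(
        dag_id="example_etl",            # hypothetical name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",               # run once per day
        catchup=False,                   # do not backfill past runs
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        # Dependencies define the execution order: extract -> transform -> load.
        extract_task >> transform_task >> load_task
    ```
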