Airee is an automated solution for deploying Apache Airflow in the cloud. It makes deploying a Kubernetes cluster with the Apache Airflow application quick and easy. Airee boosts effectiveness and lets users focus on building data workflows rather than on provisioning infrastructure or implementing CI/CD.


Do you want to orchestrate workflows but don't know where to start? Do you wish to focus only on your data but feel lost among configuration options? Our team of over 100 experienced engineers works with people from Fortune 500 companies who struggled with the same issues, and we came up with the Airee solution.

Time saving

You save time on creating the Apache Airflow environment, so you can focus on building and running workflows


Optimize resource usage and save costs with automatic scaling

Improved productivity

You benefit from Continuous Integration (CI) and Continuous Delivery (CD): DAGs are smoothly updated via the CLI

Safe and Secure

Role-based access control (RBAC) for configurable and secure user management


You need less than 15 minutes to set up all the components required to run Apache Airflow on GCP


It can be installed and run on premises or in the cloud


You deploy Apache Airflow with a single command from your local machine, and you don't need to worry about its configuration


Built by the DS Stream team, with over 100 professionals on board


Airee lets you use Apache Airflow to create data pipelines without the need to manage the underlying infrastructure for scalability, availability, and security. It is integrated with security services to provide fast and secure access to your data.

Airee covers the entire process, from design and installation to Apache Airflow implementation, testing, and debugging:

  • Airee deploys an Apache Airflow instance on Google Cloud Platform or on-premises infrastructure using Kubernetes.
  • Airee offers predefined environment setups sized by the number of running tasks.
  • Airee supports secret management using GCP Secret Manager and TLS communication.
  • Airee's strong points are continuous integration (CI) and continuous delivery (CD) processes supported by GitHub Actions, Terraform, and Flux.
  • Airee provides a setup of a GitHub Actions runner deployed on Google Kubernetes Engine.
  • Airee provides a local Apache Airflow environment for development purposes.
  • Airee is transparent: users are granted access to the GitHub code for infrastructure, the application, and DAG templates.
  • Airee supports various Apache Airflow executors, including CeleryExecutor, KubernetesExecutor, and CeleryKubernetesExecutor.
  • Airee supports the KEDA autoscaler.
  • Airee takes cloud cost-effectiveness into account and provides an option to pause unused infrastructure.
  • Airee is an open-source project; users can develop customizations suited to their business case.
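To make the KEDA autoscaling bullet above concrete, here is a minimal sketch of the scaling rule commonly used for KEDA-driven Airflow Celery workers (as in the community Airflow Helm chart): the desired worker count is the ceiling of running plus queued tasks divided by worker concurrency. Treat the exact thresholds and the default concurrency of 16 as assumptions for illustration, not as Airee's guaranteed configuration.

```python
import math

def desired_workers(running: int, queued: int, worker_concurrency: int = 16) -> int:
    """Sketch of the KEDA scaling rule for Celery workers:
    ceil((running + queued tasks) / worker_concurrency).
    The default concurrency of 16 is an assumption for illustration."""
    return math.ceil((running + queued) / worker_concurrency)

# With 10 running and 30 queued tasks, 40 / 16 rounds up to 3 workers.
print(desired_workers(10, 30))  # -> 3
# With no work, the worker pool can scale to zero.
print(desired_workers(0, 0))    # -> 0
```

In a real deployment, KEDA evaluates an equivalent SQL query against the Airflow metadata database and adjusts the worker Deployment's replica count accordingly.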

Airee Use Cases:

  • First-time Apache Airflow user.
    If you are looking for an orchestration tool and have found Apache Airflow, Airee helps you deploy the infrastructure on a Kubernetes cluster, in the cloud or on premises; it covers configuration and secure connections between components.
    It is cost-effective and open to customization. Airee will spin up your Apache Airflow instance in the cloud and manage changes with automated CI/CD.
  • Heavy user of Apache Airflow with a need for custom features.
    If you are a heavy Apache Airflow user with your own plugins and operators, and you need a tailor-made deployment integrated with company policies, you can benefit from Airee's CI/CD features and ready-to-use deployment. As the team maintaining Airee, we can also support you with the development of custom features: just create an issue or a PR in this repository for general requests, or in the particular repository for detailed changes.
  • Running Apache Airflow at scale.
    When you work in a data-driven organization and need more and more reports and analyses delivered at an exact hour each day, you can use Airee as a platform for provisioning Apache Airflow instances in a unified way.
    Airee delivers ready-to-use Terraform recipes for Kubernetes cluster deployment and prepared CI/CD definitions for GitHub Actions.
    You can transform our containerized Controller into a REST API that spins up Apache Airflow instances on demand.
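The last point, wrapping the containerized Controller in a REST API, could be sketched as follows. Everything here is hypothetical: the /instances endpoint and the provision_instance stub are illustrative names, and a real implementation would invoke the Controller's Terraform/deployment logic instead of returning a canned response.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

def provision_instance(name: str) -> dict:
    """Hypothetical stub: a real version would trigger the Controller's
    Terraform recipes to deploy an Apache Airflow instance."""
    return {"instance": name, "status": "provisioning"}

class ControllerAPI(BaseHTTPRequestHandler):
    """Minimal on-demand provisioning endpoint (illustrative only)."""

    def do_POST(self):
        if self.path == "/instances":
            length = int(self.headers.get("Content-Length", 0))
            payload = json.loads(self.rfile.read(length) or b"{}")
            body = json.dumps(provision_instance(payload.get("name", "airflow"))).encode()
            self.send_response(202)  # accepted: provisioning runs asynchronously
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

# To serve: HTTPServer(("", 8080), ControllerAPI).serve_forever()
```

A request such as `POST /instances` with `{"name": "team-a"}` would then return 202 and kick off a deployment for that team's instance.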



Technology tool stack

  • Analytical Databases: BigQuery, Redshift, Synapse
  • ETL: Databricks, DataFlow, DataPrep
  • Scalable Compute Engines: GKE, AKS, EC2, DataProc
  • Process Orchestration: Apache Airflow / Composer, Bat
  • Platform Deployment & Scaling: Terraform, custom tools

  • Support for all Hadoop distributions: Cloudera, Hortonworks, MapR
  • Hadoop tools: HDFS, Hive, Pig, Spark, Flink
  • NoSQL Databases: Cassandra, MongoDB, HBase, Phoenix
  • Process Automation: Oozie, Apache Airflow

  • Power BI
  • Tableau
  • Data Studio
  • D3.js

  • Python: numpy, pandas, matplotlib, scikit-learn, scipy, spark, pyspark & more
  • Scala, Java

Get in touch with us

Contact us to see how we can help you.
We’ll get back to you within 4 hours on working days (Mon – Fri, 9am – 5pm).

Dominik Radwański

Service Delivery Partner


Grochowska 306/308
03-840 Warsaw


    The Controller of your personal data is DS Stream sp. z o.o. with its registered office in Warsaw (03-840), at ul. Grochowska 306/308. Your personal data will be processed in order to answer the question and archive the form. More information about the processing of your personal data can be found in the Privacy Policy.