Apache Airflow is an open-source project for scheduling and managing workflows, written in Python.
Kaxil Naik, director of Airflow engineering at Astronomer and one of the core committers of Airflow, told SD Times: "It's used to automate your daily jobs or daily tasks, and tasks can be as simple as running a Python script or as complicated as bringing in all the data from 500 different data warehouses and manipulating it."
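At the simple end of that spectrum, a workflow is just a DAG (directed acyclic graph) with a single Python task. A minimal sketch using Airflow's TaskFlow API might look like this (the DAG and task names are illustrative, not from the article):

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(
    schedule="@daily",                  # run once per day
    start_date=datetime(2024, 1, 1),
    catchup=False,                      # don't backfill past runs
)
def hello_pipeline():
    @task
    def say_hello():
        # The "task as simple as running a Python script" case:
        # any Python callable becomes a schedulable unit of work.
        print("hello from Airflow")

    say_hello()


hello_pipeline()
```

Placed in Airflow's DAGs folder, this file is picked up by the scheduler and the task runs once a day; more complex pipelines chain many such tasks together with dependencies.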
It was created at Airbnb in 2014 and is set to celebrate its 10-year anniversary later this year. It joined the Apache Software Foundation in March 2016 at the incubation stage and was made a top-level project in 2019.
Airflow was initially designed for just ETL use cases, but over time it has evolved to add features that make it useful for all aspects of data engineering.
"It has continued to be the leader in this space, because we've maintained a good balance between innovation and stability. Because of these almost 10 years of Airflow in the same space, we've added so many features that allow Airflow to be very reliable and stable," he said.
The most recent release, 2.9, came out earlier this week and added new features like the ability to combine dataset and time-based schedules, custom names for Dynamic Task Mapping, and the ability to group task logs.
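The combined dataset and time-based scheduling in 2.9 is exposed through `DatasetOrTimeSchedule`, which triggers a DAG either when an upstream dataset is updated or on a cron timetable, whichever comes first. A hedged sketch (the dataset URI and DAG name are placeholders):

```python
from datetime import datetime

from airflow.datasets import Dataset
from airflow.decorators import dag, task
from airflow.timetables.datasets import DatasetOrTimeSchedule
from airflow.timetables.trigger import CronTriggerTimetable


@dag(
    # Run when the upstream dataset changes OR daily at midnight UTC,
    # whichever happens first - new in Airflow 2.9.
    schedule=DatasetOrTimeSchedule(
        timetable=CronTriggerTimetable("0 0 * * *", timezone="UTC"),
        datasets=[Dataset("s3://example-bucket/orders.parquet")],
    ),
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def consume_orders():
    @task
    def process():
        print("processing orders dataset")

    process()


consume_orders()
```

Before 2.9, a DAG could be driven by datasets or by a timetable, but not both; this schedule type covers pipelines that should react to fresh data yet still run on a fallback cadence.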
The project can be found on GitHub.