Senior Data Engineer: PySpark, AWS & Airflow Lead
The successful candidate will have at least 6 years of experience in Data Engineering, with strong skills in Python, PySpark, SQL, and Airflow, and familiarity with AWS and Terraform. Responsibilities include building and maintaining data pipelines, orchestrating workflows with Airflow, and implementing solutions on AWS services.