About Delta Tech Hub:
Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability and customer experience. Powered by our employees around the world, Delta has for a decade led the airline industry in operational excellence while maintaining our reputation for award-winning customer service. With our mission of connecting the people and cultures of the globe, Delta strives to foster understanding across a diverse world and serve as a force for social good. Delta has fast emerged as a customer-oriented, innovation-led, technology-driven business. The Delta Technology Hub will contribute directly to these objectives. It will sustain our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions. It supports various teams and functions across Delta and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.
The Senior Data Engineer designs, builds, and maintains the data infrastructure that powers Delta's Health Intelligence Platform. This role owns the pipelines that transform raw healthcare data from vendors (UMR, UHC, CVS, Spring Health) into clean, trusted, AI-ready data assets. You'll work with modern tools (dbt, Redshift Serverless, Amazon SageMaker) and collaborate with data quality, visualization, and data science teams to deliver enterprise-scale healthcare analytics.
Key Responsibilities:
- Build and optimize dbt models across staging, intermediate, domain, and marts layers, following healthcare data best practices (a minimal sketch follows this list).
- Design and implement data pipelines for healthcare vendor data ingestion, transformation, and quality validation.
- Develop automated monitoring and alerting to catch data issues before they impact downstream analytics.
- Create ML feature engineering pipelines that prepare data for predictive models in SageMaker and Bedrock.
- Optimize Redshift performance through query tuning, table design, and workload management.
- Build CI/CD pipelines for reliable, automated deployments across environments.
- Document data models and pipelines to enable team knowledge sharing and platform maintainability.
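As a rough illustration of the layered dbt pattern referenced in the first responsibility above, a staging model for vendor claims data might look like the sketch below. All source, model, and column names here are hypothetical placeholders, not Delta's actual schema.

    -- models/staging/stg_vendor__claims.sql
    -- Hypothetical staging model: light renaming, casting, and
    -- standardization of raw vendor claims so the intermediate,
    -- domain, and marts layers can build on a consistent shape.
    with source as (
        select * from {{ source('vendor_raw', 'claims') }}
    ),

    renamed as (
        select
            claim_id,
            member_id,
            cast(service_date as date)            as service_date,
            lower(trim(claim_status))             as claim_status,
            cast(billed_amount as numeric(12, 2)) as billed_amount
        from source
    )

    select * from renamed

Staging models like this typically handle only cleanup and typing; business logic lives in the intermediate and domain layers, and marts expose analysis-ready tables.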
WHAT YOU NEED TO SUCCEED (MINIMUM QUALIFICATIONS):
Qualifications:
- Bachelor’s degree in a relevant field.
- 4+ years of relevant work experience, including 3+ years of data engineering experience with SQL and Python (mandatory).
- Hands-on experience with dbt (mandatory) or similar transformation frameworks.
- Experience with cloud data platforms on AWS, GCP, or Azure (e.g., Glue, Lambda, Redshift, Snowflake, or BigQuery).
- Strong understanding of data modeling, ETL/ELT patterns, and data quality.
- Ability to work independently and collaborate across distributed teams.
Behavioral Competencies:
- Ability to produce high-quality results in a collaborative environment, embracing diverse perspectives and taking a solution-oriented approach.
- Ability to adapt communication to team dynamics and express thoughts and ideas clearly, concisely, and effectively.
- Ability to engage effectively with peers and stakeholders to build trust and reliable working relationships.
- Ability to understand business processes, implement innovative solutions, and guide junior engineers on continuous improvement while staying current on technology and trends.
- Inquisitiveness about customer and business expectations, with a focus on adding value through technical solutions.
WHAT WILL GIVE YOU A COMPETITIVE EDGE (PREFERRED QUALIFICATIONS):
- Healthcare data experience (claims, eligibility, pharmacy, clinical).
- AWS experience (Redshift, SageMaker, Lambda, Step Functions).
- ML feature engineering or model deployment experience.
- CI/CD tools (GitHub Actions, Jenkins, or similar).
- Workflow orchestration tools (Airflow, Dagster, or similar).
- Experience with AWS, GCP, or Azure (AWS preferred).
- Databricks on AWS (nice to have).
- PySpark.