About Delta Tech Hub:
Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability and customer experience. Powered by our employees around the world, Delta has for a decade led the airline industry in operational excellence while maintaining our reputation for award-winning customer service. With our mission of connecting the people and cultures of the globe, Delta strives to foster understanding across a diverse world and serve as a force for social good. Delta has fast emerged as a customer-oriented, innovation-led, technology-driven business. The Delta Technology Hub will contribute directly to these objectives. It will sustain our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions. It supports various teams and functions across Delta and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.
Role Purpose:
Design, build, and productionize AI solutions—especially LLM and generative AI workloads—on top of our Enterprise Data Platform (EDP), built on Databricks, Azure, OpenAI, and AWS. This role emphasizes the operationalization of AI solutions, ensuring AI products are reliable, scalable, secure, and cost-effective in production. You will partner with data engineering, data science, and product teams to turn ideas into robust, continuously running services.
KEY RESPONSIBILITIES:
- Design and develop end-to-end AI solutions (from data preparation and feature/embedding engineering through to serving) on the EDP.
- Productionize AI and LLM applications: harden prototypes into resilient, observable services with clear SLAs, SLOs, and runbooks.
- Implement, optimize, and operate data / feature / embedding pipelines that support AI products, working closely with data engineers and data scientists.
- Build and maintain APIs, batch jobs, and workflows that expose AI capabilities safely to internal users and customer-facing applications.
- Establish and own observability for AI systems (logging, metrics, tracing, data/quality monitoring, safety and cost monitoring).
- Contribute to and improve MLOps/LLMOps practices: deployment pipelines, versioning, canary/blue-green rollouts, and rollback strategies.
- Ensure security, governance, and responsible AI practices are built into production solutions (access control, PII handling, evaluation, policy enforcement).
- Collaborate with product owners to refine requirements, shape delivery plans, and communicate risks and tradeoffs.
- Mentor and support less experienced engineers, contributing to standards, patterns, and best practices for AI engineering on the EDP.
WHAT YOU NEED TO SUCCEED (MINIMUM QUALIFICATIONS):
- 4–7+ years in software, data, or ML engineering, with at least 2+ years focused on ML- or LLM-based solutions.
- 2+ years of experience with Databricks.
- Strong SQL skills.
- Strong Python skills for data and AI (e.g. pandas, PySpark, LLM/AI frameworks, API development).
- Hands-on experience with modern data/AI platforms similar to our EDP (e.g. lakehouse, cloud-native data and ML stacks).
- Proven experience building LLM applications (chat/completions, embeddings, RAG, function/tool calling) and integrating them into production systems.
- Solid understanding of data modeling, data quality, and performance considerations for AI workloads (latency, throughput, cost).
- Experience with MLOps / LLMOps tooling and practices (e.g. model registries, experiment tracking, evaluation frameworks, CI/CD).
- Strong grasp of software engineering fundamentals: Git, code review, testing strategies (unit/integration), and documentation.
- Experience working closely with data engineers, data scientists, and product teams to deliver production solutions.
Behavioral Competencies:
- Ability to produce high-quality results and work in a collaborative environment, embracing diverse perspectives and taking a solution-oriented approach.
- Ability to adapt communication clearly and concisely based on team dynamics and express thoughts and ideas effectively.
- Ability to engage effectively with peers and stakeholders to build trust and reliable working relationships.
- Ability to understand business processes, implement innovative solutions, and guide junior engineers on continuous improvement while staying current on technology and trends.
- Curiosity to understand customer and business expectations while adding value through technical solutions.
WHAT WILL GIVE YOU A COMPETITIVE EDGE (PREFERRED QUALIFICATIONS):
- Experience with OpenAI, Azure and/or AWS within an enterprise data platform context.
- Knowledge of prompt engineering techniques and evaluation methods to improve reliability and reduce hallucinations.
- Background in statistics, NLP, information retrieval, or related fields.