At Delta Air Lines, connection is at the heart of everything we do and guides our every action. We strive to welcome and care for all our customers during their travels with us and aim to deliver an elevated experience.
Delta is focused on sustaining a strong IT operation, growing our capabilities, and driving optimization across each of our tech hubs to elevate the travel experience for our customers and empower our 90,000 Delta people.
We’re committed to fostering innovation, and we’re excited to invite you to be part of our journey as we shape the future of technology at the world’s best airline!
Delta’s Revenue Technology, Loyalty, and Data Analytics organization unites several groups to deliver strategic business goals and technical solutions that create opportunities to innovate on behalf of our customers. The General Manager, Data Operations oversees end-to-end data operations, ensuring data availability, quality, and security. This role is crucial for optimizing Data & AI platforms, automating pipelines, and facilitating the seamless flow of data between systems and users. The General Manager will also have experience with the full DataOps lifecycle, including ingestion and ETL processes into databases, data lakes, and data warehouses.
Responsibilities include, but are not limited to:
- Manage end-to-end operations of our cloud-based Data & AI platforms, including the Enterprise Data Warehouse, Data Lakes, AI/Machine Learning, Data Pipelines, ETL, Business Intelligence, and Streaming Platforms.
- Develop and manage high-performing teams; the ideal candidate is an innovative problem-solver with a passion for team leadership.
- Manage AWS-based cloud infrastructure, data engineering, and data operations.
- Lead and motivate teams to achieve objectives while managing multiple projects simultaneously.
- Develop and implement comprehensive strategies that drive data operations success.
- Evaluate software requests to determine where existing capabilities intersect with, or can be used in conjunction with, new products.
- Estimate the technical debt incurred by creating new features or systems; interpret specifications, troubleshoot issues, and define needed software solutions.
- Help us stay ahead of the curve by working closely with data engineers, stream processing specialists, API developers, our Architecture team, and analysts to design systems that scale elastically.
- Lead the development of best practices and design patterns for data engineering, operations, and analytics, staying at the forefront of industry trends and continuously refining our processes to ensure the highest-quality, innovative, and efficient solutions.
- Build and operate software across our entire data platform, including event-driven data processing, storage, and serving through scalable, highly available APIs, using cutting-edge technologies.
- Identify and remediate security and resiliency risks for Data & AI platforms.
- Find new ways to make our data platform more scalable, resilient, and reliable, then work across the team to put those ideas into action.
- Work closely with data analysts and business stakeholders to make data easily accessible and understandable.
- Ensure data quality by implementing reusable data quality frameworks.
- Work closely with various other data engineering teams to roll out new capabilities.
- Build processes and tools to maintain Machine Learning pipelines in production.
- Develop and enforce data engineering, security, and data quality standards through automation.
- Own cloud costs and continuously improve efficiency.
- Keep data accessible and secure. Partner with security and compliance to design systems using modern technologies that make system and user access as effortless as possible while protecting content, user privacy, and data integrity.
- Understand and respond to engineering needs with a portfolio of fit-for-purpose database technologies, without sacrificing operability, scale, reliability, or resource fungibility across the organization. Manage the lifecycle of both legacy and new technologies.
- Build documentation, automation, repeatable processes, operational tooling, and a robust suite of test capabilities to validate deployments, nonfunctional requirements, resiliency, and scale.