Job Title: Senior Data Engineer (Airflow Specialist)

Company: Hvantage Technologies Inc USA

Location: Faridabad, Haryana

Created: 2026-04-14

Job Type: Full Time


Job Description

Job title: Senior Data Engineer (Airflow Specialist)
Company: Hvantage Tech Solutions Pvt. Ltd.
Department: Enterprise Applications / Healthcare IT
Location: Remote / Hybrid / Onsite (as applicable)
Employment Type: Full-Time
Experience Required: 5+ Years
Work Mode: Must be comfortable working with global teams and US clients
Methodology: SAFe Agile (Mandatory)

About Hvantage Tech Solutions Pvt. Ltd.

Hvantage Tech Solutions Pvt. Ltd. is a global technology and healthcare IT services provider delivering advanced digital solutions for healthcare payers, providers, and life sciences organizations. Our expertise spans enterprise platforms, healthcare interoperability, AI-enabled analytics, cloud-native architectures, and digital member engagement platforms.

Role Overview

We are seeking a versatile Senior Data Engineer to lead the design, implementation, and scaling of our enterprise data orchestration platform using Apache Airflow. This role requires a "builder" mindset, combining the agility to create robust pipelines from the ground up with the maturity to ensure they are scalable, secure, and integrated into a complex cloud ecosystem. As the primary architect for workflow automation, you will move beyond simple task scheduling to build a resilient, self-healing data infrastructure.

Key Responsibilities

- Orchestration & Pipeline Engineering: Author complex, modular, and idempotent DAGs using Python and the Airflow TaskFlow API (a brief illustrative sketch follows the Qualifications section below).
- Framework Development: Build custom Operators, Hooks, and Sensors to standardize data integrations across the organization.
- High-Compute Integration: Integrate Airflow with heavy-compute environments, including Azure Databricks, Spark clusters, and cloud data warehouses.
- Platform Management: Optimize Airflow executors (Celery or Kubernetes) to manage high-concurrency workloads, and handle backend database tuning (PostgreSQL/MySQL).
- Security & Compliance: Implement enterprise-grade security protocols, including role-based access control (RBAC), secret management (Azure Key Vault/HashiCorp Vault), and OAuth integration.
- DevOps & Reliability: Build and maintain automated CI/CD deployment pipelines for DAGs and infrastructure-as-code using Terraform or Bicep.
- Observability: Implement advanced monitoring and alerting using Prometheus, Grafana, or the ELK stack to track pipeline health and SLA breaches.
- Leadership & Mentorship: Establish data engineering best practices across the team and conduct rigorous code reviews to ensure architectural integrity.

Technical Requirements

- Orchestration: Expert knowledge of Apache Airflow 3.x+ (specifically XComs, Providers, and Dynamic Task Mapping).
- Languages: Advanced proficiency in Python and expert-level SQL.
- Cloud Platforms: Proficiency in Azure (Data Factory, Blob Storage, AKS) or equivalent services in AWS/GCP.
- DevOps: Hands-on experience with CI/CD tools such as GitHub Actions, Azure DevOps, or Jenkins.

Qualifications

- Experience: 5+ years in Data Engineering, with a minimum of 2-3 years focused specifically on scaling Airflow in production environments.
- Education: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field.
- Preferred Skills:
  - Experience migrating legacy schedulers to Airflow.
  - Background in software development (Java or .NET).
  - Knowledge of data quality frameworks such as Great Expectations or dbt.
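For candidates unfamiliar with the TaskFlow API, the following is a minimal, illustrative sketch of the style of DAG described under Key Responsibilities, combining TaskFlow tasks with dynamic task mapping. It assumes Airflow 2.4 or later; the DAG ID, partition names, and transform logic are hypothetical placeholders, not a description of our actual platform.

```python
# Illustrative sketch only: a minimal, idempotent TaskFlow DAG using
# dynamic task mapping. All names below are hypothetical placeholders.
from __future__ import annotations

import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2026, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def member_claims_pipeline():
    @task
    def list_partitions() -> list[str]:
        # A real pipeline might query a catalog or blob store here;
        # hardcoded to keep the sketch self-contained.
        return ["claims/2026-01-01", "claims/2026-01-02"]

    @task
    def transform(partition: str) -> str:
        # Idempotent per-partition work; the return value is passed
        # between tasks via XCom.
        return f"processed:{partition}"

    @task
    def publish(results: list[str]) -> None:
        print(f"publishing {len(results)} partitions")

    # Dynamic task mapping: one transform task instance per partition.
    publish(transform.expand(partition=list_partitions()))


member_claims_pipeline()
```

Each mapped transform instance runs independently, and its result reaches publish via XCom, the same mechanisms called out under Technical Requirements.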
Competencies

- Strong focus on long-term maintenance and architectural integrity.
- Ability to move fast in an agile environment without compromising on code quality.
- Exceptional problem-solving skills for debugging complex data orchestration issues.

Application Process

Interested candidates may apply at: or WhatsApp at +91 97552 99999