
Job Title


Data Quality Engineer - Databricks & PySpark


Company : Talentgigs


Location : New Delhi, Delhi


Created : 2026-03-26


Job Type : Full Time


Job Description

We are seeking a detail-oriented Data Quality Engineer to ensure the integrity, accuracy, and reliability of data powering our digital twin and AI platforms. You will design and implement data quality frameworks, build automated validation pipelines, and establish quality metrics that enable trusted, simulation-ready data products. This role is critical to ensuring that operational decisions and ML models are built on a foundation of high-quality, governed data.

Our core data quality stack includes:

Data Quality Frameworks
• Delta Live Tables expectations for declarative quality enforcement
• Great Expectations for comprehensive data validation
• Databricks data profiling and quality monitoring

Platform & Tools
• Databricks SQL and PySpark for quality checks at scale
• Unity Catalog for lineage tracking and governance compliance
• Python for custom validation logic and anomaly detection

Observability
• Quality metrics dashboards and alerting
• Data profiling and statistical analysis
• Anomaly detection and drift monitoring

Key Responsibilities
• Design and implement data quality frameworks using Delta Live Tables expectations and Great Expectations
• Build automated data validation pipelines that enforce quality standards at ingestion and transformation stages
• Develop data profiling processes to understand data distributions, patterns, and anomalies
• Define and track data quality metrics (completeness, accuracy, consistency, timeliness, validity)
• Implement anomaly detection mechanisms to identify data drift and quality degradation
• Create quality dashboards and alerting systems for proactive issue identification
• Collaborate with data engineers to embed quality checks into ETL/ELT pipelines
• Partner with data architects to establish data quality standards and governance policies
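To illustrate the kind of quality metrics the role tracks (completeness, validity, etc.), here is a minimal plain-Python sketch. The record schema, field names, and the plausible-temperature range are illustrative assumptions, not part of any actual platform; in practice these checks would run at scale via PySpark or Delta Live Tables expectations.

```python
# Sketch of two row-level quality metrics: completeness and validity.
# Sample records and thresholds below are hypothetical.

def completeness(records, field):
    """Fraction of records where `field` is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def validity(records, field, predicate):
    """Fraction of non-null values of `field` that satisfy `predicate`."""
    values = [r[field] for r in records if r.get(field) is not None]
    if not values:
        return 0.0
    return sum(1 for v in values if predicate(v)) / len(values)

records = [
    {"sensor_id": "a1", "temp_c": 21.5},
    {"sensor_id": "a2", "temp_c": None},   # missing reading
    {"sensor_id": "a3", "temp_c": 250.0},  # outside plausible range
]

print(completeness(records, "temp_c"))  # 2 of 3 records filled
print(validity(records, "temp_c", lambda t: -40 <= t <= 85))  # 1 of 2 values valid
```

Metrics like these would typically be computed per batch and written to a dashboard, with alerts firing when a metric drops below an agreed threshold.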
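The drift-monitoring responsibility above can be sketched with a simple statistical check: score a new batch's mean against a baseline distribution and flag large deviations. The function name `drift_score`, the sample values, and the alert threshold are all hypothetical; production drift detection would usually use richer tests over profiled Databricks tables.

```python
import statistics

def drift_score(baseline, batch):
    """Z-score of the batch mean against the baseline mean and stdev.

    A large absolute score suggests the batch's distribution has
    drifted from the baseline; the cutoff is a tuning choice.
    """
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    if sigma == 0:
        return 0.0
    return (statistics.mean(batch) - mu) / sigma

baseline = [10.0, 10.5, 9.8, 10.2, 10.1, 9.9, 10.3, 10.0]
stable = [10.1, 10.0, 9.9]    # close to baseline
drifted = [13.0, 13.4, 12.8]  # clearly shifted upward

print(drift_score(baseline, stable))   # small magnitude: no alert
print(drift_score(baseline, drifted))  # large magnitude: raise an alert
```

A pipeline would run a check like this after each ingestion batch and page the on-call engineer when the score exceeds the chosen threshold, which is the "proactive issue identification" pattern the posting describes.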