
Job Title : Data Engineer ETL


Company : Vistec Partners


Location : Palakkad, Kerala


Created : 2026-03-19


Job Type : Full Time


Job Description

Position: Data Engineer
Experience: 5–6 Years
Location: Work From Home (WFH)
Office Requirement: Once a week – Noida
Time Overlap: Mandatory overlap with US EST

Role Summary

We are seeking an experienced Data Engineer with 5–6 years of industry experience, including a minimum of 2 years of hands-on expertise in Databricks and Azure Data Factory (ADF). The role involves designing, building, and optimizing scalable data pipelines and analytics solutions on Azure. Collaboration with US-based stakeholders requires daily overlap with the EST time zone.

Key Responsibilities

- Design, develop, and maintain ETL/ELT pipelines using Azure Data Factory and Databricks.
- Build scalable batch and streaming data processing workflows.
- Develop Databricks notebooks, jobs, and Delta Lake tables.
- Perform performance tuning and cost optimization of data workloads.
- Implement robust data quality checks, validations, and monitoring.
- Develop Python-based data transformation and automation scripts.
- Write and optimize complex SQL queries for analytics and reporting.
- Collaborate with Analytics, BI, and Product teams.
- Document technical designs, workflows, and operational procedures.

Required Skills & Experience

- Databricks: Minimum 2 years of hands-on experience with Spark, notebooks, workflows, and Delta Lake.
- Azure Data Factory (ADF): Minimum 2 years of experience building production-grade pipelines.
- Python: Strong scripting and transformation capabilities.
- SQL: Advanced querying, joins, window functions, and optimization.
- Data Modeling: Knowledge of data warehousing concepts (star/snowflake schemas).
- Azure Cloud: Familiarity with Azure data services and architecture.
- Version Control: Experience with Git / DevOps workflows.
- Debugging: Strong troubleshooting and problem-solving skills.
- Communication: Ability to collaborate effectively with US-based stakeholders.

Preferred / Good-to-Have

- Experience with streaming technologies (Kafka / Event Hub).
- CI/CD pipelines using Azure DevOps.
- Expertise in Data Lake / Delta Lake architecture.
- Exposure to BI tools such as Power BI.
- Experience in performance and cost optimization initiatives.

Work Conditions

- Primarily Work From Home (WFH).
- Mandatory once-a-week office visit in Noida.
- Mandatory daily overlap with the US EST time zone.
- Flexibility to work evening hours as required.

#DataEngineer #Databricks #AzureDataFactory #Python #SQL #DataEngineering #Hiring #NoidaJobs #WFH #Azure #ETL #BigData