Job Title


Data Ingestion Engineer


Company : Insight Global


Location : Ranchi, Jharkhand


Created : 2025-06-18


Job Type : Full Time


Job Description

Pay: 25-35 Lakh, based on relevant experience against the Job Description.

Required Skills & Experience

- 7+ years of experience with Azure Data Services:
  - Strong experience with Azure Data Factory (ADF) for orchestrating data pipelines.
  - Hands-on experience with ADLS Gen 2, Databricks, and various data formats (e.g., Parquet, JSON, CSV).
  - Solid understanding of Azure SQL Database, Azure Logic Apps, Azure Function Apps, and Azure Container Apps.
- 5+ years of experience in Python or PySpark for data ingestion, automation, and transformation tasks.
- Experience with file-based data ingestion, API-based data ingestion, and integrating data from various third-party systems.
- Experience working with Azure DevOps (ADO) and incorporating logic into the CI pipeline.

Nice to Have Skills & Experience

- Experience extracting data from SAP.
- Experience with version control systems such as Git.
- Experience with other Azure services, such as Azure Synapse Analytics and Azure Data Share.
- Familiarity with cloud security best practices and data privacy regulations.

Job Description

A Fortune 100 organization is seeking a Data Ingestion Engineer to work fully remote from India. We are seeking an experienced and highly motivated Data Ingestion Engineer to join our dynamic team. The ideal candidate will have strong hands-on experience with Azure Data Factory (ADF) and Databricks, a deep understanding of relational and non-relational data ingestion techniques, and proficiency in Python programming. You will be responsible for designing and implementing scalable data ingestion solutions that interface with Azure Data Lake Storage Gen 2 (ADLS Gen 2), Databricks, and various other Azure ecosystem services.

The Data Ingestion Engineer will work closely with stakeholders to gather data ingestion requirements, create modularized ingestion solutions, and define best practices to ensure efficient, robust, and scalable data pipelines. This role requires effective communication skills, ownership, and accountability for the delivery of high-quality data solutions.