
Job Title : Senior Data Engineer


Company : Asenium Consulting


Location : Bengaluru, Karnataka


Created : 2025-07-23


Job Type : Full Time


Job Description

For one of my customers, we are looking for a Senior Data Engineer.

Location : India (100% remote)
Duration : 6+ months (renewable)

Description:
We are looking for a Senior Data Engineer with strong experience in SQL, Talend, and cloud platforms such as Google Cloud (BigQuery) and Microsoft Azure. You will be responsible for designing and managing ETL pipelines, optimizing SQL performance, and building cloud-based data solutions.

Deliverables:
- Develop and optimize complex SQL queries, stored procedures, and indexing strategies for large datasets.
- Design and maintain ETL/ELT data pipelines using Talend, integrating data from multiple sources.
- Architect and optimize data storage solutions on GCP BigQuery and Azure SQL / Synapse Analytics.
- Implement best practices for data governance, security, and compliance in cloud environments.
- Work closely with data analysts, scientists, and business teams to deliver scalable solutions.
- Monitor, troubleshoot, and improve data pipeline performance and reliability.
- Automate data workflows and scheduling using orchestration tools (e.g., Apache Airflow, Azure Data Factory).
- Lead code reviews, mentoring, and best practices for junior engineers.

Required Qualifications:
- 7+ years of hands-on experience in SQL development, database performance tuning, and ETL processes.
- Expert-level proficiency in SQL, including query optimization, stored procedures, indexing, and partitioning.
- Strong experience with Talend for ETL/ELT development.
- Hands-on experience with GCP BigQuery and Azure SQL / Synapse Analytics.
- Solid understanding of data modeling (relational & dimensional) and cloud-based data architectures.
- Proficiency in Python or Shell scripting for automation and workflow management.
- Familiarity with CI/CD, Git, and DevOps best practices for data engineering.

Nice to Have:
- Experience with Apache Airflow or Azure Data Factory for workflow automation.
- Knowledge of real-time data streaming (Kafka, Pub/Sub, Event Hubs).
- Cloud certifications in GCP or Azure (e.g., Google Professional Data Engineer, Azure Data Engineer Associate).