Job Title : Senior Data Engineer


Company : People Tech Group Inc


Location : Bengaluru, Karnataka


Created : 2025-12-15


Job Type : Full Time


Job Description

People Tech Group is a leading provider of Enterprise Solutions, Digital Transformation, Data Intelligence, and Modern Operations services. Founded in 2006 in Redmond, Washington, USA, we have expanded our presence across India (Hyderabad, Bangalore, Pune, and Chennai), growing to a 3000+ strong workforce. We operate across four countries: the US, Canada, India, and Costa Rica.

A recent milestone in our journey is our acquisition by Quest Global, one of the world's largest engineering solutions providers, headquartered in Singapore with 20,000+ employees and 70+ Global Delivery Centers. Going forward, we are proud to be part of the Quest Global family.

Hiring: Senior AWS Data Engineer (8+ Years)

Role Summary

We are seeking an experienced Senior AWS Data Engineer to design, develop, and optimize enterprise-grade data platforms. The ideal candidate will have strong expertise in AWS analytics services, PySpark, ETL/ELT frameworks, and modern data architectures, with experience working in secure and regulated environments, including ITAR.

Key Responsibilities

- Design and optimize scalable Data Lakes and Data Warehouses on AWS
- Build and maintain ETL/ELT pipelines using PySpark, Glue, Lambda, and APIs
- Implement Medallion Architecture (Bronze–Silver–Gold)
- Develop ingestion frameworks for API, JSON, Parquet, and Iceberg sources
- Build analytics layers using Athena, Redshift, S3, and the Glue Catalog
- Implement IAM policies, encryption standards, governance, and ITAR compliance
- Provision infrastructure using Terraform (IaC)
- Enable CI/CD automation using Git, GitHub Actions, and CodePipeline
- Perform performance tuning, cost optimization, and pipeline scalability improvements
- Troubleshoot ingestion, transformation, and consumption layer issues

Required Skills & Experience

- 8+ years of experience in AWS Data Engineering
- Strong hands-on experience with S3, Glue, Athena, Redshift, and IAM
- Expertise in PySpark, ETL/ELT, and Data Lake/Warehouse design
- Experience with Medallion Architecture
- Good knowledge of API, JSON, Parquet, and Iceberg formats
- Hands-on with Terraform, CI/CD pipelines, and DevOps practices
- Understanding of ITAR compliance and secure data handling

Good to Have

- Experience with Apache Airflow / MWAA
- Exposure to Delta Lake / Hudi / Iceberg
- Familiarity with the Glue Data Catalog
- Experience working in regulated environments
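For candidates unfamiliar with the Medallion Architecture named in the responsibilities, a minimal conceptual sketch in plain Python of the Bronze–Silver–Gold flow (the actual pipelines at this role would use PySpark, Glue, and S3; all record shapes and function names here are illustrative assumptions, not part of the posting):

```python
# Bronze layer: raw records kept as ingested (e.g., from an API), untyped and unvalidated.
bronze = [
    {"order_id": "1", "amount": "120.50", "region": "US"},
    {"order_id": "2", "amount": "bad",    "region": "US"},  # malformed on purpose
    {"order_id": "3", "amount": "80.00",  "region": "IN"},
]

def to_silver(records):
    """Silver layer: enforce types and drop rows that fail validation."""
    silver = []
    for r in records:
        try:
            silver.append({
                "order_id": int(r["order_id"]),
                "amount": float(r["amount"]),
                "region": r["region"],
            })
        except (ValueError, KeyError):
            continue  # in a real pipeline, quarantine the bad record instead
    return silver

def to_gold(records):
    """Gold layer: business-ready aggregate (revenue per region)."""
    gold = {}
    for r in records:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)  # 2 valid rows survive
gold = to_gold(silver)      # {"US": 120.5, "IN": 80.0}
```

The same layering applies at scale: Bronze lands raw files in S3, Silver applies schema enforcement and deduplication in PySpark, and Gold exposes curated tables to Athena or Redshift.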