Job Title : Cloud Engineer-GCP


Company : EXL


Location : Amravati, Maharashtra


Created : 2025-07-23


Job Type : Full Time


Job Description

Key Requirements

Technical Skills
- Expert in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, and Cloud Functions.
- GCP Professional Data Engineer certification is highly desirable.
- Advanced knowledge of SQL for complex data transformation and query optimization.
- Proven experience in Python for scalable data pipeline development and orchestration, following best practices.
- Experience implementing Terraform for Infrastructure as Code (IaC) to automate GCP resource management.
- Knowledge of CI/CD pipelines and automated deployment practices.
- Experience with containerization technologies (e.g., Docker, Kubernetes).
- Experience building and optimizing batch and streaming data pipelines.
- Understanding of data governance principles, GCP security (IAM, VPC), and compliance requirements.

Soft Skills
- Demonstrates a growth mindset by actively seeking to learn from peers and stakeholders, fostering a culture of open communication and shared knowledge.
- Works effectively across teams, including Data Science, Engineering, and Analytics, to understand their needs and deliver impactful data solutions.
- Actively participates in design discussions, brainstorming sessions, and cross-functional projects, always striving for continuous improvement and innovation.
- Builds strong relationships across the organization, using empathy and active listening to ensure alignment on goals and deliverables.
- Approaches challenges with a growth mindset, viewing obstacles as opportunities to innovate and improve processes.
- Applies a structured and analytical approach to solving complex problems, balancing immediate needs with long-term scalability and efficiency.
- Demonstrates resilience under pressure, maintaining a positive and solution-focused attitude when faced with tight deadlines or ambiguity.
- Actively seeks feedback and lessons learned from past projects to continuously refine problem-solving strategies and improve outcomes.
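As context for the batch-pipeline and data-quality requirements above, a minimal Python sketch of one pipeline stage (validate, then transform, routing bad rows to a dead-letter list) is shown below. All names (`validate_row`, `transform_row`, `run_batch`, the `order_id`/`amount` fields) are illustrative assumptions, not part of any GCP SDK; in this role the equivalent logic would typically run inside Dataflow or Cloud Composer tasks.

```python
from typing import Iterable

# Illustrative schema: required fields for a hypothetical orders feed.
REQUIRED_FIELDS = ("order_id", "amount")

def validate_row(row: dict) -> bool:
    """Basic data-quality gate: required fields present, amount non-negative."""
    return all(f in row for f in REQUIRED_FIELDS) and row["amount"] >= 0

def transform_row(row: dict) -> dict:
    """Example transformation: normalise a float amount to integer cents."""
    return {"order_id": row["order_id"],
            "amount_cents": int(round(row["amount"] * 100))}

def run_batch(rows: Iterable[dict]) -> tuple[list[dict], list[dict]]:
    """Split input into transformed good rows and rejected (dead-letter) rows."""
    good, rejected = [], []
    for row in rows:
        if validate_row(row):
            good.append(transform_row(row))
        else:
            rejected.append(row)
    return good, rejected
```

For example, `run_batch([{"order_id": 1, "amount": 9.99}, {"amount": -1}])` keeps the first row (as `amount_cents=999`) and rejects the second for the dead-letter sink.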
- Shares expertise generously, guiding team members in adopting best practices and helping them overcome technical challenges.
- Leads by example, demonstrating how to approach complex problems pragmatically while promoting curiosity and a willingness to explore new tools and technologies.
- Encourages professional development within the team, supporting individuals in achieving their career goals and obtaining certifications, especially within the Google Cloud ecosystem.

Main Duties and Responsibilities
- Design, develop, and maintain scalable data pipelines using modern data engineering tools and technologies on our GCP stack.
- Build and optimize our lakehouse on Google Cloud Platform (GCP).
- Implement data ingestion, transformation, and loading processes for various data sources (e.g., databases, APIs, cloud storage).
- Ensure data quality, consistency, and security throughout the data pipeline.
- Leverage GCP services (e.g., Dataflow, Dataproc, BigQuery, Cloud Storage) to build and maintain cloud-native data solutions.
- Implement Infrastructure as Code (IaC) principles using Terraform to automate provisioning and configuration.
- Manage and optimize cloud resources to ensure cost-efficiency and performance.
- Design and implement efficient data models following a star schema approach to support analytical and operational workloads.
- Collaborate with data analysts to develop advanced analytics solutions.
- Conduct data quality analysis to drive better data management of outputs in our Curated Layer.
- Mentor junior data engineers and provide technical guidance.
- Contribute to the development of data engineering best practices and standards.
- Collaborate with cross-functional teams to deliver complex data projects.
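To illustrate the star schema modelling mentioned in the duties above, here is a hedged, self-contained sketch using SQLite from the Python standard library: one fact table joined to one dimension, with a typical analytical aggregation. The table and column names (`fact_sales`, `dim_product`, `category`, `amount`) are invented for the example; in this role the equivalent schema would live in BigQuery.

```python
import sqlite3

# In-memory database standing in for an analytical warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension table: descriptive attributes keyed by a surrogate id.
    CREATE TABLE dim_product (
        product_id INTEGER PRIMARY KEY,
        category   TEXT NOT NULL
    );
    -- Fact table: measures plus foreign keys into the dimensions.
    CREATE TABLE fact_sales (
        sale_id    INTEGER PRIMARY KEY,
        product_id INTEGER NOT NULL REFERENCES dim_product(product_id),
        amount     REAL NOT NULL
    );
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales  VALUES (10, 1, 12.5), (11, 1, 7.5), (12, 2, 30.0);
""")

# Typical star-schema query: aggregate fact measures by a dimension attribute.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount) AS total_amount
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('books', 20.0), ('games', 30.0)]
```

The design choice a star schema encodes is that measures stay in one narrow fact table while descriptive attributes are joined in from small dimensions, which keeps analytical queries simple and scan-efficient.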