Role: GCP Data Engineer
Job Location: Bangalore / Chennai / Gurgaon
Experience: 6 to 12 Years
Notice Period: Immediate to Official 30 Days

Role Summary
We are seeking a skilled GCP Data Engineer with strong hands-on experience in SQL, PySpark, and Google Cloud Platform. The role involves designing, developing, and optimizing scalable data pipelines while ensuring high performance, reliability, and CI/CD readiness.

Mandatory Skills
- Advanced SQL (complex joins, window functions, performance optimization)
- Google Cloud Platform (GCP)
- BigQuery (partitioning, clustering, and cost optimization)
- Python
- Dataproc (PySpark / Spark)
- Google Cloud Storage (GCS)
- Cloud Composer (Airflow)

Additional GCP Services
- Pub/Sub
- Cloud Functions / Cloud Run
- Dataflow (Apache Beam)

DevOps & Tools
- Git / GitHub
- CI/CD pipelines
- Linux / Shell scripting

Key Responsibilities
- Develop and maintain ETL/ELT pipelines using Python and PySpark
- Design, build, and optimize Spark jobs on Dataproc
- Write and optimize BigQuery SQL queries with a focus on performance and cost efficiency
- Orchestrate and schedule workflows using Cloud Composer (Airflow)
- Implement data validation, monitoring, and error-handling mechanisms
- Manage incremental and batch data processing pipelines
- Support Hadoop-to-GCP migration initiatives
- Perform metadata management, pipeline optimization, and performance tuning
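For illustration only, a minimal sketch of the kind of batch pipeline this role owns: a PySpark job suitable for Dataproc that reads raw files from GCS, applies a simple transformation, and appends the result to BigQuery. All project, bucket, dataset, and table names below are placeholders, and the BigQuery write assumes the spark-bigquery connector is available on the cluster (it ships with recent Dataproc images).

# Placeholder names throughout; not a definitive implementation.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read one partition date of raw CSV input from Cloud Storage.
orders = (
    spark.read
    .option("header", "true")
    .csv("gs://example-raw-bucket/orders/dt=2024-01-01/")
)

# Basic cleansing: cast types, drop rows missing a key, stamp the load time.
cleaned = (
    orders
    .withColumn("order_amount", F.col("order_amount").cast("double"))
    .filter(F.col("order_id").isNotNull())
    .withColumn("load_ts", F.current_timestamp())
)

# Append to a BigQuery table via the spark-bigquery connector,
# staging through a temporary GCS bucket.
(
    cleaned.write
    .format("bigquery")
    .option("table", "example_project.analytics.orders")
    .option("temporaryGcsBucket", "example-staging-bucket")
    .mode("append")
    .save()
)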
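A similar sketch of the advanced-SQL side of the role: a window-function query run through the BigQuery Python client, filtered on what is assumed to be the table's partitioning column so the scan stays cheap. Table and column names are hypothetical.

# Hypothetical table and columns; assumes analytics.orders is date-partitioned on order_date.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
SELECT
  customer_id,
  order_id,
  order_amount,
  RANK() OVER (PARTITION BY customer_id ORDER BY order_amount DESC) AS amount_rank
FROM `example-project.analytics.orders`
WHERE order_date BETWEEN '2024-01-01' AND '2024-01-31'  -- prunes partitions
QUALIFY amount_rank <= 3                                 -- top 3 orders per customer
"""

# Run the query and print each customer's largest orders for the month.
for row in client.query(sql).result():
    print(row.customer_id, row.order_id, row.order_amount, row.amount_rank)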
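Finally, a sketch of how such a job might be orchestrated from Cloud Composer: a daily DAG that submits the PySpark job above to an existing Dataproc cluster using the Google provider's DataprocSubmitJobOperator. Project, region, cluster name, and GCS paths are placeholders, and the schedule argument assumes Airflow 2.4 or later.

# Placeholder identifiers; a sketch, not a production DAG.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

PROJECT_ID = "example-project"      # placeholder
REGION = "us-central1"              # placeholder
CLUSTER_NAME = "example-dataproc"   # placeholder, assumes an existing cluster

# Job spec pointing at the PySpark script staged in GCS (path is a placeholder).
PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://example-code-bucket/jobs/orders_daily_load.py"},
}

with DAG(
    dag_id="orders_daily_load",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # one run per day; 'schedule' requires Airflow 2.4+
    catchup=False,
) as dag:
    # Submit the batch job to the Dataproc cluster and wait for completion.
    load_orders = DataprocSubmitJobOperator(
        task_id="load_orders",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )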