Job Title : GCP Data Engineer (BigQuery, Cloud Storage, Dataproc, Airflow)


Company : Tata Consultancy Services


Location : Hyderabad, Telangana


Created : 2026-01-26


Job Type : Full Time


Job Description

TCS Hiring!! GCP Data Engineer (BigQuery, Cloud Storage, Dataproc, Airflow). Please read the job description before applying.

SKILLS: GCP Data Engineer (BigQuery, Cloud Storage, Dataproc, Airflow)

· GCP Services: BigQuery, Cloud Storage, Dataproc, Cloud Composer (managed Airflow) or self-managed Airflow.
· Airflow: Strong experience in DAG creation, operators/hooks, scheduling, backfilling, retry strategies, and CI/CD for DAG deployments (an illustrative DAG sketch follows at the end of this description).
· Programming: Proficiency in Python (PySpark, Airflow DAGs) and SQL (advanced BigQuery SQL).
· Data Modeling: Dimensional modeling (star/snowflake schemas), data vault basics, and schema design for analytics.
· Performance Tuning: BigQuery partitioning/clustering, predicate pushdown, job stats review, Dataproc executor tuning (a partitioning/clustering sketch also follows below).
· Version Control & CI/CD: Git, branching strategies, and pipelines for deploying Airflow DAGs and configuration.
· Operational Excellence: Monitoring with Cloud Logging (formerly Stackdriver), debugging pipeline failures, and root-cause analysis.

The role involves end-to-end ownership of data ingestion, transformation, orchestration, and performance tuning for batch and near real-time workflows.

NOTE: If your skills and profile match and you are interested, please reply to this email with your latest updated CV and the details below:

Name:
Contact Number:
Email ID:
Highest Qualification: (e.g. B.Tech/B.E./M.Tech/MCA/M.Sc./MS/BCA/B.Sc., etc.)
Current Organization Name:
Total IT Experience: 7+ years
Location: TCS Hyderabad
Current CTC:
Expected CTC:
Notice Period: Immediate Joiner
Whether worked with TCS - Y/N:
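
To illustrate the Airflow skills listed above, here is a minimal sketch of a daily DAG with a retry strategy and backfill enabled. All names (DAG id, task id, owner) are hypothetical and not part of this posting; a production pipeline would typically use the GCP operators/hooks from the Airflow Google provider instead of a placeholder Python task.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_gcs(**context):
    # Placeholder callable: a real task might land a batch file in Cloud Storage
    # or trigger a Dataproc/BigQuery job via provider operators.
    print("extracting batch for logical date", context["ds"])


default_args = {
    "owner": "data-engineering",          # hypothetical owner
    "retries": 3,                         # retry strategy: up to 3 retries
    "retry_delay": timedelta(minutes=5),  # wait 5 minutes between attempts
}

with DAG(
    dag_id="example_daily_ingest",        # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",           # daily schedule
    catchup=True,                         # allows backfilling of past runs
    default_args=default_args,
) as dag:
    PythonOperator(
        task_id="extract_to_gcs",
        python_callable=extract_to_gcs,
    )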
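
Similarly, for the BigQuery performance-tuning skills, the sketch below creates a date-partitioned, clustered table via a DDL statement submitted through the google-cloud-bigquery Python client. The dataset, table, and column names are invented for illustration only.

from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts TIMESTAMP,
  user_id  STRING,
  country  STRING
)
PARTITION BY DATE(event_ts)   -- partition pruning cuts scanned bytes for date filters
CLUSTER BY user_id, country   -- clustering co-locates rows for common predicates
"""

job = client.query(ddl)  # submits the DDL as a query job
job.result()             # waits for the job to finish
print("bytes processed:", job.total_bytes_processed)  # basic job stats review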