Job Title: Data Engineer

Key Responsibilities
- Design, develop, and maintain scalable data pipelines on GCP
- Build and optimize ETL/ELT workflows using tools such as Dataflow, Dataproc, and Cloud Composer
- Develop and manage datasets, tables, and transformations in BigQuery
- Ensure data quality, integrity, and consistency across pipelines
- Optimize query performance and cost efficiency in BigQuery
- Support data ingestion from multiple sources (APIs, databases, flat files, streaming data)
- Collaborate with analytics and business teams to deliver reporting and data solutions
- Monitor and troubleshoot pipeline failures and performance issues
- Implement best practices for data governance, security, and compliance
- Support ongoing data engineering operations and enhancements

Required Skills & Qualifications
- 5+ years of experience in data engineering / data development
- Strong hands-on experience with Google Cloud Platform (GCP)
- Expertise in:
  - BigQuery (must-have)
  - Dataflow / Apache Beam
  - Dataform
  - Cloud Run services
  - Cloud Storage
  - Pub/Sub (for streaming pipelines)
- Advanced proficiency in SQL
- Experience with Python or Java for pipeline development
- Strong understanding of data modeling (star and snowflake schemas)
- Experience in ETL/ELT design and optimization
- Knowledge of workflow orchestration tools (Cloud Composer / Airflow)
- Familiarity with CI/CD practices and version control (Git)
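As a rough illustration of the "ensure data quality, integrity, and consistency" responsibility above, the sketch below shows a batch-validation step of the kind a pipeline might run before loading records into BigQuery. It is a minimal, standalone example: the field names (`order_id`, `amount`, `created_at`), the validation rules, and the clean/reject split are hypothetical, not taken from the posting.

```python
from datetime import datetime

# Hypothetical required fields for an ingested batch of order records;
# names and constraints are illustrative only.
REQUIRED_FIELDS = {"order_id", "customer_id", "amount", "created_at"}


def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
        return errors
    if not isinstance(record["amount"], (int, float)) or record["amount"] < 0:
        errors.append("amount must be a non-negative number")
    try:
        datetime.fromisoformat(record["created_at"])
    except (TypeError, ValueError):
        errors.append("created_at is not an ISO-8601 timestamp")
    return errors


def partition_batch(batch: list[dict]):
    """Split a batch into clean records and (record, errors) rejects,
    so bad rows can be routed to a dead-letter destination instead of
    silently corrupting downstream tables."""
    clean, rejects = [], []
    for rec in batch:
        errs = validate_record(rec)
        if errs:
            rejects.append((rec, errs))
        else:
            clean.append(rec)
    return clean, rejects
```

In a real Dataflow or Composer pipeline the same pattern would typically be expressed as a transform step with rejected rows written to a separate dead-letter table for inspection.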