Posted by: Talent Acquisition Specialist at ALOIS Solutions (ALOIS Australia)

Role: Data Engineer with GCP
Location: Sydney
Role Type: Contract
Experience Level: 7+ years total in data engineering; 4+ years on GCP with production systems.

Top 3 Must-Have Skillsets

1. GCP Big Data Engineering (BigQuery + Dataform + Pub/Sub)
- Expert in designing and optimising BigQuery schemas, partitioning/clustering, cost/performance tuning, query optimisation, and policy tag integration.
- Building streaming and batch pipelines using Apache Beam/Dataflow and Pub/Sub with exactly-once semantics, backpressure handling, and replay strategies.
- Strong experience with Dataform (or similar) for SQL-based transformations, dependency graphs, unit tests, and multi-environment deployments.
- Production-grade Python for ETL/ELT, distributed processing, robust error handling, and testable modular design.
- Designing resilient Airflow DAGs on Cloud Composer: dependency management, retries, SLAs, sensors, service accounts, and secrets.
- Monitoring, alerting, and Cloud Logging/Stackdriver integration for end-to-end pipeline observability.

2. Data Security & Governance on GCP
- Hands-on with Dataplex (asset management, data quality, lineage), BigQuery policy tags, Cloud IAM (least privilege, fine-grained access), KMS (key rotation, envelope encryption), and audit trails via Cloud Logging.
- Practical experience implementing PII controls (data masking, tokenisation, attribute-based access control) and privacy-by-design in pipelines.

3. Cloud Run & APIs
- Building stateless microservices for data access/serving layers, implementing REST/gRPC endpoints, authentication/authorisation, and rate limiting.

Additional Skills
- Data Modelling: Telecom-centric event models (e.g., CDRs, network telemetry, session/flow data), star/snowflake schemas, and lakehouse best practices.
- Performance Engineering: BigQuery slot management, materialised views, BI Engine, partition pruning, cache strategies.
- Secure Source Manager (CI/CD): Pipeline-as-code, automated tests, artifact versioning, environment promotion, canary releases, and GitOps patterns.
- Data Quality & Testing: Great Expectations/Deequ-like checks, schema contracts, anomaly detection, and automated data validations in CI/CD.
- Streaming Patterns: Exactly-once delivery, idempotent sinks, watermarking, late-data handling, windowing strategies.
- Observability & SRE Practices: Metrics, logs, traces, runbooks, SLIs/SLOs for data platforms, and major incident response to support DevOps.
- Cost Governance: BigQuery cost controls, slot commitments/reservations, workload management, storage lifecycle policies.
- Domain Knowledge (Mobile Networks): Familiarity with 3G/4G/5G network data, OSS/BSS integrations, network KPIs, and typical analytics use cases.

Evidence of impact:
- Led end-to-end delivery of large-scale pipelines (batch + streaming) with strict PII governance.
- Owned performance/cost optimisation initiatives in BigQuery/Dataflow at scale.
- Implemented CI/CD for data workflows (Secure Source Manager), including automated tests and environment promotion.
- Drove operational excellence (SLAs, incident management, RTO/RPO awareness, DR patterns).

Soft skills: Technical leadership, code reviews, mentoring, clear documentation, cross-functional collaboration with Network/Analytics teams, and a bias for automation and reliability.

Seniority level: Mid-Senior level
Employment type: Contract
Job function: Consulting
Industries: IT Services and IT Consulting
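The streaming patterns this role calls for (exactly-once delivery via idempotent sinks, replay tolerance) can be illustrated with a minimal, framework-agnostic Python sketch. The `IdempotentSink` class and message IDs below are hypothetical stand-ins, not a real Dataflow or Pub/Sub API:

```python
# Sketch of an idempotent sink: replayed Pub/Sub-style messages carrying
# the same message_id are written at most once, so an at-least-once
# transport still yields exactly-once *effects* at the destination.
# All names here are illustrative, not a GCP client library.

class IdempotentSink:
    def __init__(self):
        self._seen_ids = set()   # in production this would be durable state
        self.rows = []           # stands in for the target table

    def write(self, message_id: str, row: dict) -> bool:
        """Apply the write once; duplicate deliveries are dropped."""
        if message_id in self._seen_ids:
            return False         # duplicate delivery: no side effect
        self._seen_ids.add(message_id)
        self.rows.append(row)
        return True


sink = IdempotentSink()
# Simulate a redelivery: msg-1 arrives twice (e.g. after a replay).
deliveries = [
    ("msg-1", {"imsi": "001010000000001", "bytes": 1200}),
    ("msg-2", {"imsi": "001010000000002", "bytes": 800}),
    ("msg-1", {"imsi": "001010000000001", "bytes": 1200}),  # duplicate
]
applied = [sink.write(mid, row) for mid, row in deliveries]
print(applied)          # [True, True, False]
print(len(sink.rows))   # 2 rows written, despite 3 deliveries
```

In a real Beam/Dataflow pipeline the same effect usually comes from deterministic insert IDs or a MERGE into the destination table; the dedup-by-key principle is the same.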
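Likewise, the data-quality requirement (schema contracts with automated validations in CI/CD) can be sketched in plain Python. The contract format, column names, and `validate` helper below are illustrative assumptions, not the Great Expectations or Deequ API:

```python
# Tiny schema-contract check in the spirit of Great Expectations/Deequ:
# every row must carry the contracted columns with the contracted types.
# Violations are collected rather than raised, so a CI/CD step can fail
# the pipeline with a complete report. Purely illustrative.

CONTRACT = {              # hypothetical contract for a CDR-like event
    "msisdn": str,
    "cell_id": str,
    "duration_s": int,
}

def validate(rows, contract=CONTRACT):
    """Return a list of (row_index, problem) tuples; empty means pass."""
    problems = []
    for i, row in enumerate(rows):
        for col, typ in contract.items():
            if col not in row:
                problems.append((i, f"missing column {col!r}"))
            elif not isinstance(row[col], typ):
                problems.append((i, f"{col!r} is not {typ.__name__}"))
    return problems

rows = [
    {"msisdn": "61400000001", "cell_id": "NSW-0042", "duration_s": 95},
    {"msisdn": "61400000002", "cell_id": "NSW-0042"},                   # missing field
    {"msisdn": "61400000003", "cell_id": "NSW-0099", "duration_s": "95"},  # wrong type
]
report = validate(rows)
print(report)
# [(1, "missing column 'duration_s'"), (2, "'duration_s' is not int")]
```

Wiring a check like this into the deployment pipeline (fail the build on a non-empty report) is what "automated data validations in CI/CD" amounts to in practice.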