GCP Big Data Engineering role at XPT Software — Sydney, New South Wales, Australia

Top 3 Must-Have Skillsets

1. GCP Big Data Engineering (BigQuery + Dataform + Pub/Sub)
- Expert in designing and optimising BigQuery schemas, partitioning/clustering, cost/performance tuning, query optimisation, and policy tag integration.
- Building streaming and batch pipelines using Apache Beam/Dataflow and Pub/Sub with exactly-once semantics, backpressure handling, and replay strategies.
- Strong experience with Dataform (or similar) for SQL-based transformations, dependency graphs, unit tests, and multi-environment deployments.
- Production-grade Python for ETL/ELT, distributed processing, robust error handling, and testable, modular design.
- Designing resilient Airflow DAGs on Cloud Composer: dependency management, retries, SLAs, sensors, service accounts, and secrets.
- Monitoring, alerting, and Cloud Logging/Stackdriver integration for end-to-end pipeline observability.

2. Data Security & Governance on GCP
- Hands-on with Dataplex (asset management, data quality, lineage), BigQuery policy tags, Cloud IAM (least privilege, fine-grained access), KMS (key rotation, envelope encryption), and audit trails via Cloud Logging.
- Practical experience implementing PII controls (data masking, tokenisation, attribute-based access control) and privacy-by-design in pipelines.

3. Additional Expertise
- Cloud Run & APIs: building stateless microservices for data access/serving layers, implementing REST/gRPC endpoints, authentication/authorisation, and rate limiting.
- Data Modelling: telecom-centric event models (e.g., CDRs, network telemetry, session/flow data), star/snowflake schemas, and lakehouse best practices.
- Performance Engineering: BigQuery slot management, materialised views, BI Engine, partition pruning, and cache strategies.
- Secure Source Manager (CI/CD): pipeline-as-code, automated tests, artifact versioning, environment promotion, canary releases, and GitOps patterns.
- Data Quality & Testing: Great Expectations/Deequ-style checks,
  schema contracts, anomaly detection, and automated data validations in CI/CD.
- Streaming Patterns: exactly-once delivery, idempotent sinks, watermarking, late-data handling, and windowing strategies.
- Observability & SRE Practices: metrics, logs, traces, runbooks, SLIs/SLOs for data platforms, and major incident response to support DevOps.
- Cost Governance: BigQuery cost controls, slot commitments/reservations, workload management, and storage lifecycle policies.
- Domain Knowledge (Mobile Networks): familiarity with 3G/4G/5G network data, OSS/BSS integrations, network KPIs, and typical analytics use cases.

Experience Level
- 7+ years total in data engineering; 4+ years on GCP with production systems.
- Evidence of impact:
  - Led end-to-end delivery of large-scale pipelines (batch + streaming) with strict PII governance.
  - Owned performance/cost optimisation initiatives in BigQuery/Dataflow at scale.
  - Implemented CI/CD for data workflows (Secure Source Manager), including automated tests and environment promotion.
  - Drove operational excellence (SLAs, incident management, RTO/RPO awareness, DR patterns).
- Soft skills: technical leadership, code reviews, mentoring, clear documentation, cross-functional collaboration with Network/Analytics teams, and a bias for automation and reliability.