Job Title


Senior Data & ML Engineer


Company: Predelo Pty Ltd

Location: Perth, Western Australia

Created: 2026-03-07

Job Type: Full Time


Job Description

Senior Data & ML Engineer (Hybrid)

About Predelo

Predelo is an AI decision agent for back-office operations, transforming operational data into trusted predictions and automated actions. Built on a state-of-the-art forecasting and optimisation engine developed over five years, we are currently focused on Workforce Management optimisation through a strategic partnership with Deputy, reaching 1.7 million end users. We are trusted by enterprise brands operating across thousands of locations and complex labour environments. Operating at scale: 2.2B+ records in Databricks, ~170,000 shifts generated per week, and 20,000+ employees supported across the US and Australia.

Job Overview

You will own Predelo's data platform component (internally: Cybertron) end-to-end. This is a platform-as-a-product role. You don't just "build pipelines"; you build the core platform and standards that enable faster AI product development across forecasting, optimisation, automation, analytics, and customer onboarding. Your first mandate is to modernise and harden our Databricks lakehouse foundation (governance, orchestration, reliability, and developer experience) so teams can ship AI-powered product capabilities quickly and safely. This is not a ticket-queue role. You'll have architecture authority, hands-on delivery expectations, and operational ownership.

What We're Looking For

- Deep hands-on Databricks experience (Spark, Delta Lake) in production.
- Unity Catalog governance and migration experience.
- Strong SQL and Python/PySpark.
- Strong data modelling skills (medallion architecture, dimensional modelling).
- Experience with dbt and/or DLT, including CI/CD and testing patterns.
- AWS fundamentals (S3, IAM/KMS, eventing/Lambda; Step Functions a bonus).
- Strong engineering hygiene: version control, testing, observability, operational readiness.
- Proven ability to leverage AI tools to improve speed and quality.
Key Responsibilities

- Own the Databricks lakehouse foundation: Delta Lake, Unity Catalog, compute patterns, job and workflow orchestration, and performance tuning.
- Own the transformation and modelling layer: dbt today, and evaluate where managed patterns (e.g., DLT) are a better fit.
- Be the technical owner of CONNECT reliability patterns: idempotency, retries and backoff, replay, and data freshness SLAs from source to gold.
- Treat data as a product: define stable data products, contracts, validation, lineage expectations, documentation, and clear ownership for priority datasets.
- Build DataOps as code: monitoring, alerting, runbooks, and guardrails that prevent repeat incidents.
- Partner with product, ML, and integrations teams to keep platform priorities tied to customer value and AI delivery velocity.
- Operate what you build: incident response, postmortems, systematic fixes, and continuous reliability improvement.
- For the right candidate: help own and improve our MLOps platform (SageMaker and Step Functions), including reliability, repeatability, and faster experimentation-to-production.
- Liaise with and manage our Databricks and AWS account teams, and stay active in the Databricks and data platform community to bring back best practices and new capabilities.

Why Join Predelo?

- Own high-impact AI systems operating at real-world scale.
- Work in a small, high-leverage, AI-native team.
- Ship fast, learn from production, and see direct customer impact.
- Competitive compensation + ESOPs.