Perception Engineer | Localisation & Sensor Fusion - Sydney

Own and evolve Breaker's 3D localisation pipeline, turning 2D detections into accurate real-world coordinates using passive sensors only. This is deep-stack work in camera-based localisation, sensor fusion, and real-time data processing on edge hardware. We're looking for a mid-to-senior engineer who has deployed production localisation systems on moving robotic platforms. You'll build the metrics, tooling, and processes to continuously improve localisation accuracy across diverse hardware configurations. You must have working rights in Australia and be willing to participate in regular field testing. Join an exciting startup backed by globally recognised investors at the bleeding edge of physical AI.

About Breaker

The way humans use robots is broken. Modern problems demand more robots than we have operators. Every drone, ground vehicle, and maritime system requires dedicated training, manual control, and constant oversight. One operator per robot. One pilot per mission.

Breaker's AI agent breaks this constraint. Our technology turns any robot into a truly autonomous, self-organising teammate. Operators command and query teams of robots through natural-language conversations over the push-to-talk radios they already carry: no laptops, no controllers, no additional gear. Instead of manually flying search patterns across three different screens, you say, "survey this area and flag anything unusual." The robot team figures out how to divide the task, coordinate its movements, and report back what matters.

We are fundamentally changing the operator-to-robot ratio. Small teams become force multipliers. Our software deploys directly onboard each robot, enabling real-time, intent-driven control even in contested environments with limited bandwidth. We're solving problems most AI companies never touch: sub-second inference on edge hardware with strict latency, power, and connectivity constraints.
We're backed by some of the best investors globally and are growing our team across Austin, Texas, and Sydney, Australia. We're a small team of experienced engineers moving fast on technology that will define how humans and machines work together for decades to come.

About The Role

One of the most critical outputs of Breaker's product is putting accurate points on a map: taking 2D detections from camera systems and resolving them into precise 3D world coordinates. This is a hard, fascinating problem: we do it without LIDAR or active ranging, using passive sensors on moving platforms in unpredictable environments. You'll own this pipeline end-to-end.

This is a true ownership opportunity. You'll take an existing localisation system and make it yours: building the measurement infrastructure to quantify performance, researching and implementing new techniques, and expanding existing capability into areas like tracking moving objects from moving platforms and generalising across diverse camera hardware. You'll define the contracts between your system and the teams that feed into it (computer vision, hardware, field operations) and build the data workflows that turn real-world testing into continuous improvement.

You'll work at the deepest level of our software stack, where real-time sensor data meets geometric reasoning. If you're the kind of engineer who gets energy from digging into a dataset to find where localisation breaks, running experiments to prove a new approach works, and building the automation to measure it all, this role is for you.
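To give a flavour of the geometry involved, the core step of the problem described above, back-projecting a pixel detection through a known camera pose onto the ground, can be sketched as a toy. This is an illustrative flat-ground, pinhole-camera sketch under simplifying assumptions, not Breaker's actual pipeline; every function name and parameter here is hypothetical, and a production system would also handle terrain relief, lens distortion, pose uncertainty, and a proper geodetic conversion.

```python
import math

M_PER_DEG_LAT = 111_320.0  # rough metres per degree of latitude

def pixel_to_latlon(u, v, fx, fy, cx, cy,
                    cam_east, cam_north, cam_alt,
                    yaw_deg, pitch_deg, lat0, lon0):
    """Back-project a pixel onto a flat ground plane and return lat/lon.

    Hypothetical sketch: pinhole camera, yaw is compass heading, pitch is
    camera tilt below the horizon, cam_alt is metres above (assumed planar)
    ground, (lat0, lon0) is the origin of the local east/north frame.
    """
    # 1. Pixel -> ray in the camera frame (x right, y down, z forward).
    xc = (u - cx) / fx
    yc = (v - cy) / fy
    # 2. Camera frame -> level ENU frame (east, north, up).
    e, n, up = xc, 1.0, -yc
    # 3. Tilt the ray below the horizon by the camera pitch (about east axis).
    th = math.radians(pitch_deg)
    n, up = n * math.cos(th) + up * math.sin(th), -n * math.sin(th) + up * math.cos(th)
    # 4. Rotate by compass yaw (clockwise from north, about the up axis).
    ps = math.radians(yaw_deg)
    e, n = e * math.cos(ps) + n * math.sin(ps), n * math.cos(ps) - e * math.sin(ps)
    # 5. Intersect the ray with the ground plane (up == 0).
    if up >= -1e-9:
        return None  # ray points at or above the horizon: no ground hit
    t = cam_alt / -up
    ground_east = cam_east + t * e
    ground_north = cam_north + t * n
    # 6. Local metres -> geodetic, via a crude equirectangular conversion.
    lat = lat0 + ground_north / M_PER_DEG_LAT
    lon = lon0 + ground_east / (M_PER_DEG_LAT * math.cos(math.radians(lat0)))
    return lat, lon
```

For example, a camera 100 m up, facing north and pitched 45 degrees down, puts the detection under its principal point 100 m north of the platform; the hard parts of the real problem (moving platform, imperfect pose, non-flat world) all live in the error terms this toy ignores.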
Key Responsibilities

- Own the accuracy and performance of Breaker's 3D localisation pipeline: 2D bounding boxes in, lat/long/alt out
- Build metrics, ground-truth collection processes, and automated evaluation tooling to measure and track localisation performance
- Research, prototype, and deploy new localisation and sensor fusion techniques to improve accuracy and robustness
- Extend the system to track moving objects from moving platforms
- Generalise the localisation pipeline to deploy reliably across different camera systems and hardware configurations
- Define interface contracts with adjacent systems, such as what the CV pipeline must deliver and what camera hardware must provide
- Process and analyse field data to identify failure modes, validate improvements, and feed findings back into the development loop
- Design and specify field test scenarios (CONOPS-style) to stress-test specific localisation behaviours
- Participate in regular field testing, including training field teams on data collection requirements
- Conduct software-in-the-loop and hardware-in-the-loop testing and experimentation

Required Skills & Experience

- Demonstrated experience with camera-based localisation on moving robotic platforms: monocular or stereo, not LIDAR-dependent
- Strong sensor fusion fundamentals.
  You understand timing, calibration, and the realities of combining data from multiple real-time sensor streams
- Experience processing real-time data on constrained or edge-deployed hardware
- Proven ability to dig into robotic datasets, identify where systems fail, and systematically troubleshoot localisation problems
- Familiarity with optimisation backends (Ceres, GTSAM, or similar) for geometric inference
- Proficiency in Python and/or C++
- Comfortable working in Linux environments
- Have deployed a production system, not just research prototypes or thesis projects
- Must have working rights in Australia
- Willing to participate in regular field testing

Highly Valued Experience

- GPS-denied navigation or localisation work
- Photogrammetry or large-scale imagery processing pipelines
- Robotic manipulation with visual servoing or camera-based feedback
- Experience with gimballed camera systems and their associated complexities
- Understanding of machine learning strengths and limitations as an input to geometric systems
- Multi-platform data fusion (combining data from multiple robotic platforms)
- ROS/ROS2 development experience
- Experience with drones or other UAV platforms
- Startup or scale-up environment experience
- Australian citizenship (preferred but not required)

Why Join Us?

You'll be an owner, not a renter. We're at the stage where foundational decisions are still being made and entire systems need to be built from scratch. Your work won't be maintaining someone else's legacy; you'll be creating what comes next. The problems you solve and the systems you build will define how Breaker scales.

You'll work with people who've done this before. Our team has shipped production robotics systems, scaled infrastructure, and solved the kind of hard integration problems that only come up when software meets the physical world. You won't be the only person in the room who's debugged a sensor fusion pipeline or optimised inference on a Jetson.

You'll solve problems that don't exist anywhere else.
Most companies are building incremental improvements on established technology. We're defining new categories, which means the work is harder, more ambiguous, and infinitely more interesting.

You'll work hard, together. We're in the office every day, grinding on hard problems alongside great people. We've built a workspace where the best work happens: access to hardware, quick decisions, real collaboration. We're flexible when life requires it, but we're looking for people who want to show up, get stuck in, and build something significant with a team they respect.

We're going global. Backed by globally recognised investors, we're growing teams across Sydney, Australia, and Austin, Texas. If you want exposure to international expansion and the opportunity to help build across regions, that path exists here.

You'll own what you build. Generous equity packages mean that when Breaker wins, you win.

Location: Cicada Innovations, Eveleigh, Sydney, Australia (National Innovation Centre)

If you're excited about the opportunity to work at the bleeding edge of physical AI, we'd love to hear from you.