
Job Title : Senior Data Engineer


Company : Datatonic


Location : Toronto, Ontario


Created : 2026-05-06


Job Type : Full Time


Job Description

Shape the Future of AI & Data with Us

At Datatonic, we are Google Cloud's premier partner in AI, driving transformation for world-class businesses. We push the boundaries of technology with expertise in machine learning, data engineering, and analytics on Google Cloud Platform. By partnering with us, clients future-proof their operations, unlock actionable insights, and stay ahead of the curve in a rapidly evolving world.

Your Mission

Data Engineers at Datatonic work across a wide range of projects, helping clients unlock the full potential of the Modern Data Stack. You'll bring expertise in our technologies of choice - dbt, Looker, Snowflake, BigQuery, Google Cloud, Sigma, Fivetran, Python, Spark, Pub/Sub - and apply them to solve real client challenges. In this role, you'll collaborate closely with a Delivery Manager, support project teams, and make hands-on contributions to the codebase where needed. Our Data Engineers combine strong technical skills with a client-focused mindset, ensuring that data solutions are not only well-engineered but also deliver measurable impact.

What You'll Do

- Foundational Support for Analytics and Data Science Teams: Build the infrastructure that enables analytics and data science teams to deliver innovative, impactful solutions for clients.
- Google Cloud Migration and Data Warehouse Solutions: Assist clients in migrating their existing business intelligence and data warehouse solutions to Google Cloud.
- Build Scalable Data Pipelines: Design, develop, and optimise robust data pipelines, making data easily accessible for visualization and machine learning applications.
- Design and Build Data Warehouses and Data Marts: Design and implement new data warehouse and data mart solutions, including transforming, testing, deploying, and documenting data; understanding data modelling techniques; and optimising and storing data for warehouse technologies.
- Manage Cloud Infrastructure: Architect, maintain, and troubleshoot cloud-based infrastructure to ensure high availability and performance.
- Collaboration with Technology Partners: Work closely with technology partners such as Google Cloud, Snowflake, dbt, and Looker, mastering their technologies and building a network with their engineers.
- Agile and Dynamic Team Collaboration: Collaborate in an agile and dynamic environment with a team of data engineers, BI analysts, data scientists, and machine learning experts.
- Applying Software Engineering Best Practices: Implement software engineering best practices in analytics processes, such as version control, testing, and continuous integration.

What You'll Bring

- Experience: 4+ years in a data-related role (e.g., Data Engineer, Data Analyst, Analytics Engineer).
- Technical Expertise: Hands-on experience with Looker, dbt, modern data warehouses like Snowflake or BigQuery, and Kimball data modelling.
- Strong Programming Skills: Expertise in Python and/or Java, with proficiency in SQL.
- Experience in Data Engineering: 5+ years of experience in designing and building scalable data solutions.
- High-Quality Code Standards: Ability to write tested, resilient, and well-documented code.
- Cloud Computing Experience: Experience in building and maintaining cloud infrastructure (GCP or AWS is a plus).
- Problem-Solving Mindset: Ability to take ownership and drive projects from concept to completion.
- Project Management: Natural ability to manage multiple initiatives and clients simultaneously.
- SQL Proficiency: Skilled in writing analytical SQL, with an understanding of the difference between SQL that merely works and SQL that performs well.
- Business Translation: Experience in translating business requirements into technical solutions.
- Communication Skills: Ability to communicate complex ideas simply to a wide range of audiences.
- Leadership: Experience in providing technical guidance and direction on projects.
- Cultural Alignment: Complete alignment with our culture of transparency, empathy, accountability, and performance.

Bonus points if you have:

- dbt Developer certification
- Google Cloud Professional Data Engineer certification
- Snowflake SnowPro certification
- Experience with Scrum methodology
- Client-Facing Role: Prior experience in a client-facing position
- API Development Experience: Experience building scalable REST APIs using Python or similar technologies.

What's in It for You?

We believe in empowering our team to thrive, with benefits including:

- 20 days of paid vacation per calendar year
- Public holidays for your province of residence
- 5 Wellness days (sickness, personal time, mental health)
- 5 Lifestyle days (religious events, volunteer day, sick day)
- Matching Group Retirement Savings Plan after 3 months
- Competitive Group Insurance plan on Day 1 - individual premium paid 100%!
- Virtual Medicine and Family Assistance Program - 100% employer-paid!
- Home office budget - We are 100% remote!
- CAD $70/month for internet/phone expenses
- CAD $1,500 every 3 years for tech accessories and office equipment (monitor, keyboard, mouse, desk, etc.), starting on Day 1
- Company-supplied MacBook Pro or Air
- CAD $400/year for books, relevant app subscriptions, or an e-reader
- Opportunities for paid certifications
- Opportunities for professional and personal learning through Google and other training programmes
- Regular company offsites and meetups

Why Datatonic?

Datatonic is a UK-based company with an Americas division located in Canada. The Canadian team operates remotely, with members distributed across North and South America. This role is open to candidates located anywhere in Canada. Join us to work alongside AI enthusiasts and data experts who are shaping tomorrow. At Datatonic, innovation isn't just encouraged - it's embedded in everything we do. If you're ready to inspire change and deliver value at the forefront of data and AI, we'd love to hear from you!
Salary: CAD 110,000 - CAD 140,000

Are you ready to make an impact? Apply now and take your career to the next level.