Job Title


Senior FinOps Data Engineer


Company : Mastech Digital


Location : Toronto, Ontario


Created : 2026-03-25


Job Type : Full Time


Job Description

Title: Senior Data Engineer, Financial DataOps

Duration: Long term

Location: Remote/Canada

About the Team

The Finance Data team owns the data infrastructure, modeling, and analytics pipelines that power financial reporting, planning, and analysis across Life360. Our stack is built on Databricks, dbt, and Airflow, with a suite of downstream consumers including Tableau, Causal, Adaptive, and reverse ETL tooling. We value engineering rigor, clear documentation, and proactive communication. We want someone who can partner with us directly, raise the bar on developer experience and operational maturity, and deliver working systems, not just recommendations.

About the Job

We are looking for a Senior Data Engineer to take on ingestion, pipeline, and infrastructure work, while also improving the systems around CI/CD, observability, and developer tooling.

What You'll Do

Core Data Engineering (Primary Focus)

- Build and maintain data ingestion pipelines across sources (web scraping, RPA, APIs, Fivetran, databases) into Databricks
- Develop and extend dbt models supporting financial reporting use cases (billing, revenue recognition, commissions, ads revenue)
- Contribute to monitoring, data quality, and alerting capabilities across pipelines
- Support fine-grained access control and role management in Databricks
- Collaborate with analysts and finance stakeholders to scope and deliver new data sources and reporting requirements

Developer Experience & Platform (Secondary Focus)

- Own and improve CI/CD infrastructure around the Finance dbt repo
- Implement AI-assisted code review workflows (using the Claude API)
- Build automated testing frameworks for data pipelines and containerized applications
- Automate documentation publishing so dbt model metadata stays current in Confluence
- Establish and reinforce engineering standards (testing patterns, environment management, observability, Git workflow discipline)

What We're Looking For

- 5+ years of experience in Data Engineering roles in production environments
- Expert-level dbt proficiency
- Strong Python skills (testable, production-ready code)
- Experience with AWS (ECS, S3, MWAA)
- Experience with modern cloud data warehouses (Databricks preferred)
- Proven ability to work with LLM APIs (Claude/Anthropic preferred)
- Hands-on experience with GitHub Actions workflows for CI/CD
- Familiarity with orchestration frameworks (Airflow)
- Experience implementing data diffing, quality checks, and anomaly detection at scale
- Terraform proficiency
- Strong Git discipline and PR/code review experience
- Excellent written communication skills
- BS in Computer Science, Engineering, Mathematics, or equivalent experience