Job Title


Data Engineer


Company: Mastech Digital


Location: Mississauga, Ontario


Created: 2026-03-20


Job Type: Full Time


Job Description

Title: Data Engineer
Duration: Long term
Location: Remote (Canada)

Role Overview

We are looking for Senior Data Engineers to join our team. You will work alongside our existing Data Engineering team to migrate our Databricks workspaces from a legacy account to new, purpose-built AWS infrastructure. This is a hands-on role in which you will build infrastructure as code, create data pipelines as Databricks Asset Bundles (DAB), and ensure a smooth transition with zero data loss.

Key Responsibilities

- Migrate existing Databricks workspaces, jobs, notebooks, and configurations to new AWS accounts
- Build and maintain infrastructure using Terraform with the Databricks provider
- Define and deploy Databricks resources using Databricks Asset Bundles (DAB)
- Create CI/CD pipelines using GitHub Actions for automated deployments
- Set up and configure Unity Catalog objects, including catalogs, schemas, and access grants
- Migrate data between S3 buckets across AWS accounts
- Write and optimize Spark jobs in Python or Scala
- Configure orchestration workflows using Apache Airflow (MWAA)
- Collaborate with the platform and cloud infrastructure teams on AWS networking, IAM, and security configurations
- Participate in code reviews and follow GitOps best practices

Required Qualifications

- 5+ years of experience in data engineering or related roles
- Strong hands-on experience with Databricks on AWS
- Proficiency in Terraform for infrastructure provisioning and management
- Experience with Spark (PySpark or Scala) for data processing
- Familiarity with AWS services: S3, IAM, KMS, Secrets Manager, VPC
- Experience with CI/CD pipelines, preferably GitHub Actions
- Strong understanding of data lake architectures and Delta Lake
- Comfortable working with Git and following GitOps workflows

Preferred Qualifications

- Experience with Databricks Asset Bundles (DAB)
- Experience with Databricks Unity Catalog
- Hands-on experience with Atmos or similar Terraform orchestration frameworks
- Experience with Apache Airflow or Amazon MWAA
- Experience with data migration projects, especially Databricks workspace migrations
- Familiarity with the Databricks Terraform Exporter or similar reverse-engineering tools
- Experience working in multi-account AWS environments

Technology Stack

- Cloud: AWS (multi-account)
- Data Platform: Databricks (Unity Catalog)
- IaC: Terraform, Atmos
- Resource Bundles: Databricks Asset Bundles (DAB)
- Orchestration: Apache Airflow (MWAA)
- CI/CD: GitHub Actions
- Languages: Python, Scala, SQL, HCL
- Storage: S3, Delta Lake
- Version Control: GitHub
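For context on the Terraform and Unity Catalog work described above, here is a minimal sketch of provisioning catalog objects with the Databricks Terraform provider. All names (the auth profile, catalog, schema, and group) are illustrative assumptions, not details from this posting:

```hcl
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

# Workspace-level provider; the CLI auth profile name is hypothetical.
provider "databricks" {
  profile = "new-workspace"
}

# Unity Catalog hierarchy: catalog -> schema -> access grant.
resource "databricks_catalog" "analytics" {
  name    = "analytics" # hypothetical catalog name
  comment = "Migrated from legacy account"
}

resource "databricks_schema" "raw" {
  catalog_name = databricks_catalog.analytics.name
  name         = "raw"
}

# Grant read access on the schema to a (hypothetical) workspace group.
resource "databricks_grants" "raw_read" {
  schema = "${databricks_catalog.analytics.name}.${databricks_schema.raw.name}"
  grant {
    principal  = "data-engineers"
    privileges = ["USE_SCHEMA", "SELECT"]
  }
}
```

In a migration like the one described, configurations of this shape would typically be generated or cross-checked against the legacy workspace (for example, with the Databricks Terraform Exporter mentioned in the preferred qualifications) rather than written from scratch.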