Job Title


Data Engineer


Company : Mastech Digital


Location : Mississauga, Ontario


Created : 2026-03-20


Job Type : Full Time


Job Description

Title: Data Engineer
Duration: Long term
Location: Remote (Canada)

Role Overview

We are looking for Senior Data Engineers to join our team. You will work alongside our existing Data Engineering team to migrate our Databricks workspaces from a legacy account to new, purpose-built AWS infrastructure. This is a hands-on role where you will build infrastructure as code, create data pipelines as Databricks Asset Bundles (DAB), and ensure a smooth transition with zero data loss.

Key Responsibilities

- Migrate existing Databricks workspaces, jobs, notebooks, and configurations to new AWS accounts
- Build and maintain infrastructure using Terraform with the Databricks provider
- Define and deploy Databricks resources using Databricks Asset Bundles (DAB)
- Create CI/CD pipelines using GitHub Actions for automated deployments
- Set up and configure Unity Catalog objects, including catalogs, schemas, and access grants
- Migrate data between S3 buckets across AWS accounts
- Write and optimize Spark jobs in Python or Scala
- Configure orchestration workflows using Apache Airflow (MWAA)
- Collaborate with the platform and cloud infrastructure teams on AWS networking, IAM, and security configurations
- Participate in code reviews and follow GitOps best practices

Required Qualifications

- 5+ years of experience in data engineering or related roles
- Strong hands-on experience with Databricks on AWS
- Proficiency in Terraform for infrastructure provisioning and management
- Experience with Spark (PySpark or Scala) for data processing
- Familiarity with AWS services: S3, IAM, KMS, Secrets Manager, VPC
- Experience with CI/CD pipelines, preferably GitHub Actions
- Strong understanding of data lake architectures and Delta Lake
- Comfortable working with Git and following GitOps workflows

Preferred Qualifications

- Experience with Databricks Asset Bundles (DAB)
- Experience with Databricks Unity Catalog
- Hands-on experience with Atmos or similar Terraform orchestration frameworks
- Experience with Apache Airflow or Amazon MWAA
- Experience with data migration projects, especially Databricks workspace migrations
- Familiarity with the Databricks Terraform Exporter or similar reverse-engineering tools
- Experience working in multi-account AWS environments

Technology Stack

- Cloud: AWS (multi-account)
- Data Platform: Databricks (Unity Catalog)
- IaC: Terraform, Atmos
- Resource Bundles: Databricks Asset Bundles (DAB)
- Orchestration: Apache Airflow (MWAA)
- CI/CD: GitHub Actions
- Languages: Python, Scala, SQL, HCL
- Storage: S3, Delta Lake
- Version Control: GitHub