Job Title


AWS Databricks Engineer


Company : Mastech Digital


Location : Toronto, Ontario


Created : 2026-05-02


Job Type : Full Time


Job Description

Remote - Canada

We're looking for a hands-on Databricks Engineer to help our client migrate and modernize 100+ data pipelines (batch and streaming) from an acquired company's environment. This is a fast-paced, execution-heavy role where you'll:

- Rebuild jobs and workflows in Databricks (AWS East)
- Modernize CI/CD with GitHub Actions plus Terraform/DAB
- Work deeply with Spark, Delta Lake, and large-scale ETL
- Collaborate with a lean, high-impact team

It's a chance to own delivery on a large-scale migration project with visibility and immediate impact. If you enjoy building, debugging, and optimizing at scale, this role offers both challenge and recognition.

Title: AWS Databricks Engineer
Duration: Long term
Location: Remote (Canada)

Role Overview

We are looking for Senior Data Engineers to join our team. You will work alongside our existing Data Engineering team to migrate our Databricks workspaces from a legacy account to new, purpose-built AWS infrastructure. This is a hands-on role where you will build infrastructure-as-code, create data pipelines as Databricks Asset Bundles (DAB), and ensure a smooth transition with zero data loss.

Key Responsibilities

- Migrate existing Databricks workspaces, jobs, notebooks, and configurations to new AWS accounts
- Build and maintain infrastructure using Terraform with the Databricks provider
- Define and deploy Databricks resources using Databricks Asset Bundles (DAB)
- Create CI/CD pipelines using GitHub Actions for automated deployments
- Set up and configure Unity Catalog objects, including catalogs, schemas, and access grants
- Migrate data between S3 buckets across AWS accounts
- Write and optimize Spark jobs in Python or Scala
- Configure orchestration workflows using Apache Airflow (MWAA)
- Collaborate with the platform and cloud infrastructure teams on AWS networking, IAM, and security configurations
- Participate in code reviews and follow GitOps best practices

Required Qualifications

- 5+ years of experience in data engineering or related roles
- Strong hands-on experience with Databricks on AWS
- Proficiency in Terraform for infrastructure provisioning and management
- Experience with Spark (PySpark or Scala) for data processing
- Familiarity with AWS services: S3, IAM, KMS, Secrets Manager, VPC
- Experience with CI/CD pipelines, preferably GitHub Actions
- Strong understanding of data lake architectures and Delta Lake
- Comfortable working with Git and following GitOps workflows

Preferred Qualifications

- Experience with Databricks Asset Bundles (DAB)
- Experience with Databricks Unity Catalog
- Hands-on experience with Atmos or similar Terraform orchestration frameworks
- Experience with Apache Airflow or Amazon MWAA
- Experience with data migration projects, especially Databricks workspace migrations
- Familiarity with the Databricks Terraform Exporter or similar reverse-engineering tools
- Experience working in multi-account AWS environments

Technology Stack

- Cloud: AWS (multi-account)
- Data Platform: Databricks (Unity Catalog)
- IaC: Terraform, Atmos
- Resource Bundles: Databricks Asset Bundles (DAB)
- Orchestration: Apache Airflow (MWAA)
- CI/CD: GitHub Actions
- Languages: Python, Scala, SQL, HCL
- Storage: S3, Delta Lake
- Version Control: GitHub

Best regards,
Vivek Shrivastava
Lead Recruiter, Information Technology
Tel: 412-286-1530
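For candidates less familiar with Databricks Asset Bundles: the responsibility "Define and deploy Databricks resources using Databricks Asset Bundles (DAB)" generally means maintaining a `databricks.yml` bundle definition along the lines of the sketch below. This is a minimal illustrative example only; the bundle name, workspace host, job, notebook path, and cluster settings are placeholders, not the client's actual configuration.

```yaml
# databricks.yml -- minimal Databricks Asset Bundle sketch (all names are placeholders)
bundle:
  name: pipeline_migration

targets:
  dev:
    mode: development
    workspace:
      host: https://example-workspace.cloud.databricks.com  # placeholder host

resources:
  jobs:
    nightly_etl:
      name: nightly_etl
      tasks:
        - task_key: transform
          notebook_task:
            notebook_path: ./notebooks/transform.py
          new_cluster:
            spark_version: 15.4.x-scala2.12  # example runtime
            node_type_id: m5.xlarge          # example AWS node type
            num_workers: 2
```

A bundle like this is validated and deployed with the Databricks CLI (`databricks bundle validate`, then `databricks bundle deploy -t dev`), which is what makes jobs and notebooks versionable and reviewable under the GitOps workflow the posting describes.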
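Likewise, "Create CI/CD pipelines using GitHub Actions for automated deployments" typically means a workflow that deploys the bundle on merge to the main branch. The sketch below is a hedged example of that pattern; the workflow name, target name, and secret names are assumptions for illustration, not the client's actual pipeline.

```yaml
# .github/workflows/deploy.yml -- illustrative CI/CD sketch (secret names are assumptions)
name: deploy-bundle
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: databricks/setup-cli@main   # installs the Databricks CLI
      - name: Deploy Databricks Asset Bundle
        run: databricks bundle deploy -t dev
        env:
          DATABRICKS_HOST: ${{ secrets.DATABRICKS_HOST }}
          DATABRICKS_TOKEN: ${{ secrets.DATABRICKS_TOKEN }}
```

In practice such a workflow usually gains a validation step on pull requests and separate targets (for example dev and prod) gated by branch or environment, which is the GitOps flow the responsibilities list implies.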