Airswift is seeking a Data Engineer to work a 12-month contract with one of our major clients in Calgary, AB.

Key Responsibilities
- Design, build, and optimize ETL/ELT workflows using KNIME Analytics Platform
- Integrate data from relational databases, APIs, and cloud storage
- Automate and productionize workflows using KNIME Business Hub
- Develop and scale data pipelines using Databricks (Apache Spark)
- Implement data quality checks, anonymization, and governance controls
- Document workflows, data pipelines, and operational processes
- Lead the design and delivery of scalable Databricks data pipelines using Spark, Delta Lake, and PySpark
- Drive Lakehouse architecture across ingestion, transformation, and curated data layers
- Define technical standards and best practices for batch and streaming data engineering
- Optimize performance and cost efficiency of Spark workloads
- Implement enterprise-grade data governance, security, and monitoring (e.g., access controls, catalogs)
- Act as a senior technical leader and mentor within the data engineering team

What You Bring
- Post-secondary degree in Computer Science, Software Engineering, or equivalent experience
- 7+ years of hands-on experience in data engineering
- Strong experience with Databricks, Spark, and Lakehouse architectures
- Advanced proficiency in PySpark, Python, and SQL
- Advanced experience with KNIME Analytics Platform (nodes, components, workflow control)
- Solid understanding of data modeling and ETL/ELT best practices
- Experience building and supporting production-grade data pipelines at scale
- Exposure to cloud platforms (Azure preferred, AWS acceptable)
- Familiarity with DevOps and CI/CD practices for data workloads

Nice to Have
- Experience with MLOps and deploying machine learning workloads in Databricks
- Familiarity with data integration tools (e.g., Azure Data Factory, HVR, or similar)
- Experience working in regulated or large enterprise environments
- Exposure to streaming data architectures (Kafka, Structured Streaming)