Job Title : GCP Data Architect


Company : Software International


Location : Toronto, Ontario


Created : 2026-01-22


Job Type : Full Time


Job Description

Software International (SI) supplies technical talent to a variety of clients ranging from Fortune 100/500/1000 companies to small and mid-sized organizations in Canada, the US, and Europe. We currently have a contract role as a GCP Data Architect with our global consulting client, working remotely. The contract is 6 months initially and could be extended.

Type : Contract
Duration : 6 months to start + potential extension
Location : Toronto, ON - remote with occasional office visits
Rate : $100 - $120 CDN/hr C2C, depending on overall experience

GCP Data Architect - Role Overview

We are seeking a highly skilled Google Cloud Platform (GCP) Data Architect with strong SAP data integration expertise to design, implement, and oversee enterprise-grade data solutions. The ideal candidate will combine deep expertise in cloud data platforms, data governance, security, and data modeling with hands-on experience in ETL/ELT pipelines, SAP data extraction, system migrations, and analytics. This role will collaborate with business stakeholders and engineering teams to create a robust, scalable, and cost-effective data ecosystem that bridges SAP and GCP environments.

Key Responsibilities

Data Strategy, Security & Governance
- Define and implement an enterprise-wide data strategy aligned with business goals.
- Establish data governance frameworks, data classification, retention, and privacy policies.
- Ensure compliance with industry standards and regulations (e.g., GDPR, HIPAA, PIPEDA).
- Design conceptual, logical, and physical data models to support analytics and operational workloads.
- Implement star, snowflake, and data vault models for analytical systems.
- Implement SAP S/4 CDS views in Google BigQuery.
- Architect data solutions on GCP using BigQuery, Cloud Storage, Dataflow, and Dataproc.
- Design and orchestrate ETL/ELT pipelines using Apache Airflow (Cloud Composer) and Dataflow (minimal Airflow and Dataflow sketches follow the Preferred Skills list).
- Integrate data from multiple systems, including SAP BW, SAP HANA, and Business Objects, using tools like SAP SLT or the Google Cortex Framework.
- Leverage integration tools such as Boomi for system interoperability.

Programming & Analytics
- Develop complex SQL queries for analytics, transformations, and performance tuning (see the BigQuery query sketch after the Preferred Skills list).
- Build automation scripts and utilities in Python.
- Good understanding of CDS views and the ABAP language.

System Migration
- Lead on-premises-to-cloud migrations for enterprise data platforms (SAP BW/BObj).
- Manage migration of SAP datasets to GCP, ensuring data integrity and minimal downtime.

DevOps for Data
- Implement CI/CD pipelines for data workflows using GitHub Actions, Cloud Build, and Terraform.
- Apply infrastructure-as-code principles for reproducible and scalable deployments.

Preferred Skills
- Proven experience with GCP BigQuery, Cloud Storage, Pub/Sub, and Dataflow.
- Strong SQL and Python programming skills.
- Hands-on experience with SAP data extraction, modeling, and integration from ERP, BW, and/or HANA systems.
- Knowledge of data governance frameworks and data security best practices.
- Experience with Boomi, Informatica, or MuleSoft for SAP and non-SAP integrations.
- Experience with the Google Cortex Framework for SAP-GCP integrations.
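For context on the orchestration work described above, here is a minimal Cloud Composer (Apache Airflow) DAG sketch that loads a daily file drop from Cloud Storage into BigQuery. The DAG id, bucket, dataset, and table names are hypothetical placeholders, not details from this posting; it assumes Airflow 2.4+ with the Google provider installed.

```python
# Minimal Airflow DAG sketch: load a daily CSV drop from GCS into BigQuery.
# All bucket/dataset/table names below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="sap_extract_to_bq",      # hypothetical DAG name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",               # requires Airflow 2.4+
    catchup=False,
) as dag:
    load_sales = GCSToBigQueryOperator(
        task_id="load_sales_extract",
        bucket="example-sap-extracts",                  # placeholder bucket
        source_objects=["sales/{{ ds }}/extract.csv"],  # daily partition path
        destination_project_dataset_table="analytics.sales_raw",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",  # full refresh of the day's slice
        autodetect=True,                     # infer schema for the sketch
    )
```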
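Similarly, the Dataflow side of the pipeline work could look like the following Apache Beam (Python SDK) sketch, which parses a CSV extract and appends it to a BigQuery table. The file path, table name, and two-column schema are assumptions for illustration only.

```python
# Sketch: Apache Beam pipeline (runnable on Dataflow) that parses CSV rows
# and writes them to BigQuery. Paths and schema are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Split a CSV line into the hypothetical target schema."""
    order_id, amount = line.split(",")
    return {"order_id": order_id, "amount": float(amount)}


if __name__ == "__main__":
    # Pass --runner=DataflowRunner plus project/region options to run on Dataflow.
    options = PipelineOptions()
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(
                "gs://example-bucket/orders.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_row)
            | "Write" >> beam.io.WriteToBigQuery(
                "analytics.orders_clean",  # placeholder table
                schema="order_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )
```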
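Finally, as a sketch of the SQL analytics side of the role, the snippet below runs a star-schema style aggregation through the google-cloud-bigquery Python client. The fact and dimension tables (fact_sales, dim_customer) and their columns are invented for the example.

```python
# Sketch: star-schema aggregation against BigQuery from Python.
# Table and column names are illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT
      c.region,
      SUM(f.net_amount) AS total_sales
    FROM `analytics.fact_sales` AS f      -- fact table (placeholder name)
    JOIN `analytics.dim_customer` AS c    -- dimension table (placeholder name)
      ON f.customer_key = c.customer_key
    WHERE f.order_date >= '2026-01-01'
    GROUP BY c.region
    ORDER BY total_sales DESC
"""

# Run the query and print one line per region.
for row in client.query(query).result():
    print(f"{row.region}: {row.total_sales}")
```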