Job Title


Senior AI/ML Solutions Architect – MySQL


Company : Yochana


Location : Moncton, New Brunswick


Created : 2026-05-08


Job Type : Full Time


Job Description

Role: Senior AI/ML Solutions Architect – MySQL
Location: Ontario, Canada

Job Summary

Experience: 12–14 years of relevant experience in Big Data, Databricks engineering, and senior data leadership. As a Data Architect (Custody Domain), you will design and lead the implementation of a high-performance, event-driven data ecosystem. You will serve as the technical authority on the Cloudera Data Platform (CDP), with a heavy focus on Kafka-based streaming and cloud-native architectures. Your role is to bridge real-time data flows from custody operations, such as trade settlements and cash movements, into resilient microservices, data pipelines, and data marts across hybrid and multi-cloud environments.

Key Responsibilities

- Event-Driven Architecture: Architect enterprise-grade streaming solutions using Apache Kafka as the central event bus to decouple producers and consumers across the custody lifecycle.
- Cloud Strategy & Migration: Design and oversee the deployment of data workloads across public, private, and hybrid cloud environments, ensuring high availability, disaster recovery, and cost optimization.
- Real-Time Processing: Build and tune Apache Flink and Spark Streaming jobs to process Kafka streams for real-time fraud detection, automated regulatory reporting, and continuous transaction monitoring.
- Data Ingestion & Orchestration: Design scalable, automated ingestion frameworks to move data from legacy custody systems into the CDP ecosystem, ensuring data integrity and low-latency delivery.
- Microservices Strategy: Lead the design of data-centric microservices that interact with Kafka for event sourcing and asynchronous communication in a containerized cloud environment.
- Data Mart Design: Develop performant data marts and reporting layers that provide actionable insights to business stakeholders and regulatory bodies using CDP's modern warehouse engines.
- Security & Governance: Implement centralized security and governance through Cloudera SDX, ensuring strict compliance with financial regulations across all cloud storage and compute layers.

Technical Qualifications

- Cloudera Mastery: Expert-level knowledge of the Cloudera Data Platform (CDP) stack and its integration within cloud-native infrastructures.
- Kafka Expertise: Advanced skills in Kafka cluster planning, topic management, partitioning strategies, and performance tuning (e.g., exactly-once delivery, back-pressure handling).
- Cloud Proficiency: Deep experience in architecting data solutions on major cloud service providers, focusing on managed compute, object storage, and networking security.
- Stream Processing Engines: Strong proficiency in Apache Spark (Streaming/Batch) and working knowledge of Apache Flink.
- Infrastructure as Code: Familiarity with containerization (Docker/Kubernetes) and automated deployment tools to manage data services at scale.