Job Title
Data Engineer Architect - Databricks

Level of experience required: 12+ years

The Senior Data Architect is responsible for training new hires and for designing and developing new enterprise data architectures within a modern cloud environment. This role requires deep expertise in Big Data solutions (Delta Lake architecture), modern data warehouse practices and operations, semantic layering, dimensional modeling (star schemas), transactional OLTP databases (3NF modeling), and advanced data modeling techniques. The ideal candidate will have at least 12 years of data modeling experience specifically within Business Intelligence and Analytics contexts, extensive hands-on experience with batch and streaming data processing, and strong expertise with Apache Spark, Databricks, and Spark Structured Streaming. Required skills include proficiency in Python programming, Azure cloud technologies, semantic modeling, and modern CI/CD deployment practices. Experience in ML engineering is highly desirable. The candidate must be able to collaborate effectively with data and engineering teams, clearly document source-to-target mappings, and reverse engineer existing database objects such as stored procedures, views, and complex SQL queries.

Responsibilities
- Train engineering teams to rapidly deliver data architecture solutions and support agile development practices.
- Lead architecture design and implementation of enterprise-scale data platforms leveraging Databricks, Delta Lake, Azure cloud, Snowflake, and modern Big Data technologies.
- Design, build, and maintain modern data warehouse solutions using dimensional modeling (star schema) and semantic layering to optimize analytics and reporting capabilities.
- Define and enforce data modeling standards, guidelines, and best practices within analytics and BI contexts.
- Architect robust batch processing and real-time streaming solutions using Apache Spark, Databricks, Kafka, Kinesis, and Spark Structured Streaming.
- Provide clear, comprehensive source-to-target documentation, data lineage mappings, and semantic layer definitions.
- Reverse engineer existing database structures, including stored procedures, views, and complex SQL logic, to document existing data processes and support modernization initiatives.
- Provide technical leadership, mentoring, and guidance to data engineering teams, ensuring alignment with architectural standards and best practices.
- Evaluate and continuously improve existing data architectures, optimize performance, and recommend enhancements for efficiency and scalability.
- Collaborate closely with stakeholders to define long-term data strategies and clearly communicate architectural decisions.
- Ensure compliance with industry standards, data governance practices, regulatory requirements, and security guidelines.

Required Qualifications
- Proficiency in Python programming.
- Strong experience with Azure cloud technologies, including Azure Data Factory, Azure Storage, Azure Databricks, and related data services.
- Solid experience designing streaming data solutions using Kafka, Kinesis, or similar streaming technologies.
- Knowledge and hands-on experience implementing modern CI/CD practices for data engineering and analytics solutions.
- Strong analytical, organizational, and communication skills, with the ability to clearly articulate complex technical concepts to diverse stakeholders.
- Proven ability to collaborate effectively and efficiently with engineering teams and business stakeholders.

Preferred Qualifications
- ML engineering experience or exposure to ML pipelines and model deployment processes is highly desirable.
- Expertise in developing analytics based on Dynamics CRM CE module data.
- Completion of a data engineer professional certification.