Role - Data Architect/Lead (Senior Role)
Skills - Azure/AWS, Databricks, PySpark, BI tools and dashboards
Location - Newcastle, NSW
Duration - Permanent

As a Data Architect, you will be responsible for designing and implementing enterprise-level data architecture that supports advanced analytics, reporting, and operational needs. You will define standards for data modelling, storage, and integration across multiple insurance-related applications on Azure/AWS platforms, ensuring scalability, security, and performance, and implement Business Intelligence solutions that enable data-driven decision-making across the organization.

Key Responsibilities:

Cloud Strategy & Integration:
- Must have cross-platform data migration experience.
- Develop new, and understand existing, end-to-end data architecture for data lakes, warehouses, and real-time streaming systems.
- Create conceptual, logical, and physical data models for structured and unstructured data.
- Create source-to-target mappings for ELT/ETL solutions.
- Design hybrid and multi-cloud solutions leveraging Azure, Databricks, PySpark, etc.
- Lead cloud migration projects and optimize cost and performance.
- Work closely with business stakeholders, data engineers, and BI teams to align architecture with business goals.
- Evaluate emerging technologies (AWS, Snowflake, Delta Lake) and recommend adoption strategies.

Data Quality and Data Governance:
- Perform data profiling.
- Implement data quality practices and data governance rules.
- Experience in metadata management and data lineage tracking.

Business Intelligence & Reporting:
- Design and implement BI architecture, including data models and reporting frameworks.
- Collaborate with business teams to gather requirements and translate them into technical solutions.
- Tune BI tools and dashboards for performance and usability.
- Mentor junior engineers and enforce best practices in data engineering and reporting.

Technical Skills:
- Data Modelling: Advanced knowledge of normalization, dimensional modelling, and NoSQL design.
- Cloud Expertise: Deep experience in Azure (ADF, Databricks, PySpark, Python) or AWS (Glue, Kafka, etc.).
- Programming: Proficiency in SQL, Python, PySpark, and big data processing.
- Architecture Tools: ERwin, PowerDesigner, or similar data modelling tools.
- Security & Governance: Familiarity with Identity and Access Management (IAM), encryption, and compliance frameworks.