Experience Level: 9-10 years

Key Responsibilities
- Serve as Senior Data and AI Engineering lead for a large and complex data ecosystem leveraging data domains, data products, cloud, and a modern technology stack.
- Real-Time Data Streaming: Design, build, and maintain scalable and robust real-time data streaming pipelines using technologies such as Apache Kafka, AWS Kinesis, Spark Streaming, or similar.
- Lead the implementation of Data and AI pipelines that bring together structured, semi-structured, and unstructured data to support AI and agentic solutions. This includes pre-processing with extraction, chunking, embedding, and grounding strategies to get the data ready (a minimal illustrative sketch follows this list).
- Design and develop Data and AI-driven systems to improve data capabilities, ensuring compliance with industry best practices.
- Design and develop data domains and data products for various consumption archetypes, including Reporting, Data Science, AI/ML, and Analytics.
- Design and implement efficient Retrieval-Augmented Generation (RAG) architectures and integrate them with enterprise data infrastructure.
- Collaborate with cross-functional teams to integrate solutions into operational processes and systems supporting various functions.
- Stay up to date with industry advancements in GenAI and apply modern technologies and methodologies to our systems. This includes leading prototypes (POCs), conducting experiments, and recommending innovative tools and technologies to enhance data capabilities that enable business strategy.
- Model domain entities, relationships, and business logic in knowledge graphs (e.g., Neo4j, Amazon Neptune, RDF).
- Integrate data from multiple sources, ensuring canonical representation and semantic consistency.
- Synthetic data generation: Develop and validate synthetic data to simulate rare events and edge cases, supporting robust agent evaluation. Integrate synthetic data workflows with automated testing frameworks to ensure consistent, scalable agent performance assessment.
- Identify and champion AI-driven Data Engineering productivity capabilities that accelerate the end-to-end data delivery lifecycle. This includes researching and implementing innovative solutions such as AI-driven auto-generation of data pipelines, advanced DevOps practices for data (AI-augmented self-healing data pipelines), and automated data quality frameworks.
- Semantic layer and real-time analytics: Design and implement a scalable semantic layer with dynamic query translation to deliver real-time insights for conversational analytics. Integrate the semantic layer with AI/LLM platforms to provide low-latency, secure, and context-rich data access, optimized for high concurrency and aligned with enterprise governance standards.
- Ensure the reliability, availability, and scalability of data pipelines and systems through effective monitoring, alerting, and incident management. Implement best practices in reliability engineering, including redundancy, fault tolerance, and disaster recovery strategies.
- Collaborate closely with DevOps and infrastructure teams to ensure seamless deployment, operation, and maintenance of data systems.
- Mentor junior team members and lead communities of practice to deliver high-quality data and AI solutions while promoting best practices, standards, and adoption of reusable patterns.
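For illustration only, the pre-processing referenced above (extraction, chunking, embedding) might look like the following minimal sketch. It assumes the sentence-transformers package; the model name, chunk size, and overlap are hypothetical choices for the example, not a prescribed stack for this role.

```python
# Minimal illustrative sketch: chunk raw text and embed the chunks for a RAG index.
# Assumes the sentence-transformers package; model name and chunk parameters are
# illustrative placeholders only.
from sentence_transformers import SentenceTransformer


def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping character-based chunks."""
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk.strip():
            chunks.append(chunk)
    return chunks


def embed_chunks(chunks: list[str], model_name: str = "all-MiniLM-L6-v2"):
    """Embed each chunk; the resulting vectors would then be loaded into a vector store."""
    model = SentenceTransformer(model_name)
    return model.encode(chunks)  # one vector per chunk


if __name__ == "__main__":
    document = "Policyholders may file a claim within 30 days of a covered loss. " * 40
    chunks = chunk_text(document)
    vectors = embed_chunks(chunks)
    print(f"{len(chunks)} chunks, embedding dimension {len(vectors[0])}")
```

In a production pipeline, steps like these would typically be preceded by document extraction and followed by grounding and loading into a vector or graph store.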
- Design and develop graph database solutions for complex data relationships supporting AI systems. This includes developing and optimizing queries (e.g., Cypher, SPARQL) to enable complex reasoning, relationship discovery, and contextual enrichment for AI agents (a minimal Cypher sketch follows the Required Skills & Experience list below).
- Design and apply GenAI solutions to insurance-specific data use cases and challenges.
- Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the environment.

Required Skills & Experience
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, or a related field.
- 9+ years of data engineering experience, including data solutions, SQL and NoSQL, Snowflake, ETL/ELT tools, CI/CD, Big Data, cloud technologies (AWS/Google/Azure), Python/Spark, Data Mesh, Data Lake, or Data Fabric.
- Mastery-level data engineering and architecture skills, including deep expertise in data architecture patterns, data warehouses, data integration, data lakes, data domains, data products, business intelligence, and cloud technology capabilities.
- 5+ years of expertise with cloud platforms (AWS, GCP, or Azure) and containerization technologies (Docker, Kubernetes).
- 2+ years of data engineering experience focused on supporting Generative AI technologies.
- 3+ years of hands-on experience with Snowflake.
- 3+ years of experience building Data and AI pipelines that bring together structured, semi-structured, and unstructured data. This includes pre-processing with extraction, chunking, embedding, and grounding strategies, semantic modeling, and getting the data ready for models and agentic solutions.
- 2+ years of strong hands-on experience implementing production-ready, enterprise-grade GenAI data solutions.
- 2+ years of experience with prompt engineering techniques for large language models.
- 2+ years of experience implementing Retrieval-Augmented Generation (RAG) pipelines, integrating retrieval mechanisms with language models.
- Mastery in processing and leveraging unstructured data for GenAI applications.
- Mastery in implementing scalable AI-driven data systems supporting agentic solutions (AWS Lambda, S3, EC2, LangChain, LangGraph, MCP, A2A).
- 3+ years of strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow.
- 2+ years of experience with vector databases, graph databases, NoSQL, and document DBs, including design, implementation, and optimization (e.g., Amazon OpenSearch, GCP Vertex AI, Neo4j, Spanner Graph, Neptune, MongoDB, DynamoDB).
- Mastery in implementing data governance practices, including data quality, lineage, and data catalog capture, holistically, strategically, and dynamically on a large-scale data platform.
- Strong written and verbal communication skills and the ability to explain technical concepts to various stakeholders.
- Expert-level collaboration across teams, with strong decision-making, conflict-resolution, and relationship-building skills.
- Expertise in mentoring and developing junior AI or Data Engineers.
- Familiarity with evolving industry design patterns for AI.
- Strong planning, organization, and execution skills.
- Ability to provide thought leadership to dynamic and collaborative teams, demonstrating excellent interpersonal skills and time management capabilities.
- Ability to understand and align deliverables with departmental and organizational strategies and objectives.
- Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.
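As a hedged illustration of the graph-query work described in the responsibilities above, the sketch below runs a Cypher query through the official neo4j Python driver. The connection string, credentials, node labels (Policy, Claim), and relationship type are hypothetical placeholders, not details of any actual environment.

```python
# Illustrative sketch only: relationship discovery via Cypher with the neo4j Python driver.
# URI, credentials, labels, and relationship types are hypothetical placeholders.
from neo4j import GraphDatabase

URI = "neo4j://localhost:7687"   # placeholder connection string
AUTH = ("neo4j", "password")     # placeholder credentials

FIND_RELATED_CLAIMS = """
MATCH (p:Policy {policy_id: $policy_id})-[:HAS_CLAIM]->(c:Claim)
RETURN c.claim_id AS claim_id, c.status AS status
ORDER BY c.claim_id
"""


def related_claims(policy_id: str) -> list[dict]:
    """Return claims linked to a policy, e.g., to enrich context for an AI agent."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            result = session.run(FIND_RELATED_CLAIMS, policy_id=policy_id)
            return [record.data() for record in result]


if __name__ == "__main__":
    for row in related_claims("POL-1234"):
        print(row)
```

Query results like these would typically be passed to a retrieval or grounding step so the agent can reason over the relationships rather than over raw rows.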
- Leader and team player with a transformation mindset.
- Ability to translate complex technical topics into business solutions and strategies, as well as turn business requirements into a technical solution.

Nice to Have
- Experience in multi-cloud, hybrid AI solutions.
- Certifications in AI, GCP, or Snowflake.
- Experience in the P&C or Employee Benefits insurance industry.
- Knowledge of natural language processing (NLP) and computer vision technologies.
- Contributions to open-source AI projects or research publications in the field of Generative AI.

What We Offer
- Collaborative work environment with global teams.
- Competitive salary and comprehensive benefits.
- Continuous learning and professional development opportunities.
Job Title
Senior Data Engineer