About FM: FM is a 190-year-old, Fortune 500 commercial property insurance company with more than 6,000 employees and a unique focus on science and risk engineering. Serving over a quarter of the Fortune 500 and major corporations globally, FM delivers data-driven strategies that enhance resilience, ensure business continuity, and empower organizations to thrive. FM India, located in Bengaluru, is a strategic location for driving FM's global operational efficiency, allowing the company to leverage the country's talented workforce and advance its capabilities to serve clients better.

Role Title: Lead Data Engineer

Position Summary: This role is responsible for analysis, data modeling, data collection, data integration, and preparation of data for consumption. The Data Engineer creates and manages data infrastructure, designs and implements data pipelines, and verifies data. Along with the team, the role ensures the highest standards of data quality, security, and compliance. It also implements methods to improve data reliability and quality, combining raw information from different sources into consistent data sets. This role must be proficient in DataOps and able to provide technical expertise to other Data Engineers. Incumbents design and build data and application solutions and integrations, which may involve diverse data and development platforms (including third-party systems), software, hardware, technologies, and tools, using available approved technologies and recommending solution options. This is the third-level position in the Data Engineer job family. Those holding this position are typically assigned to integrated project teams for medium to large projects and act as the lead for smaller projects. The role holder must also be able to work independently.
Job Responsibilities:

Data Acquisition:
- Possess and continually grow knowledge of structured and unstructured data sources within each product journey (Underwriting and Risk; Client Service, Sales and Marketing; Claims; Account and Location Engineering) as well as emerging data sources (purchased data sets, external data, etc.)
- Partner with product owners, developers, solution architects, business analysts, data engineers, data analysts, data scientists, and others to understand data needs
- Develop solutions using data modeling techniques and technologies such as ER-Studio, Postgres, SQL Server, Azure Data Factory, Kafka, SSIS, and others as required
- Validate that solutions are accurate through detailed and disciplined testing methodologies
- Ensure tables and views are designed for data integrity, efficiency, and performance, and are easy to comprehend

Move and Store Data (data flows, infrastructure pipelines, ETL/ELT, structured and unstructured data movement and storage solutions):
- Design data models and data flows into and out of databases
- Understand and design data relationships between business and data subject areas
- Follow standards for naming conventions, code documentation, and code review
- Support data exploration and transformation needs
- Conduct data cleansing and support other team members with data cleansing tasks as needed
- Conduct data profiling to identify data anomalies and resolve issues
- Execute data preparation tasks
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure technologies

Support Users and Production Applications:
- Manage and address operational data issues by establishing workarounds and/or bringing in cross-functional teams to solve issues in a timely manner
- Support developers, data analysts, and data scientists who need to interact with data in the data warehouse
- Analyze and assess reported data quality issues, quickly identifying the root cause
- Consult DBAs and team members on configuration and maintenance of the warehouse infrastructure
- Monitor system performance and act on optimization opportunities
- Monitor storage capacity and reliability
- Fix production issues quickly, with appropriate validation and deployment steps
- Provide clear and professional communication to users, management, and teammates
- Provide ad hoc data extracts and analysis to respond to tactical business needs

Participate in Effective Execution of Team Priorities:
- Solve complex problems with on-time delivery
- Identify work tasks and capture them in the team backlog
- Organize known tasks and prioritize work as needed
- Resolve colliding priorities and escalate as needed
- Provide production support
- Network with product teams to keep abreast of database changes as well as business process changes that result in data interpretation changes
Skills and Experience: 5-7 years of experience required to perform essential job functions
- Ability to read and create data models; knowledge of relational (3rd Normal Form) and dimensional/warehouse (Inmon/Kimball) database theory
- Design, build, and maintain data solutions
- Skilled with database clustering
- Expertise with databases including Postgres, SQL Server, and Data Lake
- Knowledge of Azure cloud applications
- ETL/ELT design
- Programming languages (SQL, C#, Python, PowerShell, KSQL)
- Collaboration and standards: experience with GraphQL, peer reviews, and adherence to coding and quality standards

Must Have Skills:
- Data Engineering Tools: proficiency in SQL, ADO, Azure Data Factory (ADF), Kafka, Azure Service Bus (ASB), and both stream and batch processing
- CI/CD and Quality Assurance: experience with continuous integration/deployment pipelines and implementing data quality checks
- Cloud and Storage Technologies: familiarity with unstructured data storage, data lakes, cloud-native development, and containerization
- Software Quality and Security: strong understanding of software quality practices, security principles, and API integration

Education and Certifications: 4-year Bachelor's Degree or Master's Degree, preferably in Computer Science, Information Technology, Computer Engineering, or equivalent experience

Work Location: Bengaluru
Job Title: Lead Data Engineer [T500-19182]