Role: Lead Data Engineer
Primary Skills: Python and PySpark (mandatory), AWS services and pipelines
Location: Hyderabad / Pune / Coimbatore
Experience: 7 - 9 years

Job Summary:
We are looking for a Lead Data Engineer who will be responsible for building AWS data pipelines as per requirements. The candidate should have strong analytical, design, and problem-solving skills. Based on stakeholders' requirements, they should be able to propose solutions to the customer for review and discuss the pros and cons of different solution designs and optimization strategies.

Responsibilities:
- Provide technical and development support to clients to build and maintain data pipelines.
- Develop data mapping documents listing business and transformation rules.
- Develop, unit test, deploy, and maintain data pipelines.
- Design a storage layer for storing tabular, semi-structured, and unstructured data.
- Design pipelines for batch and real-time processing of large data volumes.
- Analyze source specifications and build data mapping documents.
- Identify and document applicable non-functional code sets and reference data across insurance domains.
- Understand profiling results and validate data quality rules.
- Utilize data analysis tools to construct and manipulate datasets to support analyses.
- Collaborate with and support Quality Assurance (QA) in building functional scenarios and validating results.

Requirements:
- 5+ years of experience developing and maintaining modern ingestion pipelines using technologies such as AWS pipelines, Lambda, Spark, and Apache NiFi.
- Basic understanding of the MLOps lifecycle (data prep -> model training -> model deployment -> model inference -> model re-training).
- Ability to design data pipelines for batch and real-time processing using Lambda, Step Functions, API Gateway, SNS, and S3 (see the illustrative PySpark sketch below).
- Hands-on experience with AWS Cloud and its native components such as S3, Athena, Redshift, and Jupyter notebooks.
- Requirements gathering: active involvement in requirements discussions with project sponsors, defining project scope and delivery timelines, and design and development.
- Strong in Spark (Scala and Python) pipelines, both ETL and streaming.
- Strong experience with metadata management tools such as AWS Glue.
- Strong experience coding in languages such as Java and Python.
- Good to have: AWS Developer certification.
- Good to have: experience with Postman (API testing) and Apache Airflow or similar schedulers.
- Working with cross-functional teams to meet strategic goals.
- Experience in high-volume data environments.
- Critical thinking and excellent verbal and written communication skills.
- Strong problem-solving and analytical abilities; able to work and deliver individually.
- Good knowledge of data warehousing concepts.

Desired Skill Set: Lambda, Step Functions, API Gateway, SNS, S3 (unstructured data), DynamoDB (semi-structured data), Aurora PostgreSQL (tabular data), AWS SageMaker, AWS CodeCommit/GitLab, AWS CodeBuild, AWS CodePipeline, AWS ECR.

About the Company:
ValueMomentum is among the fastest-growing insurance-focused IT services providers in North America. Leading insurers trust ValueMomentum with their core, digital, and data transformation initiatives. Having grown consistently by 24% every year, we now have over 4,000 employees. ValueMomentum is committed to integrity and to ensuring that each team and employee is successful. We foster an open work culture where employees' opinions are valued. We believe in teamwork and cultivate a sense of fun, fellowship, and pride among our employees.
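For illustration only, below is a minimal sketch of the kind of batch ETL pipeline described in the Requirements section: a PySpark job that reads raw JSON from S3, applies a simple transformation rule, and writes partitioned Parquet for downstream Athena/Redshift queries. The bucket names, paths, and column names are hypothetical and not part of the role description.

```python
# Hypothetical batch ETL sketch (illustrative only): read raw JSON from S3,
# apply a simple transformation rule, and write partitioned Parquet back to S3.
# Bucket names, paths, and columns are assumptions, not part of this posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("policy-batch-etl")  # assumed job name
    .getOrCreate()
)

# Read semi-structured source data landed in the raw zone.
raw = spark.read.json("s3://example-raw-zone/policies/")

# Example transformation rule: standardize a date column and derive a partition key.
curated = (
    raw
    .withColumn("effective_date", F.to_date("effective_date", "yyyy-MM-dd"))
    .withColumn("effective_year", F.year("effective_date"))
)

# Write to the curated zone as partitioned Parquet for downstream queries.
(
    curated.write
    .mode("overwrite")
    .partitionBy("effective_year")
    .parquet("s3://example-curated-zone/policies/")
)

spark.stop()
```

In practice, a job like this would typically be orchestrated with Step Functions or Apache Airflow and triggered on a schedule or by S3 events, in line with the batch/real-time design work described above.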
Benefits:
We at ValueMomentum offer you the opportunity to grow by working alongside the experts. Some of the benefits you can avail are:
- Competitive compensation package comparable to the best in the industry.
- Career advancement: individual career development, coaching, and mentoring programs for professional and leadership skill development.
- Comprehensive training and certification programs.
- Performance management: goal setting, continuous feedback, and year-end appraisal. Rewards and recognition for extraordinary performers.
- Benefits: comprehensive health benefits, wellness and fitness programs, paid time off, and holidays.
- Culture: a highly transparent organization with an open-door policy and a vibrant culture.
Job Title: Lead AWS Data Engineer