
Job Title : AWS Big Data Lead Software Engineer


Company : JPMorgan Chase & Co.


Location : Columbus, OH


Created : 2026-04-23


Job Type : Full Time


Job Description

Job responsibilities

- Design and build new applications utilizing leading-edge technologies, and modernize existing applications
- Implement batch and real-time software components consistent with architectural best practices of reliability, security, operational efficiency, cost-effectiveness, and performance
- Ensure quality of deployed code via automated unit, integration, and acceptance testing
- Collaborate with multi-national agile development, support, and business teams to meet sprint objectives
- Participate in all agile meetings and rituals, including daily standups, sprint planning, backlog reviews, demos, and retrospectives
- Provide level 2 support for production systems
- Learn and apply system processes, methodologies, and skills for the development of secure, stable code and systems
- Perform hands-on application development leveraging distributed compute such as Apache Flink or Spark on very large datasets
- Design and develop applications that leverage AWS infrastructure, deploying software components on AWS using common compute and storage services such as EC2, EKS, Lambda, and S3
- Lead and deliver projects from concept to production across the PNI (Personalization and Insights) platform

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 5+ years of applied experience
- Hands-on practical experience in frameworks, system design, application development, testing, and operational stability
- Experience with Apache Spark, Apache Flink, or similar large-scale data processing engines
- Experience with distributed datastores (e.g., Cassandra, Redshift)
- Experience designing, developing, and deploying software components on AWS using common compute and storage services such as EC2, EKS, Lambda, and S3
- Experience with Big Data / distributed / cloud technology (AWS Big Data services such as Lambda, Glue, and EMR; performance tuning; streaming; Kafka; entitlements; etc.)
- Experience with Apache Spark, Ray, or similar large-scale data processing engines
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, Big Data, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills

- Experience building ETL/feature processing pipelines
- Experience using workflow orchestration tools such as Airflow, Kubeflow, etc.
- Experience using Terraform to deploy infrastructure-as-code to public cloud
- Experience with Linux scripting such as Bash, KSH, or Python
- Certified AWS Cloud Practitioner, Developer, or Solutions Architect strongly preferred