
Lead Software Engineer - Data Modernization
at J.P. Morgan
Posted 3 days ago
- Compensation: Not specified (USD)
- City: New York City
- Country: United States
Join JPMorgan Chase as a Lead Software Engineer focused on data modernization within the Consumer & Community Banking Card Rewards team. You will lead an agile engineering effort to build cloud-native data platforms that ingest, process, and serve large-scale batch and streaming data, and own end-to-end delivery of data warehousing migrations. You will provide technical leadership across Java, Python, Spark, Snowflake, and AWS, while driving CI/CD automation, security, and operational excellence. You will foster ownership, collaboration, and continuous learning, manage multi-workstream roadmaps, and define data platform architectures to meet scale, reliability, and cost targets.
Location: New York, NY, United States
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Lead Software Engineer at JPMorganChase within the Consumer & Community Banking Card Rewards team, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities
- Executes creative software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Develops secure high-quality production code, and reviews and debugs code written by others
- Leads a high-performing engineering team building and operating cloud-native data platforms that ingest, process, and serve large-scale batch and streaming data
- Owns end-to-end delivery for data warehousing and multi-terabyte to petabyte-scale migration initiatives, ensuring reliability, performance, security, and cost efficiency
- Provides technical leadership across Java, Python, Spark, Snowflake, and AWS, while driving strong Agile practice, CI/CD automation, and operational excellence
- Fosters a culture of ownership, collaboration, psychological safety, and continuous learning
- Owns multi-workstream roadmaps; plans releases, defines milestones, tracks progress, and removes blockers to hit scope, schedule, and quality targets
- Creates detailed estimates and work breakdown structures; manages dependencies, risks, and stakeholder expectations
- Defines and evolves data platform architecture for batch and streaming use cases (event-driven, microservices, warehouse patterns)
- Leads large-scale data migrations (on-prem to cloud), including data profiling, reconciliation, lineage, quality (e.g., deduplication, schema validation), and backfills
- Defines SLAs; implements monitoring and alerting; leads incident response, root cause analysis, and continuous improvement
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 5+ years applied experience
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Proficiency in automation and continuous delivery methods
- Proficient in all aspects of the Software Development Life Cycle
- Advanced understanding of agile methodologies and related practices such as CI/CD, application resiliency, and security
- Demonstrated proficiency in software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- 8+ years of hands-on practical experience in system design, application development, testing, operational stability, and CI/CD processes
- Hands-on practical experience developing Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions using key components such as Spark Core and Spark Streaming
- Strong coding skills in one or more languages (Java or Python)
- Experience building data warehouse platforms
- Cloud implementation experience with AWS, including:
  - AWS data services: proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, and Airflow or Lambda + Step Functions + EventBridge
  - Data de/serialization: expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON
  - AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
Preferred qualifications, capabilities, and skills
- Experience with Snowflake and Databricks
- Experience with generative AI
- In-depth knowledge of the financial services industry and its IT systems





