
Lead Software Engineer - Data Engineering (Python, AWS, ETL)
at J.P. Morgan
Posted 18 hours ago
- Compensation: Not specified
- Currency: Not specified
- City: Mumbai
- Country: India
Lead Software Engineer – Data Engineering at JPMorganChase within Asset and Wealth Management. You will lead the data platform, architect and implement ETL processes, and deliver scalable data pipelines for financial and market data. The role includes mentoring a team of developers and testers, guiding agile-scrum sprints, and ensuring high-quality coding standards and rigorous testing. Strong Python, SQL, and AWS skills, plus a focus on data quality, security, and lineage, are essential.
Location: Mumbai, Maharashtra, India
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Lead Software Engineer at JPMorganChase within Asset and Wealth Management, you are an integral part of an agile team that works to enhance, build, and deliver trusted market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you are responsible for leading the data platform that forms the backbone of a technology-enabled investment platform, and for designing, implementing, and maintaining extract-transform-load (ETL) systems with a strong focus on financial and capital markets data.
Job responsibilities
- Manages end-to-end development, enhancement, and maintenance of 55ip’s market data platform.
- Partners with multiple stakeholders to define and deliver innovative solutions to functional requirements.
- Identifies, prioritizes, and allocates tasks to deliver effectively on the firm’s technology objectives.
- Architects and develops data ETL processes and data objects.
- Builds, mentors, and manages a team of software developers and testers.
- Plans and executes agile-scrum sprints.
- Ensures the team delivers to high-quality coding standards through code reviews, conducts thorough unit and functional testing, and provides support during UAT and post-go-live phases.
Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 5+ years of applied experience.
- Proven experience in data management, ETL and ELT pipeline development, and large-scale data processing.
- Proficiency in Python, SQL, AWS, and any ETL tool.
- Strong understanding of data quality, security, and lineage best practices.
- Experience with cloud-based data warehouse migration and modernization.
- Experience with AWS services such as Redshift, S3, EC2, Lambda, Athena, EMR, AWS Glue, and AWS Data Pipeline.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and documentation abilities.
- Proven ability to collaborate effectively with business and technical stakeholders.
- Strong understanding of DevOps practices, CI/CD pipelines, and release management across multiple environments.

