
Software Engineer III - Python, Databricks, AWS
at J.P. Morgan
Posted 15 hours ago
- Compensation: Not specified
- City: Hyderabad
- Country: India
- Currency: Not specified
Senior-member role on an agile Corporate Technology team building scalable data engineering solutions. You will design, develop, and maintain Python-based data pipelines and workflows using AWS and Databricks, integrate diverse data sources, and optimize processing performance. The role emphasizes system design, CI/CD, resiliency, security, and collaboration with cross-functional teams to deliver stable, secure, and scalable products. Requires proven experience in data engineering, Python, and cloud platforms.
Location: Hyderabad, Telangana, India
Job summary
As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You will drive critical technology solutions that support the firm’s business objectives across multiple technical areas.
Job responsibilities
- Develop and maintain scalable data engineering applications to process and analyze large datasets
- Build and optimize data pipelines and workflows using AWS and Databricks platforms
- Create efficient and robust code in Python to automate data processing tasks
- Integrate data from diverse sources to support unified data models and analytics initiatives
- Monitor and optimize data processing performance for efficient data handling and storage
- Collaborate with cross-functional teams to understand data requirements and deliver effective solutions
Required qualifications, capabilities, and skills
- Formal training or certification in software engineering concepts and 3+ years applied experience
- Hands-on experience in system design, application development, testing, and operational stability
- Proficient in Python programming
- Experience in developing, debugging, and maintaining data engineering applications using Python, AWS, and Databricks
- Knowledge of data integration and ETL processes
- Solid understanding of agile methodologies and engineering practices such as CI/CD, application resiliency, and security
- Strong problem-solving and analytical skills
Preferred qualifications, capabilities, and skills
- Familiarity with modern front-end technologies
- Exposure to cloud technologies
- Experience with large-scale data analytics
- Strong communication and collaboration abilities
