Tech Job Finder - Find Software, Technology Sales and Product Manager Jobs.

Software Engineer III- Data Engineering

at J.P. Morgan

Bulge Bracket Investment Banks


Mid Level | No visa sponsorship | Data Engineering

Posted 25 days ago


Compensation: Not specified
Currency: Not specified
City: Bengaluru
Country: India

Join the Planning and Analysis - Data Platform team to architect and build scalable, event-driven applications on AWS that integrate with Databricks and support finance-related data workflows. You will design and implement RESTful APIs in Python (FastAPI/Django), manage PostgreSQL operational databases, and contribute to ETL and analytics pipelines. The role involves full lifecycle development, CI/CD automation, and close collaboration with product, data, and infrastructure teams to ensure secure, performant, and maintainable solutions.

Location: Bengaluru, Karnataka, India

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

The Planning and Analysis - Data Platform Team is at the forefront of managing and optimizing finance-related data. We focus on effective data lake utilization, data governance, and seamless data integration through advanced ETL processes. Our goal is to deliver high-quality, timely data in a controlled manner, reducing risk and cost. We are looking for a Senior Application Engineer skilled in AWS, event-driven architecture, RESTful API design, and modern application development, with expertise in Python frameworks, database management, and Databricks for data engineering.

Key Responsibilities

  • Architect, build, and maintain scalable applications on AWS using event-driven patterns (e.g., Lambda, SNS/SQS, EventBridge).
  • Create robust RESTful APIs using Python frameworks such as FastAPI and Django, ensuring high performance and security.
  • Design, implement, and optimize operational databases, primarily PostgreSQL, including schema design, indexing, and query optimization.
  • Collaborate with data teams to integrate applications with Databricks, supporting ETL pipelines, data processing, and analytics workflows.
  • Participate in the full software development lifecycle, including requirements gathering, design, coding, testing, deployment, and maintenance.
  • Implement CI/CD pipelines, automate deployments, and monitor application health using AWS-native and third-party tools.
  • Work closely with cross-functional teams (product, data, infrastructure) and document technical solutions, architecture, and best practices.
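As a rough illustration of the event-driven pattern named in the first responsibility, the sketch below shows a plain-Python, Lambda-style handler consuming SQS-shaped records. It runs locally with no AWS SDK; the event shape mirrors the SQS record format, and `process_record` and its payload fields are hypothetical placeholders, not part of the posting:

```python
import json

def process_record(payload: dict) -> dict:
    # Hypothetical business logic: mark each finance record as processed.
    return {**payload, "processed": True}

def handler(event: dict, context=None) -> dict:
    """Minimal AWS Lambda-style handler for an SQS-triggered function.

    SQS delivers messages under event["Records"], each carrying a JSON
    string in its "body" field.
    """
    results = []
    for record in event.get("Records", []):
        payload = json.loads(record["body"])
        results.append(process_record(payload))
    return {"statusCode": 200, "batchSize": len(results)}

# Local invocation with a sample SQS-shaped event:
sample_event = {"Records": [{"body": json.dumps({"id": 1, "amount": 250.0})}]}
response = handler(sample_event)
print(response)
```

In a real deployment the same handler would be wired to an SQS queue (or SNS/EventBridge) as its trigger, which is what makes the pattern event-driven: the code reacts to messages rather than polling.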

 

Required qualifications, capabilities, and skills

  • Proven experience with AWS services (EC2, Lambda, S3, RDS, SNS/SQS, EventBridge, IAM, CloudFormation).
  • Strong understanding and practical experience designing event-driven systems and microservices.
  • Deep knowledge of REST principles and hands-on experience building APIs with FastAPI and Django.
  • Advanced proficiency in Python, with experience in application engineering and scripting.
  • Solid experience with PostgreSQL, including performance tuning and operational management.
  • Good understanding of Databricks platform, Spark, and data engineering concepts.
  • Experience with CI/CD, infrastructure as code, and application monitoring.
  • Excellent problem-solving, communication, and teamwork abilities.
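On the PostgreSQL performance-tuning point above, the core idea is that an index turns a full table scan into an index lookup. The sketch below demonstrates this portably with Python's built-in sqlite3 as a stand-in (the table, column names, and data are made up for illustration); PostgreSQL uses the same `CREATE INDEX` DDL, with `EXPLAIN` to inspect the plan:

```python
import sqlite3

# In-memory database with a small hypothetical trades table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, desk TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (desk, amount) VALUES (?, ?)",
    [("rates", 100.0), ("fx", 250.0), ("rates", 75.5)],
)

query = "SELECT * FROM trades WHERE desk = ?"

# Without an index on `desk`, the planner reports a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, ("fx",)).fetchall()
print(before[0][-1])

# After adding an index, the planner switches to an index search.
conn.execute("CREATE INDEX idx_trades_desk ON trades (desk)")
after = conn.execute("EXPLAIN QUERY PLAN " + query, ("fx",)).fetchall()
print(after[0][-1])
```

The same before/after check in PostgreSQL (via `EXPLAIN`, or `EXPLAIN ANALYZE` for actual timings) is the usual first step when tuning a slow filtered query.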

 

Preferred qualifications, capabilities, and skills

 

  • Familiarity with modern front-end technologies.
  • Exposure to cloud technologies.

You will design and deliver market-leading technology products in a secure and scalable way as a seasoned member of an agile team.
