
Software Engineer III - Data / ETL Developer

at J.P. Morgan

Bulge Bracket Investment Banks


Mid Level · No visa sponsorship · Data Engineering

Posted 19 days ago


Compensation: Not specified

Currency: Not specified

City: Not specified

Country: United States

Join JPMorgan Chase's Consumer and Community Banking Risk Technology team as a Software Engineer III to design, build, and maintain scalable data and ETL solutions. You'll develop production-grade code and data pipelines using Python, Databricks, Spark, and Snowflake, integrate APIs on AWS, and support data migrations and validations. The role requires working within an agile team to produce architecture artifacts, ensure data security and integrity, and drive improvements through analysis and visualization of large datasets.

Location: Plano, TX, United States

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III at JPMorgan Chase within the Corporate Technology - Consumer and Community Banking Risk Technology team, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

 

Job responsibilities

  • Designs, develops, and troubleshoots software solutions, leveraging both conventional and innovative approaches to solve complex technical problems 
  • Creates secure, high-quality production code and maintains efficient algorithms that integrate seamlessly with relevant systems
  • Produces architecture and design artifacts for complex applications, ensuring that software development aligns with established design constraints 
  • Develops workflows and ETL pipelines using Python, Databricks, and Spark, optimizing data processing and transformation at scale 
  • Implements and manages data solutions using Snowflake, including data modeling, performance tuning, and secure data sharing 
  • Frequently utilizes SQL, understands the role of NoSQL databases in the marketplace, and applies Spark for distributed data processing and analytics 
  • Gathers, analyzes, and synthesizes large, diverse data sets to develop visualizations and reporting that drive continuous improvement of software applications and systems 
  • Proactively identifies hidden issues and patterns in data, using these insights to enhance coding practices and system architecture
  • Supports the review of controls to ensure robust protection of enterprise data and updates logical or physical data models based on evolving use cases
  • Advises and implements custom configuration changes in select tools to deliver tailored solutions for business or customer needs
  • Fosters a team culture of diversity, opportunity, inclusion, and respect 
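The ETL responsibilities above follow a standard extract-transform-load shape. As a rough illustration only (the role's actual stack is Databricks, Spark, and Snowflake; this sketch uses plain Python with hypothetical record fields so the pattern is visible without any platform dependencies):

```python
# Minimal sketch of an extract-transform-load (ETL) pipeline.
# Field names ("id", "amount", "region") are hypothetical examples,
# not taken from the job posting.

def extract(raw_rows):
    """Ingestion step: parse raw CSV-like strings into dicts."""
    records = []
    for row in raw_rows:
        fields = row.split(",")
        records.append({
            "id": int(fields[0]),
            "amount": float(fields[1]),
            "region": fields[2].strip(),
        })
    return records

def transform(records):
    """Transformation step: drop invalid rows, normalize values."""
    return [
        {**r, "region": r["region"].upper()}
        for r in records
        if r["amount"] >= 0  # discard records with negative amounts
    ]

def load(records, target):
    """Load step: upsert cleaned records into the target, keyed by id."""
    for r in records:
        target[r["id"]] = r
    return target

raw = ["1, 250.0, east", "2, -10.0, west", "3, 99.5, east"]
warehouse = load(transform(extract(raw)), {})
# warehouse now holds ids 1 and 3; id 2 was dropped in transform
```

In a Spark/Databricks setting each step would operate on distributed DataFrames rather than Python lists, but the staged structure — ingest, validate/transform, load — is the same.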

 

Required qualifications, capabilities, and skills

  • Formal training or certification in software/data engineering concepts and 3+ years applied experience  
  • Advanced proficiency in one or more programming languages/frameworks (e.g., Python 3, ETL, Spark, Snowflake, Databricks, SQL, NoSQL), with working knowledge of AWS services (e.g., ECS, Lambda, API Gateway) and Terraform-based infrastructure deployments
  • Strong experience with statistical data analysis, including selecting appropriate tools and identifying data patterns
  • Demonstrated experience in API-driven development, particularly using FastAPI on AWS ECS with API Gateway integration, and running APIs from AWS Lambda
  • Experience across the data engineering lifecycle, from ingestion to analysis and productization 
  • Ability to customize and implement changes in tools to generate products 
  • Significant experience with data migration and platform migration for data projects, including planning, execution, and post-migration support
  • Strong skills in data validations and reconciliations (recons) to ensure data integrity and accuracy during migrations and transformations
  • Solid understanding of the Software Development Life Cycle (SDLC) and agile methodologies, including CI/CD, application resiliency, and security best practices
  • Hands-on experience with deployment pipelines such as Git, Jules, Jenkins, and Spinnaker, along with strong skills in building test scripts and using True CD for coding and testing
  • Demonstrated knowledge of software applications and technical processes within technical disciplines such as cloud, artificial intelligence, machine learning, or mobile
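The data validation and reconciliation ("recon") skills listed above typically mean comparing a source and a target after a migration. A minimal sketch of that idea, assuming hypothetical tables with `id` and `amount` columns (not specified in the posting):

```python
# Illustrative post-migration reconciliation: compare row counts,
# key coverage, and a numeric checksum between source and target
# to flag data that did not survive the migration intact.
from collections import Counter

def recon(source_rows, target_rows, key="id"):
    """Return a dict describing discrepancies between two row sets."""
    issues = {}
    # 1. Row-count check
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    # 2. Key coverage: which source keys are missing from the target
    src_keys = Counter(r[key] for r in source_rows)
    tgt_keys = Counter(r[key] for r in target_rows)
    missing = src_keys - tgt_keys
    if missing:
        issues["missing_keys"] = sorted(missing)
    # 3. Checksum: sum of a numeric column, a cheap integrity signal
    src_sum = sum(r["amount"] for r in source_rows)
    tgt_sum = sum(r["amount"] for r in target_rows)
    if src_sum != tgt_sum:
        issues["amount_sum"] = (src_sum, tgt_sum)
    return issues

source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.0}]
target = [{"id": 1, "amount": 10.0}]
result = recon(source, target)
# result reports the row-count mismatch, the missing key 2,
# and the differing amount sums
```

At warehouse scale the same three checks would be expressed as SQL aggregates (COUNT, anti-joins on keys, SUM/HASH_AGG) rather than in-memory Python, but the validation logic is identical.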

 

Preferred qualifications, capabilities, and skills

  • Familiarity with modern data engineering technologies
  • Exposure to cloud technologies (e.g., AWS)

Design and deliver market-leading technology products in a secure and scalable way as a seasoned member of an agile team
