Software Engineer III - Big Data PySpark, Java and AWS

at J.P. Morgan

Bulge Bracket Investment Banks

Mid Level · No visa sponsorship · Data Engineering

Posted 12 days ago

Compensation: Not specified
Currency: Not specified
City: Not specified
Country: United States

Join JPMorgan Chase as a Software Engineer III specializing in Java, PySpark, and AWS within the Cards Technology team. You will design and deliver data collection, storage, access, and analytics platforms, create secure and scalable data pipelines, and contribute to architecture, deployment, and continuous improvement of software applications. The role involves building production-ready code, maintaining data processing workflows, and collaborating on agile initiatives to solve complex data challenges across multiple data sources.

Location: Wilmington, DE, United States

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III – Java/PySpark/AWS Developer at JPMorgan Chase within the Consumer and Community Bank – Cards Technology Team, you are an integral part of an agile team that works to enhance, build, and deliver data collection, storage, access, and analytics in a secure, stable, and scalable way. Leverage your deep technical expertise and problem-solving capabilities to drive significant business impact and tackle a diverse array of challenges that span multiple data pipelines, data architectures, and other data consumers.  You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.

Job responsibilities

  • Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
  • Designs and delivers trusted data collection, storage, access, and analytics data platform solutions in a secure, stable, and scalable way
  • Defines database backup, recovery, and archiving strategy
  • Designs and develops data pipelines to ingest, store, and process data from multiple sources
  • Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
  • Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
  • Contributes to software engineering communities of practice and events that explore new and emerging technologies
  • Adds to team culture of diversity, opportunity, inclusion, and respect
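The pipeline-design responsibility above can be sketched in plain Python. This is an illustrative, stdlib-only example of ingesting records from heterogeneous sources (CSV and JSON Lines) and normalizing them to one schema; the field names (`customer_id`, `amount`) and source formats are hypothetical, not taken from the posting. In the role described, the same shape would typically be built with PySpark DataFrames reading from S3.

```python
import csv
import io
import json

# Hypothetical sketch: ingest records from multiple source formats and
# normalize them into one schema before storage.

def parse_csv(text):
    # Read CSV text into a list of dicts keyed by the header row.
    return list(csv.DictReader(io.StringIO(text)))

def parse_jsonl(text):
    # Read JSON Lines text: one JSON object per non-empty line.
    return [json.loads(line) for line in text.splitlines() if line.strip()]

def normalize(rec):
    # Unify key names across sources and coerce types.
    return {
        "customer_id": str(rec.get("customer_id") or rec.get("cust_id")),
        "amount": float(rec["amount"]),
    }

def ingest(sources):
    # sources: list of (format, payload) pairs.
    parsers = {"csv": parse_csv, "jsonl": parse_jsonl}
    records = []
    for fmt, payload in sources:
        records.extend(normalize(r) for r in parsers[fmt](payload))
    return records

rows = ingest([
    ("csv", "cust_id,amount\nA1,10.5\nA2,3.0"),
    ("jsonl", '{"customer_id": "B7", "amount": 2.25}'),
])
print(rows)
```

In a PySpark setting, `parse_*` would become `spark.read` calls against S3 paths or Glue Catalog tables, and `normalize` a DataFrame transformation.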

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 3+ years applied experience
  • Proficiency in coding in Java and PySpark, and experience with AWS cloud technologies, including S3
  • Experience with SQL-based technologies (e.g., MySQL, Oracle DB)
  • Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages
  • Cloud implementation experience with AWS, including:
    • AWS data services: proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, Kinesis or MSK, and Airflow or Lambda + Step Functions + EventBridge
    • Data de/serialization: expertise in at least two of these formats: Parquet, Iceberg, Avro, JSON-LD
    • AWS data security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager
  • Experience with statistical data analysis, the ability to determine appropriate tools and data patterns for analysis, and proficiency in automation and continuous delivery methods
  • Overall knowledge of the Software Development Life Cycle and a solid understanding of agile practices such as CI/CD, Application Resiliency, and Security
  • Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)
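The de/serialization requirement in the list above can be illustrated with a simple round trip. The posting names Parquet, Iceberg, Avro, and JSON-LD; JSON Lines stands in here so the sketch needs no external libraries. With PySpark, the equivalent idea is `df.write.parquet(path)` followed by `spark.read.parquet(path)`.

```python
import json

# Illustrative serialize/deserialize round trip using the JSON Lines
# convention: one JSON document per line.
records = [{"id": 1, "event": "open"}, {"id": 2, "event": "close"}]

# Serialize each record to a line of JSON text.
serialized = "\n".join(json.dumps(r, sort_keys=True) for r in records)

# Deserialize back into Python objects and confirm nothing was lost.
roundtrip = [json.loads(line) for line in serialized.splitlines()]
assert roundtrip == records
```

Columnar formats like Parquet additionally carry a schema and compress by column, which is why they are preferred over row-oriented text formats for analytics workloads.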

Preferred qualifications, capabilities, and skills

  • Snowflake knowledge or experience
  • In-depth knowledge of the financial services industry and its IT systems
  • Experience building data lakes, data platforms, and data frameworks, and building or designing Data-as-a-Service APIs
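A "Data as a Service" API, mentioned in the preferred qualifications above, can be sketched as a framework-agnostic read handler that serves rows from a dataset by key. The route shape (`/customers/<id>`) and the in-memory dataset are hypothetical stand-ins; a real service would sit in front of a data lake or warehouse.

```python
import json

# Hypothetical in-memory dataset standing in for a governed data store.
DATASET = {"A1": {"customer_id": "A1", "segment": "retail"}}

def handle_get(path):
    # Resolve a GET request path to (status_code, JSON body).
    parts = path.strip("/").split("/")
    if len(parts) == 2 and parts[0] == "customers":
        row = DATASET.get(parts[1])
        if row is not None:
            return 200, json.dumps(row)
    # Unknown route or missing key: respond 404 with a JSON error body.
    return 404, json.dumps({"error": "not found"})

status, body = handle_get("/customers/A1")
```

Keeping the routing logic a pure function like this makes it easy to unit-test before wiring it into an HTTP framework or an AWS Lambda behind API Gateway.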

 

Design and deliver market-leading technology products in a secure and scalable way as a seasoned member of an agile team
