Tech Job Finder - Find Software, Technology Sales and Product Manager Jobs.

Data Engineer II

at J.P. Morgan

Bulge Bracket Investment Banks

Mid Level · No visa sponsorship · Data Engineering

Posted a month ago

Compensation: Not specified
Currency: Not specified
City: Mumbai
Country: India

Join JPMorgan Chase as a Data Engineer II supporting US Wealth Management from Mumbai. You will design, develop, test, and maintain scalable data pipelines and architectures, working across the data lifecycle with Spark, Python, and cloud data services. The role involves performance tuning, ETL/ELT implementation, ensuring data security and controls, and producing architecture and reporting artifacts. You will collaborate in an agile team to drive data-driven improvements and support production analytics and operational needs.

Location: Mumbai, Maharashtra, India

You thrive on diversity and creativity, and we welcome individuals who share our vision of making a lasting impact. Your unique combination of design thinking and experience will help us achieve new heights.

As a Data Engineer II at JPMorgan Chase within JPM US Wealth Management, you serve as a seasoned member of an agile team to design and deliver trusted data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. You are responsible for developing, testing, and maintaining critical data pipelines and architectures across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

  • Supports review of controls to ensure sufficient protection of enterprise data
  • Advises on and makes custom configuration changes in one to two tools to generate a product at the business's or customer's request, and updates logical or physical data models based on new use cases
  • Frequently uses SQL and understands NoSQL databases and their niche in the marketplace
  • Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development; gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems.
  • Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture.

 

Required qualifications, capabilities, and skills

 

  • Formal training or certification on software engineering concepts and 3+ years applied experience
  • Experience across the data lifecycle with Spark-based frameworks for end-to-end ETL, ELT, and reporting solutions, using key components such as Spark SQL and Spark Streaming.
  • Strong hands-on experience with the Big Data stack, including Spark and Python (Pandas, Spark SQL).
  • Good understanding of relational (RDBMS) and NoSQL databases, and of Linux/UNIX.
  • Strong knowledge of multi-threading and high-volume batch processing.
  • Proficiency in performance tuning for Python and Spark, along with the Autosys or Control-M scheduler.
  • Cloud implementation experience with AWS, including:
      • AWS Data Services: proficiency in Lake Formation, Glue ETL or EMR, S3, Glue Catalog, Athena, Kinesis or MSK, and Airflow or Lambda + Step Functions + EventBridge.
      • Data De/Serialization: expertise in at least two of these formats: Parquet, Avro, Fixed Width.
      • AWS Data Security: good understanding of security concepts such as Lake Formation, IAM, service roles, encryption, KMS, and Secrets Manager.
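The multi-threaded, high-volume batch processing called for above can be illustrated with a minimal, self-contained Python sketch (standard library only; the record batches and field names are hypothetical stand-ins for files landed in a data lake, and the transform is a stand-in for a Spark SQL step):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical record batches standing in for files landed in a data lake.
BATCHES = [
    [{"symbol": "AAPL", "price": "189.5"}, {"symbol": "MSFT", "price": "412.1"}],
    [{"symbol": "GOOG", "price": "141.8"}],
]

def transform(batch):
    """Parse raw string prices into floats -- a stand-in for a Spark SQL step."""
    return [{"symbol": r["symbol"], "price": float(r["price"])} for r in batch]

def run_pipeline(batches):
    # Transform batches concurrently across a thread pool, the way a
    # scheduler (e.g. Autosys or Control-M) might fan out batch jobs.
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(transform, batches))
    # "Load" step: flatten the transformed batches into one dataset.
    return [row for batch in results for row in batch]

if __name__ == "__main__":
    print(run_pipeline(BATCHES))
```

Note that `pool.map` preserves input order, so the flattened output is deterministic even though the transforms run concurrently.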

 

Preferred qualifications, capabilities, and skills

  • Proficient in all aspects of the Software Development Life Cycle.
  • Solid understanding of agile methodologies and practices such as CI/CD, application resiliency, and security.
  • Knowledge of Java and microservice architecture.
 

 
Be part of an agile team that works to enhance, design, and deliver data collection, storage, access, and analytics solutions.
