
Data Engineer III - Data Modeler and Engineer

at J.P. Morgan

Mid Level · No visa sponsorship · Data Engineering

Posted 6 days ago

Compensation: Not specified
Currency: Not specified
City: Bengaluru
Country: India

Be part of a dynamic team where your distinctive skills will contribute to a winning culture. As a Data Engineer III at JPMorgan Chase within Corporate Compliance Technology, you will design and deliver reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. You will develop, test, and maintain essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's objectives. You will design and implement data models, work with structured and unstructured data, and collaborate with data engineers, analysts, and stakeholders to ensure data needs are met while maintaining security and regulatory compliance.

Location: Bengaluru, Karnataka, India

Be part of a dynamic team where your distinctive skills will contribute to a winning culture.

As a Data Engineer III at JPMorgan Chase within Corporate Compliance Technology, you will be a seasoned member of an agile team, tasked with designing and delivering reliable data collection, storage, access, and analytics solutions that are secure, stable, and scalable. Your responsibilities include developing, testing, and maintaining essential data pipelines and architectures across diverse technical areas, supporting various business functions to achieve the firm's business objectives.

 

Job responsibilities

  • Design and implement data models (conceptual, logical, and physical) to support business requirements.
  • Work with structured and unstructured data from multiple sources and integrate them into Databricks.
  • Collaborate with data engineers, analysts, and business stakeholders to understand data needs.
  • Apply and promote best practices in data engineering, security, risk management, regulatory compliance, and automation.
  • Create and maintain documentation for data models, pipelines, and architectures.
  • Collaborate cross-functionally with software engineering, analytics, and business teams to deliver data solutions aligned with business objectives.

 

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 3+ years of applied experience.
  • Hands-on experience in data modeling, with strong SQL and Python skills for data manipulation.
  • Strong expertise in Databricks, Delta Lake, and advanced queries.
  • Hands-on experience with big data, the AWS Glue catalog, Data Mesh, relational databases (Oracle), and SQL and NoSQL stores.
  • Strong problem-solving skills and the ability to work in an agile environment.
  • Proficiency in enterprise-grade languages (e.g., Python) and in data modeling (e.g., ERWin) and engineering tools.
  • Experience in data engineering, with a proven track record of leading large-scale data architecture and transformation projects in hybrid cloud and on-prem environments.
  • Expertise in data modeling, database design, big data technologies, cloud data platforms (AWS, Azure, GCP), and modern data tools (e.g., Snowflake, Databricks, Teradata).
  • Strong experience architecting and optimizing transactional and analytical data systems.
  • Knowledge of ETL/ELT processes and data pipeline development.
  • Solid background in data security, risk management, and regulatory compliance.


Preferred qualifications, capabilities, and skills
  • Experience with cloud platforms (AWS, Azure, or GCP) is a plus.
  • Experience with OLAP tools (TM1, Essbase, Atoti, etc.) is a plus.
  • Experience in the Compliance and Risk domain (Operational and Compliance Risk, Surveillance, and Financial Crimes) is preferred.
  • Familiarity with data governance, security, and compliance best practices.

 

