Tech Job Finder - Find Software, Technology Sales and Product Manager Jobs.

Senior Data Engineer, AVP

at Deutsche Bank


Mid Level · No visa sponsorship · Data Engineering

Posted a month ago

Compensation: Not specified
Currency: Not specified
City: Pune
Country: India

Deutsche Bank is hiring a Senior Data Engineer to design, build and maintain scalable PySpark/DBT/BigQuery data pipelines on Google Cloud Platform for transaction monitoring and compliance. The role involves implementing data quality frameworks, contributing to DevOps automation, and collaborating across Cloud, Security, Data, and Risk & Compliance teams. Candidates should have strong hands-on data engineering experience (Java/Scala/Python/DBT), data warehousing knowledge (ideally BigQuery), and experience with CI/CD and enterprise hybrid cloud environments.

Senior Data Engineer, AVP

Job ID: R0418489
Full/Part-Time: Full-time
Regular/Temporary: Regular
Listed: 2026-01-21
Location: Pune

Position Overview

Job Title: Senior Data Engineer, AVP

Location: Pune, India

Role Description

Our Technology, Data and Innovation (TDI) strategy focuses on strengthening engineering expertise, introducing an agile delivery model, and modernising the bank's IT infrastructure through long-term investment and the adoption of cloud computing.

You will be working in the Transaction Monitoring and Data Controls team, designing, implementing, and operationalising Java components.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you'll enjoy:

  • Best-in-class leave policy
  • Gender-neutral parental leave
  • 100% reimbursement under the childcare assistance benefit (gender neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive hospitalization insurance for you and your dependents
  • Accident and term life insurance
  • Complimentary health screening for employees aged 35 and above

Your key responsibilities

  • Design, build, and maintain scalable and reliable PySpark/DBT/BigQuery data pipelines, predominantly on Google Cloud Platform (GCP), to process high-volume transaction data for regulatory and internal compliance monitoring.
  • Implement robust data quality frameworks and monitoring solutions to ensure the accuracy, completeness, and timeliness of data within our critical transaction monitoring systems.
  • Contribute to DevOps capabilities to ensure maximum automation of our applications.
  • Collaborate across TDI areas such as Cloud Platform, Security, Data, and Risk & Compliance to create optimal solutions for the business, increasing re-use, establishing best practice, and sharing knowledge.
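The first two responsibilities pair pipeline transforms with quality gates. A minimal, framework-agnostic sketch of such a batch quality check (the field names, rules, and thresholds are illustrative assumptions, not the bank's actual schema):

```python
from datetime import datetime, timedelta

def check_batch_quality(records, max_age_hours=24, now=None):
    """Run basic completeness/accuracy/timeliness checks on a batch of
    transaction records (dicts) and return a count of failures per rule."""
    now = now or datetime.now()
    failures = {"missing_id": 0, "negative_amount": 0, "stale": 0}
    for rec in records:
        if not rec.get("txn_id"):        # completeness: transaction ID present
            failures["missing_id"] += 1
        if rec.get("amount", 0) < 0:     # accuracy: no negative amounts
            failures["negative_amount"] += 1
        booked = rec.get("booked_at")
        if booked is None or now - booked > timedelta(hours=max_age_hours):
            failures["stale"] += 1       # timeliness: within the SLA window
    return failures
```

In a production PySpark/DBT setting the same rules would typically be expressed as DataFrame filters or DBT tests so they run where the data lives, rather than in driver-side Python.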

Your skills and experience

  • Expert hands-on data engineering using at least one of:
    • Java/Scala/Kotlin with a toolset such as Apache Spark, Dataflow/Apache Beam, or Apache Flink
    • Python with a toolset such as PySpark or Dataflow/Apache Beam
    • SQL-based development using DBT
  • Professional experience with at least one data warehousing technology (ideally Google BigQuery), including knowledge of partitioning, clustering, and cost/performance optimization strategies.
  • Hands-on experience writing and maintaining DevOps pipelines in at least one CI/CD tool such as TeamCity, Jenkins, or GitHub Actions.
  • Experience contributing to software design and architecture, including meeting non-functional requirements (e.g., reliability, scalability, observability, testability) and understanding relevant architecture styles and their trade-offs, e.g., data warehouse, ETL, ELT, monolith, batch, and incremental loading vs stateless processing.
  • Experience navigating and engineering within a secure, enterprise hybrid cloud environment in a large, regulated, and complex technology landscape.
  • Experience working with a globally distributed team across locations, time zones, and diverse cultures, with excellent verbal and written communication skills.
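The incremental-loading trade-off named above can be sketched with a hypothetical high-watermark pattern (the `booked_at` field and in-memory lists are illustrative; in practice the predicate would be pushed down into BigQuery or a DBT incremental model):

```python
def incremental_load(source_rows, watermark):
    """Return only rows newer than the last-processed watermark, plus the
    advanced watermark to persist for the next run. Rows are dicts keyed
    by a monotonically increasing 'booked_at' value."""
    new_rows = [r for r in source_rows if r["booked_at"] > watermark]
    # Advance the watermark only if new rows arrived; otherwise keep it as-is.
    new_watermark = max((r["booked_at"] for r in new_rows), default=watermark)
    return new_rows, new_watermark
```

Compared with stateless full reprocessing, this trades a small piece of persisted state (the watermark) for much less data scanned per run, which matters for cost on high-volume transaction tables.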

How we’ll support you

  • Training and development to help you excel in your career
  • Coaching and support from experts in your team
  • A culture of continuous learning to aid progression
  • A range of flexible benefits that you can tailor to suit your needs

About us and our teams

Please visit our company website for further information:

https://www.db.com/company/company.html

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.
