
GCP Data & AI Engineer, AS

at Deutsche Bank

Mid Level · No visa sponsorship · Data Engineering

Posted 2 days ago

Compensation: Not specified
Currency: Not specified
City: Pune
Country: India

Hands-on GCP Data & AI Engineer responsible for designing, developing and maintaining end-to-end data pipelines, data preparation/transformation, and AI/ML workloads using Python, SQL and GCP services. The role includes building and operationalizing AI and RAG pipelines (Vertex AI, vector DBs), infrastructure as code with Terraform, REST API hosting, CI/CD, and L3 support for deployed workloads.

GCP Data & AI Engineer, AS

Job ID: R0415266 | Full/Part-Time: Full-time
Regular/Temporary: Regular | Listed: 2026-01-12
Location: Pune

Position Overview

Job Title: GCP Data & AI Engineer, AS

Location: Pune, India

Role Description:

This is a GCP Data & AI Engineer role covering end-to-end development of integrations, data preparation/synthesis/transformation, data quality, and data storage workloads, along with use of the prepared data for advanced data analytics and various AI use cases. It also encompasses L3 support activities for the implemented workloads.

What we’ll offer you

As part of our flexible scheme, here are just some of the benefits that you’ll enjoy:

  • Best-in-class leave policy
  • Gender-neutral parental leave
  • 100% reimbursement under the childcare assistance benefit (gender-neutral)
  • Sponsorship for industry-relevant certifications and education
  • Employee Assistance Program for you and your family members
  • Comprehensive hospitalization insurance for you and your dependents
  • Accident and term life insurance
  • Complimentary health screening for employees aged 35 and above

Your key responsibilities:

  • Design, develop and maintain data pipelines on GCP using Python and SQL.
  • Apply Agile methodologies and ETL/ELT, data movement and data processing skills.
  • Work with Cloud Composer to manage and process batch data jobs efficiently.
  • Develop and optimize complex SQL queries for data analysis, extraction and transformation.
  • Develop and deploy Google Cloud services using Terraform.
  • Consume and host REST APIs using Python.
  • Build and deploy AI and RAG pipelines.
  • Design, build and deploy agentic workflows.
  • Operationalize AI applications.
  • Knowledge of RAG, vector databases, multi-agent systems and the ADK agentic framework, along with the A2A protocol and agent cards.
  • Experienced with the Vertex AI platform.
  • Implement CI/CD pipelines using GitHub Actions.
  • Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
  • Collaborate with the team using Jira, Confluence and other tools.
  • Ability to quickly learn new and existing technologies; strong problem-solving skills.
  • Write advanced SQL and Python scripts.
  • Devise novel approaches for designing, implementing and deploying robust, scalable AI systems.
  • Google Cloud Professional Data Engineer certification is an added advantage.

Your skills and experience

  • 6–10 years of IT experience as a hands-on technologist.
  • Proficient in Python.
  • Proficient in SQL.
  • Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, Cloud Run and the Google ADK; GKE is good to have.
  • Hands-on experience building and managing AI workloads, and hosting and consuming REST APIs.
  • Proficient in Terraform (HashiCorp).
  • Experienced with GitHub and GitHub Actions.
  • Experienced in CI/CD.
  • Experience automating ETL testing using Python and SQL.
  • Apigee is good to have.
  • Bitbucket is good to have.
  • Understanding of LLMs (Gemini) and embedding models.
  • Experienced in prompt engineering.

How we’ll support you

  • Training and development to help you excel in your career
  • Coaching and support from experts in your team
  • A culture of continuous learning to aid progression
  • A range of flexible benefits that you can tailor to suit your needs

About us and our teams

Please visit our company website for further information:

https://www.db.com/company/company.html

We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively.
Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group.
We welcome applications from all people and promote a positive, fair and inclusive work environment.
