Data Platform Engineer II - India

at CME Group

Mid Level · No visa sponsorship · Data Engineering

Posted 6 days ago

Compensation: Not specified
Currency: Not specified
City: Bengaluru
Country: India

Platform Engineer II focusing on Cloud Infrastructure and Data Governance, building self-service tools to boost developer productivity and data transparency. Design automated pipelines for metadata discovery and data lineage across on-prem and GCP. Engineer scalable CI/CD and data orchestration workflows using Argo Workflows and shell scripting. Collaborate with senior engineers to migrate data assets to GCP (BigQuery) and evolve the global tech stack while ensuring data security.

Data Platform Engineer II (GCP & Data Platforms)
Summary
The Platform Engineering team is a collective of highly skilled individuals bridging the gap between development and operations with a security-first mindset. We are looking for a Platform Engineer II to help us build the next generation of self-service capabilities that drive developer productivity and data transparency.
In this role, you will focus on the intersection of Cloud Infrastructure and Data Governance. You will be responsible for engineering automated pipelines that facilitate metadata discovery, data lineage, and integration across our hybrid-cloud ecosystem (On-prem to GCP). Your mission is to treat the "Platform as a Product," ensuring our data analytics and application teams can discover, trust, and deploy data assets with speed and security.
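The metadata-discovery pipelines described above boil down to crawling table and column information out of each source system and landing it in a catalog. Purely as a hypothetical illustration of that pattern (not CME Group's actual tooling), the sketch below walks a GCP project's BigQuery datasets with the public google-cloud-bigquery client and collects simple catalog records; the project ID and record shape are assumptions.

# Hypothetical metadata crawl: list BigQuery datasets/tables and record basic schema facts.
# Assumes the google-cloud-bigquery package is installed and default credentials are configured.
from google.cloud import bigquery

def crawl_metadata(project_id: str) -> list[dict]:
    client = bigquery.Client(project=project_id)
    records = []
    for dataset in client.list_datasets():                  # every dataset in the project
        for table_item in client.list_tables(dataset):      # every table in the dataset
            table = client.get_table(table_item.reference)  # fetch full table metadata
            records.append({
                "table": f"{table.project}.{table.dataset_id}.{table.table_id}",
                "columns": [field.name for field in table.schema],
                "num_rows": table.num_rows,
                "created": table.created.isoformat() if table.created else None,
            })
    return records

if __name__ == "__main__":
    for record in crawl_metadata("example-project"):         # placeholder project ID
        print(record)

In a production setting records like these would typically be pushed into a governance or catalog platform rather than printed, but the crawl-and-describe loop is the core of the work described here.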


Principal Accountabilities

  • Data Platform Engineering: Design and implement self-service tools and "Golden Paths" that reduce time-to-market for application and data teams.
  • Metadata & Discovery: Build and maintain automated pipelines to crawl metadata and establish lineage from diverse sources (Oracle, SQL Server, Cloud SQL, AlloyDB) into our centralized discovery and governance platforms.
  • Automated Workflows: Engineer scalable CI/CD and data orchestration workflows using Argo Workflows and shell scripting to automate complex infrastructure and data tasks.
  • Hybrid Cloud Integration: Collaborate on the architecture and migration of data assets from on-premises environments to GCP (BigQuery), ensuring data remains well-architected and secure (a minimal sketch follows this list).
  • Collaboration & Mentorship: Work closely with senior engineers to identify bottlenecks in the delivery process and contribute to the evolution of our global tech stack.

Skills and Experience Requirements
Experience Levels:

  • Total Experience: 3 to 5 years in a professional technical role (Software Engineering, Platform, or DevOps).
  • Relevant Cloud Experience: At least 2 years of hands-on experience specifically with Google Cloud Platform (GCP) and Kubernetes.
  • Data Experience: At least 1 year of experience working with data-centric tools (SQL, BigQuery, or ETL/ELT pipelines).

Required (Must Have):

  • Cloud Expertise: Proven experience with GCP core services, specifically BigQuery, Cloud SQL, AlloyDB, and IAM.
  • Database Proficiency: Strong command of SQL for querying complex schemas and managing data across RDBMS and Cloud Data Warehouses.
  • Orchestration & Automation: Hands-on experience with Kubernetes and Argo Workflows.
  • Scripting & Development: Strong proficiency in Shell Scripting and at least one high-level language such as Python or Java.
  • Infrastructure as Code: Experience with Git-based workflows and automation tools (e.g., Ansible, Terraform, or Jenkins).
  • Containerization: Deep understanding of Docker and Kubernetes-native deployment strategies.
  • Communication: Excellent oral and written communication skills, with the ability to work effectively in a global team environment.

Preferred (Good to Have):

  • Data Governance: Experience with data cataloging, metadata management, and lineage tools (e.g., Atlan, Collibra, or Dataplex).
  • Data Engineering Fundamentals: Familiarity with ETL/ELT patterns and movement of data from RDBMS (Oracle/SQL Server) to Cloud Data Warehouses.
  • Data Visualization: Exposure to Looker or Tableau, specifically regarding how these tools consume data and represent metadata.
  • Security Mindset: Understanding of DevSecOps principles, including secret management and least-privilege access in a data context.

What We Offer

  • A supportive environment fostering career progression and continuous learning.
  • Exposure to diverse products, asset classes, and high-scale data environments.
  • A competitive salary and comprehensive benefits package consistent with our global standards.

CME Group: Where Futures are Made

CME Group is the world’s leading derivatives marketplace. But who we are goes deeper than that. Here, you can impact markets worldwide. Transform industries. And build a career by shaping tomorrow. We invest in your success and you own it – all while working alongside a team of leading experts who inspire you in ways big and small. Problem solvers, difference makers, trailblazers. Those are our people. And we’re looking for more.

At CME Group, we embrace our employees' unique experiences and skills to ensure that everyone’s perspectives are acknowledged and valued. As an equal-opportunity employer, we consider all potential employees without regard to any protected characteristic.

Important Notice: Recruitment fraud is on the rise, with scammers using misleading promises of job offers and interviews to solicit money and personal information from job seekers. CME Group adheres to established procedures designed to maintain trust, confidence and security throughout our recruitment process. Learn more here.

Location: Bangalore - Bagmane Tridib

Time Type: Full time
