
DET-ISR-EUC Engineer -VDI - MDM-GDSN02

at Ernst & Young


Tech Lead · No visa sponsorship · Data Engineering

Posted 4 days ago

Compensation: Not specified
Location: Noida, India

EY is seeking a DET-ISR-EUC Engineer - VDI - MDM to design and maintain scalable data pipelines and data models. You will ingest, process, and transform large datasets using Kafka, Spark, and cloud services, ensuring data quality and availability for analytics. The role includes implementing ETL automation, CI/CD for data workflows, IaC, containerization with Docker/Kubernetes, and monitoring with Prometheus/Grafana. You will collaborate with data scientists and analysts, mentor junior engineers, and contribute to data governance and documentation.

At EY, we’re all in to shape your future with confidence. 

We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. 

Join EY and help to build a better working world. 

 

Designation: Data Engineer

 

Job Description

  • Develop and maintain scalable data pipelines using tools like Apache Kafka, Apache Spark, or AWS Glue to ingest, process, and transform large datasets from various sources, ensuring efficient data flow and processing.
  • Design and implement data models and schemas in data warehouses (e.g., Amazon Redshift, Google BigQuery) and data lakes (e.g., AWS S3, Azure Data Lake) to support analytics and reporting needs.
  • Collaborate with data scientists and analysts to understand data requirements, ensuring data availability and accessibility for analytics, machine learning, and reporting.
  • Utilize ETL tools and frameworks (e.g., Apache NiFi, Talend, or custom Python scripts) to automate data extraction, transformation, and loading processes, ensuring data quality and integrity.
  • Monitor and optimize data pipeline performance using tools like Apache Airflow or AWS Step Functions, implementing best practices for data processing and workflow management.
  • Write, test, and maintain scripts in Python, SQL, or Bash for data processing, automation tasks, and data validation, ensuring high code quality and performance.
  • Implement CI/CD practices for data engineering workflows using tools like Jenkins, GitLab CI, or Azure DevOps, automating the deployment of data pipelines and infrastructure changes.
  • Collaborate with DevOps teams to integrate data solutions into existing infrastructure, leveraging Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation for provisioning and managing resources.
  • Manage containerized data applications using Docker and orchestrate them with Kubernetes, ensuring scalability and reliability of data processing applications.
  • Implement monitoring and logging solutions using tools like Prometheus, Grafana, or ELK Stack to track data pipeline performance, troubleshoot issues, and ensure data quality.
  • Ensure compliance with data governance, security best practices, and data privacy regulations, embedding DevSecOps principles in data workflows.
  • Participate in code reviews and contribute to the development of best practices for data engineering, data quality, and DevOps methodologies.
  • Mentor junior data engineers, providing guidance on data engineering practices, data architecture, and DevOps tools and techniques.
  • Contribute to the documentation of data architecture, processes, and workflows for knowledge sharing, compliance, and onboarding purposes.
  • Demonstrate strong communication skills to collaborate effectively with cross-functional teams, including data science, analytics, and business stakeholders.
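The extract, transform, and load responsibilities above can be sketched as a minimal pipeline in plain Python. The CSV fields, the negative-amount quality rule, and the per-currency aggregation are illustrative assumptions, not part of this posting; production work would use the listed tools (Kafka, Spark, Airflow, NiFi) rather than in-memory strings.

```python
import csv
import io

# Hypothetical raw feed; in practice this would arrive via Kafka, S3, etc.
RAW_CSV = """user_id,amount,currency
1,19.99,USD
2,-5.00,USD
3,42.50,EUR
"""

def extract(source: str) -> list[dict]:
    """Read rows from a CSV source into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and drop rows that fail a basic data-quality check."""
    clean = []
    for row in rows:
        amount = float(row["amount"])
        if amount < 0:  # illustrative quality rule: reject negative amounts
            continue
        clean.append({
            "user_id": int(row["user_id"]),
            "amount": amount,
            "currency": row["currency"],
        })
    return clean

def load(rows: list[dict]) -> dict:
    """Aggregate per currency, standing in for a warehouse write."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["currency"]] = totals.get(row["currency"], 0.0) + row["amount"]
    return totals

totals = load(transform(extract(RAW_CSV)))
print(totals)
```

Separating the three stages into functions, as above, is what makes a pipeline testable and schedulable; an orchestrator such as Airflow would wrap each stage as a task.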

 

Desired Profile

  • Seeking a DevOps Engineer with 5+ years of hands-on cloud and DevOps experience, including significant leadership experience. Requires a Bachelor's or Master's degree in Computer Science.
  • Must have expert proficiency in Terraform and extensive experience across at least two major cloud platforms (AWS, Azure, GCP). Strong hands-on experience with Kubernetes, Helm charts, and designing/optimizing CI/CD pipelines (e.g., Jenkins, GitLab CI) is essential. Proficiency in Python and scripting (Bash/PowerShell) is also a must.
  • Valued experience includes leading cloud migrations, contributing to RFP/RFI processes, and mentoring teams. Excellent problem-solving, communication, and collaboration skills are critical. Experience with configuration management (Ansible, Puppet) and DevSecOps principles is required; OpenShift is a plus.

 

Experience

  • 10 years and above

 

Education

  • B.Tech. / BS in Computer Science

 

Technical Skills & Certifications

  • Certifications in cloud platforms (e.g., AWS Certified Solutions Architect, Azure Administrator, Google Professional Cloud Architect).
  • Terraform, Kubernetes, Python, CI/CD, Ansible, Security tools, Monitoring tools.

EY | Building a better working world

EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets.

Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow.

EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
