Applied Machine Learning Engineer, Circuit Design - New College Grad 2026

at NVIDIA

Graduate · No visa sponsorship · Data Science / AI / ML

Posted 11 hours ago

Compensation: $116,000 – $218,500 USD
City: Not specified
Country: United States

Applied Machine Learning Engineer, Circuit Design for New College Grad 2026 at NVIDIA. You will work within a multi-functional team on pre-silicon and post-silicon circuit design data, circuit/layout optimization, and SPICE correlation. You will translate requirements into data science problems, architect and build solutions, and test and release models that integrate with existing ML tools. You will analyze datasets, raise and validate hypotheses, extract relevant features, build models, and optimize algorithms until they reach the desired quality of results (QOR).

Our work at NVIDIA is dedicated to a computing model focused on visual and AI computing. For two decades, NVIDIA has pioneered visual computing, the art and science of computer graphics, with our invention of the GPU. The GPU has also proven spectacularly effective at solving some of the most complex problems in computer science. Today, NVIDIA's GPUs simulate human intelligence, running deep learning algorithms and acting as the brain of computers, robots, and self-driving cars that can perceive and understand the world. We are looking to grow our company and teams with the smartest people in the world, and there has never been a more exciting time to join us!

What you'll be doing:

  • Work within a multi-functional team on projects involving pre-silicon and post-silicon custom circuit design and related data, circuit/layout optimization, and SPICE correlation.

  • Work on projects with applications including silicon data analysis, manufacturing process variation analysis, and VLSI circuit design and timing.

  • Translate requirements into data science problems, then architect and build solutions.

  • Test and release models that integrate with existing machine learning and visualization tools within the organization.

  • Analyze datasets, raise and validate hypotheses, extract relevant features, and build models on top of them.

  • Optimize the models and algorithms until they reach the desired quality of results (QOR); a minimal sketch of this loop follows below.
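
The bullets above describe a standard applied-ML loop: ingest circuit data, extract features, fit a model, and iterate until a quality-of-results target is met. The sketch below illustrates that loop in Python on synthetic data; the feature set, the model choice (scikit-learn's RandomForestRegressor), and the QOR threshold are illustrative assumptions, not details taken from the posting.

    # Minimal sketch of the modeling loop described above, on synthetic data.
    # Features, model choice, and the QOR threshold are illustrative assumptions;
    # the posting does not specify NVIDIA's actual datasets or tools.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Stand-in for pre-/post-silicon measurements: a few layout/process
    # features and a path-delay target (all purely synthetic).
    n = 2000
    X = rng.normal(size=(n, 4))  # e.g., wire length, fanout, Vt mix, temperature
    y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.3 * X[:, 2] ** 2 + rng.normal(scale=0.1, size=n)

    X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

    # Iterate over model capacity until validation error meets the QOR target.
    QOR_TARGET_MAE = 0.3  # hypothetical acceptance threshold
    for n_trees in (50, 100, 200, 400):
        model = RandomForestRegressor(n_estimators=n_trees, random_state=0)
        model.fit(X_train, y_train)
        mae = mean_absolute_error(y_val, model.predict(X_val))
        print(f"trees={n_trees}: validation MAE = {mae:.3f}")
        if mae <= QOR_TARGET_MAE:
            print("QOR target met; model is ready to test and release.")
            break

In practice the "optimize until QOR" step would sweep richer hyperparameters, features, or architectures; a single capacity sweep keeps the sketch short.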

What we need to see:

  • MS or PhD in Electrical/Computer Engineering (or equivalent experience).

  • Experience with VLSI, circuit design, CMOS device physics, timing, ASICs, and EDA is a strict requirement for this role.

  • Demonstrated ability to write code in Python and C++.

  • Experience in applied math, machine learning, and software programming.

Ways to stand out from the crowd:

  • Experience with ML/DL algorithms using frameworks such as TensorFlow, PyTorch, or Spark is a plus.

  • Experience working with multiple levels and teams across the organization (engineering/research, product, sales, and marketing).

  • Prior experience in CMOS layout drawing, including schematic-to-layout translation and DRC/LVS compliance, is a definite plus.

  • Effective verbal/written communication and technical presentation skills.

  • A proven record of continuous learning and of sharing findings across the team.

NVIDIA is widely considered to be one of the technology world’s most desirable employers. We have some of the most forward-thinking and hardworking people in the world working for us. If you're creative and autonomous, we want to hear from you!

#LI-Hybrid

Your base salary will be determined based on your location, experience, and the pay of employees in similar positions. The base salary range is $116,000 – $189,750 USD for Level 2 and $136,000 – $218,500 USD for Level 3.

You will also be eligible for equity and benefits.

Applications for this job will be accepted at least until January 27, 2026.

This posting is for an existing vacancy.

NVIDIA uses AI tools in its recruiting processes.

NVIDIA is committed to fostering a diverse work environment and proud to be an equal opportunity employer. As we highly value diversity in our current and future employees, we do not discriminate (including in our hiring and promotion practices) on the basis of race, religion, color, national origin, gender, gender expression, sexual orientation, age, marital status, veteran status, disability status or any other characteristic protected by law.
