Tech Job Finder - Find Software, Technology Sales and Product Manager Jobs.

Snowflake Data Engineer

at Snowflake

Industry: Not specified


Mid Level · No visa sponsorship · Data Engineering

Posted 4 hours ago


Compensation: Not specified (USD)

City: Not specified
Country: United States

Join Snowflake's Product and Data Science team as a Snowflake Data Engineer. You will design, build, and maintain scalable data pipelines using Snowflake and cloud technologies, optimize costs, and ensure data quality and governance. You'll develop dashboards with Streamlit, collaborate with product and data science teams, and help scale the enterprise data warehouse. Preference is given to candidates with strong Python and SQL skills, experience building Streamlit apps, and familiarity with cloud platforms.

JOB DESCRIPTION

Snowflake is about empowering enterprises to achieve their full potential — and people too. With a culture that’s all in on impact, innovation, and collaboration, Snowflake is the sweet spot for building big, moving fast, and taking technology — and careers — to the next level.

We are looking for a skilled Snowflake Data Engineer to join our Product and Data Science team. The ideal candidate brings a solid background in data engineering, with deep expertise in Snowflake and cloud technologies.

In this role, you will design, build, and maintain robust data pipelines, ensuring our data infrastructure is scalable, reliable, and efficient. You will partner closely with product and data scientists, providing tools, automation, and best practices to accelerate their work, improve data quality, and ensure consistency across pipelines and dashboards.

Additionally, you will play a critical role in cost optimization for data operations, owning efficient data modeling and pipeline performance. This position is also key in building and scaling our enterprise data warehouse, helping to shape the future of our data ecosystem.

Responsibilities

  • Design, develop, and maintain scalable data pipelines using Snowflake and cloud technologies.

  • Own and manage cost optimization for Product and Data Science team operations.

  • Optimize pipelines for performance, reliability, and efficiency.

  • Build and maintain dashboards and reporting platforms using Streamlit.

  • Collaborate with cross-functional teams to gather requirements and deliver robust data solutions.

  • Work with modern AI tools, including Cursor, to accelerate development.

  • Implement data governance, security best practices, and access control policies.

  • Develop and enforce data quality checks, SLA monitoring, and dependency tracking.

  • Utilize version control systems (e.g., Git) and manage deployment workflows.

  • Proactively troubleshoot and resolve data-related issues.
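As an illustration only (not part of the posting), the "data quality checks" responsibility above usually means completeness and validity rules enforced in pipeline code before data is published. A minimal Python sketch, with hypothetical function and field names:

```python
# Hypothetical row-level completeness check, as might run inside a
# pipeline task before loading a batch. Names are illustrative only.

def check_batch(rows, required_fields=("id", "event_ts")):
    """Split rows into (passed, failed) based on required-field completeness."""
    passed, failed = [], []
    for row in rows:
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        (failed if missing else passed).append(row)
    return passed, failed

batch = [
    {"id": 1, "event_ts": "2024-01-01T00:00:00Z"},
    {"id": 2, "event_ts": None},  # fails the completeness rule
]
ok, bad = check_batch(batch)
print(len(ok), len(bad))  # 1 1
```

In practice such checks would feed SLA monitoring and alerting rather than a simple print, but the pass/fail split is the core pattern.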

Qualifications

  • Bachelor’s degree in Computer Science, Engineering, or a related field.

  • Proven hands-on experience with Snowflake, including data modeling, ETL/ELT development, and performance tuning.

  • Advanced proficiency in Python, with experience scripting and automating data workflows.

  • Demonstrated experience building Streamlit applications integrated with Snowflake.

  • Strong command of SQL for complex data querying and analysis.

  • Familiarity with cloud platforms such as AWS, Azure, or GCP, and services like S3, Redshift, or BigQuery.

  • Strong problem-solving and communication skills.

  • Ability to thrive both independently and as part of a collaborative, fast-paced team environment.
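For a sense of scale (again, not part of the posting): "complex data querying" in a warehouse context typically involves analytic constructs such as window functions. A minimal, self-contained sketch using SQLite in place of Snowflake, with an invented `events` table:

```python
import sqlite3

# In-memory stand-in for a warehouse table; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events(user_id INT, ts TEXT, amount REAL);
INSERT INTO events VALUES
  (1, '2024-01-01', 10),
  (1, '2024-01-02', 20),
  (2, '2024-01-01', 5);
""")

# Running total per user via a window function -- the same SQL pattern
# works in Snowflake.
rows = conn.execute("""
SELECT user_id, ts,
       SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
FROM events
ORDER BY user_id, ts
""").fetchall()
print(rows)
# [(1, '2024-01-01', 10.0), (1, '2024-01-02', 30.0), (2, '2024-01-01', 5.0)]
```

The `PARTITION BY ... ORDER BY` clause computes a per-user cumulative sum without collapsing rows, which a plain `GROUP BY` cannot do.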

Preferred Qualifications

  • Master’s degree in Computer Science, Engineering, or a related discipline.

  • Strong understanding of data warehousing principles, architecture, and best practices.

  • Experience working within Agile development frameworks (e.g., Scrum, Kanban).

Snowflake is growing fast, and we’re scaling our team to help enable and accelerate our growth. We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake.

How do you want to make your impact?

For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

