
AWS Solution Architect

at Capgemini

Industry: Not specified

Mid Level · No visa sponsorship · AWS/GCP/Azure DevOps

Posted 19 hours ago

Compensation: Not specified
Currency: Not specified
City: Chennai
Country: India

Capgemini Invent is seeking an AWS Solution Architect to design, develop, and maintain scalable data pipelines using AWS services such as Kinesis, TwinMaker, IoT Core, ECS, Athena, CloudFormation, Lambda, and S3, together with Snowflake. The role requires hands-on data engineering, CI/CD, and DevOps practices, ensuring data quality, governance, and security across pipelines and storage. You will collaborate with data scientists, analysts, and application developers to deliver high-quality data solutions, and work with big data tools and timeseries data to build robust data architectures. Experience in Digital Twin technologies and working knowledge of both SQL and NoSQL databases are highly valued.
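As a purely illustrative sketch of the kind of streaming pipeline the role describes, the snippet below shows a Lambda-style handler that decodes Kinesis records and keeps only the rows a downstream Snowflake staging load could use. The payload fields (`sensor_id`, `timestamp`) and the event shape are assumptions for illustration, not details from the posting; a real deployment would also write the accepted rows to S3 for ingestion into Snowflake.

```python
import base64
import json

def handler(event, context=None):
    """Decode Kinesis records, drop malformed payloads, and return the
    clean rows. In a real pipeline the accepted rows would be staged to
    S3 and loaded into Snowflake (e.g. via Snowpipe or COPY INTO)."""
    rows = []
    for record in event.get("Records", []):
        # Kinesis delivers the payload base64-encoded under kinesis.data
        raw = base64.b64decode(record["kinesis"]["data"])
        try:
            payload = json.loads(raw)
        except json.JSONDecodeError:
            continue  # basic data-quality gate: skip malformed JSON
        # require the fields a downstream timeseries table would need
        # (hypothetical schema, chosen only for this example)
        if "sensor_id" in payload and "timestamp" in payload:
            rows.append(payload)
    return {"accepted": len(rows), "rows": rows}
```

The same decode-validate-stage shape applies whether the consumer is Lambda, an ECS task, or a PySpark structured-streaming job.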

At Capgemini Invent, we believe difference drives change. As inventive transformation consultants, we blend our strategic, creative, and scientific capabilities, collaborating closely with clients to deliver cutting-edge solutions. Join us to drive transformation tailored to our clients' challenges of today and tomorrow: informed and validated by science and data, superpowered by creativity and design, all underpinned by technology created with purpose.

Your Role

  • Must have working knowledge of AWS Kinesis, AWS TwinMaker, AWS IoT Core, AWS ECS, AWS Athena, AWS CloudFormation, AWS Lambda, and Amazon S3.
  • Design, develop, and maintain scalable data pipelines using Snowflake.
  • Proven experience in Digital Twin technologies, preferably in maritime or industrial domains.
  • Collaborate with data scientists, analysts, and application developers to deliver high-quality data solutions.
  • Hands-on experience with CI/CD tools and DevOps practices; ensure data quality, governance, and security across all pipelines and storage layers.
  • Write efficient and maintainable code in Python and PySpark; develop and optimize SQL queries for data extraction and transformation.
  • Experience with big data tools such as Hadoop, Spark, and Kafka; with relational databases such as Microsoft SQL Server, MySQL, PostgreSQL, and Oracle; and with NoSQL databases such as Hadoop, Cassandra, and MongoDB.
  • Strong analytical skills for working with structured, semi-structured, and unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management; implement data quality checks and monitoring to ensure data accuracy and reliability.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' stores.
  • Strong problem-solving skills with an emphasis on sustainable and reusable development.
  • Experience building and optimizing 'big data' pipelines, architectures, and datasets.
  • Demonstrable knowledge and expertise in working with timeseries data, and working knowledge of delivering data engineering solutions.
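The bullets above repeatedly stress data-quality checks on timeseries data. As a minimal, purely illustrative sketch (the field names `sensor_id`, `timestamp`, `value` and the monotonic-timestamp rule are assumptions for this example, not requirements from the posting), a validator might look like:

```python
from datetime import datetime

def validate_timeseries(rows):
    """Split timeseries rows into (valid, rejected), enforcing parseable
    ISO timestamps, numeric values, and non-decreasing timestamps per
    sensor -- the kind of quality gate a pipeline would run before load."""
    last_seen = {}            # sensor_id -> last accepted timestamp
    valid, rejected = [], []
    for row in rows:
        try:
            ts = datetime.fromisoformat(row["timestamp"])
            value = float(row["value"])
        except (KeyError, ValueError, TypeError):
            rejected.append(row)      # missing or malformed fields
            continue
        sensor = row.get("sensor_id", "unknown")
        if sensor in last_seen and ts < last_seen[sensor]:
            rejected.append(row)      # out-of-order reading
            continue
        last_seen[sensor] = ts
        valid.append({"sensor_id": sensor, "timestamp": ts, "value": value})
    return valid, rejected
```

The same checks translate directly to PySpark column expressions or SQL constraints when the volumes outgrow a single process.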

Your Profile

  • Provide innovative solutions to the platform engineering problems faced in the project, solving them with technically superior code and skills.
  • Where possible, document the choice of technology and the use of integration patterns, helping to create a knowledge-management artefact that can be reused in similar areas.
  • Create and apply best practices to deliver the project with clean code.
  • Work innovatively and proactively to fulfil the project's needs.

What You Will Love About Working Here

  • We recognize the importance of flexible work arrangements. Whether through remote work or flexible hours, you will get an environment that supports a healthy work-life balance.
  • Your career growth is at the heart of our mission. Our array of career growth programs and diverse professions is crafted to support you in exploring a world of opportunities.
  • Equip yourself with valuable certifications in the latest technologies such as Generative AI.

Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong heritage of over 55 years, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by its market-leading capabilities in AI, generative AI, cloud, and data, combined with its deep industry expertise and partner ecosystem.
