The Senior Data Engineer will lead the implementation of a modern data platform based on data lakehouse principles, focusing on scalable data ingestion, transformation, and governance. The role requires strong expertise in data engineering technologies and collaboration with analytics and business teams to optimize data as a valuable asset in the evolving European energy markets.
When you join EDF Trading, you'll become part of a diverse international team of experts who challenge conventional ideas, test new approaches, and think outside the box. Energy markets evolve rapidly, so our team needs to remain agile, flexible, and ready to spot opportunities across all the markets we trade: power, gas, LNG, LPG, oil, and environmental products. EDF Group and our customers all over the world trust us to manage their assets in the most effective and efficient manner and to protect them through expert risk management. Trading for over 20 years, it's experience that makes us leaders in the field. Energy is what we do.

Become part of the team and you will be offered a great range of benefits, which include (location dependent): hybrid working, a personal pension plan, private medical and dental insurance, bi-annual health assessments, corporate gym memberships, an electric car lease programme, childcare vouchers, a cycle-to-work scheme, season ticket loans, volunteering opportunities, and much more.

Gender balance and inclusion are very high on the agenda at EDF Trading, so you will become part of an ever-diversifying family of around 750 colleagues based in London, Paris, Singapore, and Houston. Regular social and networking events, both physical and virtual, will ensure that you always feel connected to your colleagues and the business.

Who are we? We are EDF Trading, part of the EDF Group - a world leader in low-carbon, sustainable electricity generation. Join us, make a difference, and help shape the future of energy.

Data is Energy. EDF Trading is a data business. Trading is transitioning into a data-driven business, and high-quality data and agile analysis are becoming the differentiator. EDF Trading has a leading footprint in the European energy markets and wants to monetise and optimise data as an asset. The European energy space is complex and has a huge appetite for data.
Power production from renewables in response to weather, capacity limitations across borders, and storage optimisation modelling are just some of the complex data opportunities we trade on every day. We're looking for talented people who share our passion for data to join our team and seize these opportunities with us.

The Data team is responsible for providing business solutions aimed at extracting value from large amounts of data. Its activities include collecting market data, building analysis tools, processing real-time data streams, data governance, and data science. This role focuses on building the foundational data platform that enables all other data services.

Main responsibilities:
- Drive the implementation of a modern data platform based on data lakehouse principles.
- Design data storage solutions and create scalable ingestion and transformation workflows.
- Capture and manage data lineage; ensure data quality and consistency.
- Collaborate on data modelling and adhere to governance and security standards.
- Stay current with emerging technologies and participate in agile processes.
- Mentor team members and contribute to platform architecture.
- Collaborate with the data support team to monitor and resolve incidents.

Required skills:
- Hands-on experience with data lakehouse implementations using open standards such as Parquet and Apache Iceberg.
- Data transformation and lineage tools such as dbt or Apache Spark.
- Distributed data processing frameworks such as Spark.
- Strong SQL expertise and familiarity with Git.
- Understanding of cloud and storage concepts such as Azure ADLSv2.
Desirable skills:
- Experience with query engines such as Trino or Dremio.
- Vendor platforms such as Databricks or Microsoft Fabric.
- Analytical databases such as ClickHouse.
- Containerisation and orchestration with Kubernetes.
- Workflow orchestration tools such as Airflow or Dagster.
- Streaming data platforms such as Apache Kafka.
- Observability and monitoring of data pipelines.
- Knowledge of data governance and compliance best practices.

The person specification highlights a hands-on approach, flexibility, a positive attitude, quick understanding of complex problems, a passion for building quality systems, strong communication skills, and the ability to work effectively in a multi-faceted team.

Hours of work are 8.30am – 5.30pm, Monday to Friday, with hybrid working arrangements.