ENGIE - International Supply & Energy Management

gems.engie.com

1 Job

1,470 Employees

About the Company

ENGIE supports global businesses in their decarbonization journey through reliable, flexible, and affordable energy solutions. By connecting clients to energy assets via global markets and local operations, ENGIE makes the energy transition tangible, scalable, and impactful.
Through long-term supply, risk and flexibility management, and sustainable energy services, ENGIE helps clients navigate complex markets, reduce emissions, and secure their energy future, leveraging deep market expertise and an extensive international footprint.

Key figures (International Supply & Energy Management scope):
* 500 TWh of energy delivered annually
* 59 GW of renewable & flexible assets managed
* 200,000+ global business clients
* 4.3 GW of green PPAs signed
* 250 LNG cargoes handled per year
* 19 offices around the world

Our commitments
* Deliver 24/7 carbon-free electricity for global business needs
* Help businesses manage energy risks and market exposure
* Enable smarter energy systems through data, digital tools, and expertise

Listed Jobs

Company Name
ENGIE - International Supply & Energy Management
Job Title
Data Engineer
Job Description
Role Summary:
Design, develop, and maintain large-scale data pipelines and data architecture for energy analytics. Ensure data availability, quality, and performance while collaborating with data scientists, business, and IT teams. Deploy solutions on cloud platforms (AWS, Azure, GCP) and optimize processing with big-data frameworks (Spark, Databricks).

Expectations:
* Deliver robust, scalable pipelines that support real-time and historical asset data analysis.
* Maintain high data quality and governance standards.
* Resolve incidents swiftly and implement continuous improvements.
* Adapt pipeline designs to evolving data scientist requirements.
* Uphold coding standards, documentation, and CI/CD practices.

Key Responsibilities:
* Architect and implement ETL/ELT processes using Python, SQL, Spark, and Databricks (see the sketch after this listing).
* Configure and manage cloud services (AWS S3, Glue, Athena, or equivalents).
* Build and optimize data warehouses, ensuring performance and reliability.
* Collaborate with data scientists to translate analytical needs into technical solutions.
* Monitor production pipelines, troubleshoot issues, and conduct root-cause analysis.
* Apply DevOps principles: Infrastructure as Code, CI/CD, and automated testing.
* Participate in Agile ceremonies and sprint planning.
* Enforce data governance and security best practices.

Required Skills:
* Strong programming in Python and SQL.
* Experience with Spark and Databricks.
* Proven knowledge of ETL tools and pipeline orchestration.
* Cloud proficiency (AWS, Azure, or GCP): S3, Glue, Athena, or equivalents.
* Data warehousing concepts and best practices.
* DevOps, IaC, and CI/CD fundamentals.
* Agile methodology and team collaboration.
* Analytical problem-solving and attention to detail.
* Excellent communication in English; French required.

Required Education & Certifications:
* Master's degree in engineering or equivalent with an IT focus.
* Minimum 3 years of professional experience designing and maintaining data pipelines.
* Demonstrated proficiency in big-data and cloud technologies.
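To illustrate the kind of pipeline work this role describes, here is a minimal PySpark sketch: it reads raw asset telemetry from object storage, normalizes it, applies a simple data-quality gate, and writes partitioned Parquet for downstream analytics. The bucket paths, column names, and quality threshold are hypothetical placeholders, not ENGIE systems.

```python
# Minimal ETL sketch in PySpark (all paths and columns are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("asset-telemetry-etl").getOrCreate()

# Extract: raw telemetry landed in object storage (placeholder path).
raw = spark.read.json("s3a://example-raw-bucket/asset-telemetry/")

# Transform: normalize types, derive a partition column, drop duplicates.
clean = (
    raw.withColumn("measured_at", F.to_timestamp("measured_at"))
       .withColumn("mw", F.col("mw").cast("double"))
       .withColumn("event_date", F.to_date("measured_at"))
       .dropDuplicates(["asset_id", "measured_at"])
)

# Data-quality gate: fail fast if too many readings are missing.
total = clean.count()
nulls = clean.filter(F.col("mw").isNull()).count()
if total == 0 or nulls / total > 0.01:  # 1% threshold is an arbitrary example
    raise ValueError(f"Quality gate failed: {nulls}/{total} null readings")

# Load: partitioned Parquet, queryable from Athena or a warehouse.
(clean.write.mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3a://example-curated-bucket/asset-telemetry/"))
```

On Databricks, a job like this would typically run as a scheduled workflow, with the quality gate wired into alerting and the code deployed through CI/CD, matching the monitoring and DevOps responsibilities listed above.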
Brussels, Belgium
On site
28-11-2025