- Company Name: LR TECHNOLOGIES - GROUPE
- Job Title: Data Engineer (Ingénieur(e) Data)
- Job Description:
Role Summary: Design, develop and industrialize end‑to‑end data pipelines to collect, clean, model, catalogue and deliver high‑quality data for analytics across a shared data platform.
Expectations:
- Deliver robust, scalable, and monitored data solutions that meet business requirements.
- Maintain and improve data pipelines, ensuring performance, reliability, and data quality.
- Operate core data & analytics projects, providing second‑line support to operational teams.
- Lead technological innovation, contributing to the data & analytics roadmap.
Key Responsibilities:
- Build and optimize ELT processes using Matillion, Azure Data Factory, and Airflow.
- Integrate and orchestrate data from multiple sources into Snowflake, Denodo, and other data platforms.
- Develop and maintain data models, catalogs, and documentation for accessibility.
- Implement monitoring and alerting for pipeline health and data quality.
- Deploy and manage solutions on Azure and Google Cloud environments.
- Create CI/CD pipelines (Git/GitLab, Jenkins, Docker, Kubernetes, Ansible).
- Produce dashboards and reports using Power BI (MS Fabric) and Tableau.
- Collaborate with business stakeholders to translate functional requirements into technical specifications.
- Coordinate and evaluate external vendors or contractors.
- Conduct continuous learning and stay current on emerging data tools and best practices.
Required Skills:
- Programming: Python, Java, Shell scripting.
- Databases & Platforms: Snowflake (expert level), Denodo (data virtualization).
- ELT/ETL & Orchestration: Matillion, Azure Data Factory, Airflow.
- Analytics & Visualization: Power BI (MS Fabric), Tableau, Alteryx.
- Cloud & DevOps: Azure, Google Cloud, Jenkins, Git/GitLab, CI/CD, Docker, Kubernetes, Ansible.
- Strong analytical, problem‑solving, and project management abilities.
- Excellent English communication skills (reading, writing, speaking).
- Ability to work independently, manage multiple projects, and lead technical discussions.
Required Experience, Education & Certifications:
- Minimum of 5 years of experience in data engineering, data architecture, or a comparable role.
- Degree at Bac+5 level (Master’s or equivalent) in Computer Science, Data Engineering, Information Systems, or a related field.
- Certifications in cloud platforms (Azure, GCP) or data engineering tools are advantageous.