- Company Name
- UPPLI
- Job Title
- Senior Data Engineer (M/F)
- Job Description
**Job title**
Senior Data Engineer (M/F)
**Role Summary**
Act as architect and delivery expert for end‑to‑end data solutions. Design, build, and maintain scalable data warehouses, lakes, and pipelines on public cloud platforms, ensuring high quality, security, and availability for enterprise clients.
**Expectations**
- Deliver production‑grade data platforms for multi‑tenant clients
- Champion best practices in architecture, performance tuning, and cost optimization
- Lead automation of ingestion, transformation, and monitoring processes
**Key Responsibilities**
- Design, develop, and optimize scalable data architectures (Data Warehouses, Data Lakes)
- Implement cloud data services (AWS, GCP, Azure) and manage data storage, compute, and networking resources
- Create and maintain ETL/ELT pipelines using Airflow, Prefect, Dagster, or equivalent
- Automate data ingestion, transformation, and catalogue updates with Docker, Kubernetes, and CI/CD pipelines
- Enforce data quality, governance, and security policies; implement monitoring and alerting
- Collaborate with Data Analysts, Data Scientists, and business stakeholders to define requirements and support data use cases
- Tune and scale query performance; apply partitioning, indexing, and cost‑control strategies
- Maintain technical documentation of architectures, data flows, and standards
**Required Skills**
- 3–5+ years as a Data Engineer in Big Data projects
- Python (mandatory) and advanced SQL; experience with Scala, Java, or Go preferred
- Deep knowledge of at least one public cloud (AWS, GCP, Azure) and its data services (S3, BigQuery, Redshift, Databricks, Snowflake)
- Experience with Hadoop, Spark, Kafka, or equivalent data processing frameworks
- Proficiency with Airflow, Prefect, or Dagster for workflow orchestration
- Strong background in SQL databases (PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra, Elasticsearch)
- Infrastructure‑as‑Code with Terraform or CloudFormation
- Containers and orchestration (Docker, Kubernetes) plus CI/CD (GitLab CI/CD, GitHub Actions, Jenkins)
- Excellent problem‑solving, communication, and autonomous teamwork skills
- Professional English (technical documentation and stakeholder interaction)
**Required Education & Certifications**
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Engineering, or related field
- Relevant certifications (e.g., AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, Azure Data Engineer Associate) are a plus but not mandatory