- Company Name: KICKLOX
- Job Title: Data Engineer (M/F)
- Job Description:
**Job Title**
Data Engineer
**Role Summary**
Design, build, and maintain robust data architectures (data lakes, warehouses) and develop scalable data pipelines to collect, transform, and deliver high‑quality data for analytics and machine learning. Collaborate closely with data scientists and engineers to optimize performance, enforce data security, and ensure reliable data access.
**Expectations**
- 7+ years of professional experience in data engineering.
- Contract: 6‑month freelance engagement, full‑time.
- Daily rate: €425–€595, depending on seniority.
- Partially remote work possible.
**Key Responsibilities**
- Architect and implement data lakes, data warehouses, and related data infrastructure.
- Develop, deploy, and maintain end‑to‑end data pipelines (ingestion, ETL/ELT).
- Optimize pipeline performance, scalability, and cost efficiency.
- Collaborate with Data Scientists to deliver clean, well‑documented datasets.
- Implement data security and governance controls (access rights, encryption, audit).
- Monitor, troubleshoot, and resolve pipeline failures and performance issues.
- Produce clear documentation of data models, pipeline logic, and operational procedures.
**Required Skills**
- Strong background in Big Data technologies: Hadoop, Spark.
- Expertise with NoSQL databases (e.g., Cassandra, MongoDB).
- Proficiency in programming languages: Python, Java.
- Experience with ETL/ELT tools and data orchestration frameworks (e.g., Airflow, NiFi).
- Solid understanding of data modeling, schema design, and performance tuning.
- Ability to design secure, compliant data pipelines.
- Proven problem‑solving and analytical abilities.
- Excellent communication, teamwork, and documentation skills.
**Required Education & Certifications**
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Data Engineering, or related field (or equivalent practical experience).
- Certifications in Big Data or Cloud platforms (e.g., Hortonworks, Cloudera, AWS Big Data Specialty, GCP Data Engineer, Azure Data Engineer) are an advantage but not mandatory.