Job Specifications
Every day, we deliver innovative solutions that improve the lives of millions of people, connecting employees, companies, and merchants all around the world.
We know there are a hundred ways for you to grow. With us, you will expand your skills in a multicultural, challenging, and dynamic environment.
Dare to join our client and get ready to thrive in a global company that will offer you endless opportunities.
Our client is all about meritocracy. You come as you are, and you contribute. Indeed, our client's Group recognizes, recruits, and develops all talents and singularities.
We are committed to preventing all forms of discrimination and to providing all our candidates with equal opportunities regardless of gender, gender expression, disability, origin, religious belief, sexual orientation, or any other criterion.
YOU WILL VIBE WITH US
As part of the BeLux division of our company, you will work as a Data Engineer in our Tech Data Team.
The main scope of the Tech Data Team is to create data products on our Global Azure Data Platform, build data sets, provide strategic dashboards & reports in Power BI, and enable self-service reporting across the organization.
As a Data Engineer, you specialize in the design, implementation, and management of the systems, architectures, and tools needed to collect, store, transform, and distribute data. Your primary role is to develop and maintain the technical infrastructure that allows the company to organize and exploit its data efficiently.
The key stakeholders for this position are:
Internal: End-users & key users, product managers and executives, process owners/team leaders and managers, Agile Teams (Scrum Master and Product Owners).
External: HQ’s corporate IT, Remote Development centers, customers and partners.
Reporting line: you report directly to the IT Back-end & Data Product Owner; your N+2 is the CTO BeLux.
As a Data Engineer, you collect and organize data in an efficient, compliant & secure way by:
Developing processes for collecting, organizing, and storing data, thus creating essential data pipelines for the company's infrastructure.
Ensuring data quality and facilitating optimal access to data while complying with regulations, including the GDPR and the Data Protection Act.
Building and automating data pipelines, making raw data usable and optimizing the architecture for performance and efficiency.
Ensuring compliance with data governance and security standards.
Building and testing complex data products, selecting technologies for robust and sustainable solutions, and implementing data models across various domains.
Managing metadata: setting up a repository to keep information accurate and current.
Facilitating data access for users such as data scientists and data analysts, while also contributing to the preparation of data for data science models.
WE WILL VIBE WITH YOU
You hold a master’s degree in IT.
You have a proven track record of at least 2 years in the data world as a Data Analyst or Data Engineer.
You are fluent in English and French. Proficiency in Dutch is a plus.
Technical Environment - You have proficiency in:
Cloud Environment:
Use of AWS and primarily Microsoft Azure, Azure Databricks, and Azure Data Factory (bonus: Azure Synapse, Azure Storage, Azure Logic Apps).
Mastery of real-time data processing (data streaming) is a plus, with tools like Azure Event Hubs for efficient event-stream management and robust architecture design.
Programming Languages, Data Visualization, and Software Development
Proficiency in programming languages (Python, SQL, Java).
Mastery of frameworks for distributed data processing (Apache Spark, Apache Hadoop).
Proficiency in data visualization tools.
Use of version control systems such as Git.
Data Analysis
Use of data analysis languages for data access and manipulation.
Ability to perform statistical analyses to assess the quality of collected data.
Knowledge of the tools and methods required for DataOps.
Ability to verify the accuracy, clarity, and specificity of data before use.
Mastery of data security requirements and standards (GDPR, ethics).
Data Integration
Skills in designing and implementing data integration solutions, including ETL (Extract, Transform, Load).
Knowledge of ETL tools such as Azure Data Factory (ADF).
Infrastructure and Operations
Deep knowledge of data processing styles for building data pipelines.
Strong knowledge of CI/CD pipelines (Git, GitHub, MLflow, PowerShell).
Agile Methods in Business
Use of project management tools, like Azure DevOps.
Proficiency in collaboration tools like Git and in working with Microsoft Azure Repos.
Soft skills – you demonstrate:
Strong, assertive personality capable of leading through challenging situations.
Ability to analyze complex information, extract key ideas, and synthesize them for decision-making.
Excellent communication and collaboration skills, capable of engaging and influencing.