LACO

www.laco.be

4 Jobs

243 Employees

About the Company

LACO is a Belgian-based business and data intelligence company with over 35 years of experience. Back in 1994, we were the first SAS partner in Belgium, and today we are still the largest and most experienced partner. Since 2006, we have successfully expanded with a second entity focused on Microsoft BI. Our third practice specializes in data governance and advanced analytics, also covering best-of-breed technologies such as R, Python, and Hadoop. Today, our team of over 110 consultants is dedicated to helping businesses in a wide variety of industries leverage the untapped power of data and navigate the digital realm.

LACO, the reliable data partner in your digital transformation.

Listed Jobs

Company Name
LACO
Job Title
Freelance Data Quality & Reporting Specialist
Job Description
**Job Title**
Freelance Data Quality & Reporting Specialist

**Role Summary**
Responsible for designing, maintaining, and enhancing Power BI data quality reports and data models. Collaborates closely with data domain owners, stewards, engineers, and governance teams to deliver actionable insights, operationalize data quality frameworks, and continuously improve data governance practices.

**Expectations**
* 4-7 years of experience in data analysis and report creation
* Proven ability to translate data insights into business actions
* Strong analytical, problem-solving, and communication skills
* Fluency in English

**Key Responsibilities**
1. Own and evolve the Power BI data quality reporting suite.
2. Design consistent, reusable data models to support scalable reporting.
3. Work with Data Stewards and Domain Owners to define, validate, and interpret data quality metrics.
4. Operate and expand the Data Quality Framework, incorporating recurring anomalies and insights for governance improvement.
5. Profile datasets, identify quality issues, and recommend and implement solutions (a minimal profiling sketch follows this listing).
6. Document findings, feed insights into framework evolution, and produce visual reporting artifacts.

**Required Skills**
* Advanced SQL proficiency
* Expertise in Power BI development (reporting, dashboarding, DAX, Power Query)
* Solid data modeling knowledge and experience bridging business and IT contexts
* Experience collaborating with data governance teams, stewards, and technical experts
* Strong analytical, problem-solving, and business communication abilities

**Required Education & Certifications**
* Bachelor's degree in Computer Science, Information Systems, Statistics, or a related field (or equivalent practical experience)
* Relevant certifications (e.g., Microsoft Certified: Data Analyst Associate, Data Management) are advantageous but not mandatory.
Brussels, Belgium
Hybrid
Junior
04-11-2025
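As an illustration of the dataset profiling this posting describes, below is a minimal Python sketch of basic completeness, uniqueness, and validity checks. The posting's own tooling is SQL and Power BI (DAX, Power Query); pandas, the sample records, and the column names here are assumptions made purely for illustration, and in practice the resulting metrics would be written to a table that the Power BI data quality reports read.

```python
import pandas as pd

def profile_data_quality(df: pd.DataFrame, key_column: str, email_column: str) -> pd.DataFrame:
    """Compute simple per-column completeness and uniqueness, plus two example validity rules."""
    metrics = []
    for column in df.columns:
        series = df[column]
        metrics.append({
            "column": column,
            # Completeness: share of non-null values.
            "completeness": 1.0 - series.isna().mean(),
            # Uniqueness: distinct non-null values relative to row count.
            "uniqueness": series.nunique(dropna=True) / max(len(series), 1),
        })

    # Example validity rules (key uniqueness, basic e-mail shape); the column
    # names passed in are illustrative assumptions, not a real schema.
    duplicate_keys = int(df[key_column].duplicated().sum())
    invalid_emails = int((~df[email_column].fillna("").str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")).sum())
    print(f"duplicate keys: {duplicate_keys}, invalid e-mails: {invalid_emails}")
    return pd.DataFrame(metrics)

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, "not-an-email", "d@example.com"],
        "country": ["BE", "BE", None, "NL"],
    })
    print(profile_data_quality(sample, key_column="customer_id", email_column="email"))
```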
Company Name
LACO
Job Title
Freelance Data Engineer Fabric
Job Description
**Job Title**
Freelance Data Engineer Fabric

**Role Summary**
Freelance Senior Data Engineer responsible for end-to-end Microsoft Fabric implementation, including green-field setup, Medallion-architecture design, and high-performance pipeline development. Provides mentorship and knowledge transfer to the internal team while transitioning leadership to the client.

**Expectations**
- Own the full lifecycle of the Fabric deployment from conception to production.
- Ensure scalable, maintainable solutions incorporating best practices in data modeling and DevOps.
- Deliver effective training and coaching to elevate the client's internal data capabilities.
- Facilitate a smooth handover and the long-term sustainability of the data environment.

**Key Responsibilities**
1. Design and launch a production-ready Microsoft Fabric environment.
2. Architect a modern Medallion data model, applying industry-standard practices (see the sketch after this listing).
3. Build and optimize scalable data pipelines using Python notebooks and Fabric Data Engineering features.
4. Implement Azure DevOps pipelines for continuous integration/continuous delivery of data solutions.
5. Conduct workshops, coaching sessions, and hands-on knowledge transfer with the client's data team.
6. Liaise with stakeholders to coordinate handover activities and ensure a smooth transition of ownership.

**Required Skills**
- 5+ years of professional data engineering experience.
- Completed at least one end-to-end Microsoft Fabric project.
- Proficient in SQL, Python, and Azure DevOps.
- Strong data modeling expertise.
- Fluency in Dutch and French.

**Required Education & Certifications**
- No explicit education or certification prerequisites were listed in the posting.
Ghent, Belgium
Hybrid
Mid level
09-12-2025
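The posting above centres on a Medallion (bronze/silver/gold) architecture built with Python notebooks in Microsoft Fabric. The sketch below shows one bronze-to-gold pass in PySpark under stated assumptions: in a Fabric notebook the `spark` session and lakehouse tables are provided by the environment, and the table layout, column names, and in-memory sample data here are illustrative only.

```python
from pyspark.sql import SparkSession, functions as F

# A Fabric notebook already exposes `spark`; building a session here only keeps the sketch self-contained.
spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw ingested records, kept as-is (faked here with an in-memory DataFrame).
bronze = spark.createDataFrame(
    [("1001", "2024-01-05", "  Brussels ", 250.0),
     ("1001", "2024-01-05", "  Brussels ", 250.0),   # duplicate raw record
     ("1002", "2024-01-06", "Ghent", None)],
    ["order_id", "order_date", "city", "amount"],
)

# Silver: cleaned and conformed: deduplicate, trim text, type the date,
# and drop records that fail a basic quality rule.
silver = (
    bronze.dropDuplicates(["order_id", "order_date"])
          .withColumn("city", F.trim("city"))
          .withColumn("order_date", F.to_date("order_date"))
          .filter(F.col("amount").isNotNull())
)

# Gold: business-level aggregate ready for reporting.
gold = silver.groupBy("city").agg(F.sum("amount").alias("total_amount"))
gold.show()

# In a Fabric lakehouse each layer would typically be persisted as a Delta table, e.g.:
# gold.write.format("delta").mode("overwrite").saveAsTable("gold_city_sales")
```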
Company Name
LACO
Job Title
Data Scientist (Client)
Job Description
Job Title: Data Scientist (Client)

Role Summary:
Data Scientist responsible for developing and improving predictive models and AI-driven solutions within the Microsoft ecosystem. Acts as a bridge between technology and business, translating business questions into actionable insights, coaching colleagues, and enhancing data-science capabilities across projects.

Expectations:
• Minimum 3 years of experience in data science or related analytics roles.
• Bachelor's or Master's degree in Data Science, Computer Science, Mathematics, Statistics, or equivalent.
• Proficiency in Python for predictive modeling (regression, classification, time series) and familiarity with machine-learning frameworks.
• Hands-on experience with, or a strong interest in, Microsoft Fabric and Azure data & analytics services.
• Experience working in agile environments.
• Strong communication skills in English (written and spoken); fluency in Dutch and/or French is a plus.

Key Responsibilities:
- Design, build, and deploy predictive models and AI solutions using Python (a minimal baseline sketch follows this listing).
- Translate business requirements into data-science use cases and deliver actionable insights.
- Coach and mentor team members; share knowledge through knowledge-sharing sessions and internal initiatives.
- Collaborate closely with business stakeholders to understand challenges and objectives.
- Experiment with advanced analytics, machine-learning, and AI techniques to drive innovation.
- Explore and leverage Microsoft Fabric and related Azure data services.
- Communicate complex model outcomes clearly to non-technical audiences.
- Contribute to proposals, presentations, and internal best-practice documentation.

Required Skills:
- Python programming (pandas, NumPy, scikit-learn, etc.).
- Predictive modeling: regression, classification, time-series forecasting.
- Machine-learning frameworks (TensorFlow, PyTorch, ML.NET, etc.).
- Familiarity with Azure services (Data Factory, Synapse, Azure ML) and Microsoft Fabric.
- Knowledge of data-science tooling (Jupyter, VS Code, Git, CI/CD pipelines).
- Agile methodologies (Scrum, Kanban).
- Clear written and verbal communication; ability to explain technical concepts to business stakeholders.
- Coaching and mentoring mindset.

Required Education & Certifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Mathematics, Statistics, or a related field.
- Certifications related to Azure AI/ML or Microsoft Fabric are advantageous.
Brussels, Belgium
Hybrid
Junior
13-01-2026
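The posting above asks for predictive modeling in Python with scikit-learn. Below is a minimal baseline sketch of a classification workflow; the synthetic data and the logistic-regression choice are assumptions for illustration, not the client's actual use case or model.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a business dataset (e.g. churn); real features and
# labels would come from the client's data platform.
X, y = make_classification(n_samples=1_000, n_features=8, n_informative=5, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# A small, explainable baseline: feature scaling plus logistic regression.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

# Evaluate with a metric that is easy to explain to business stakeholders.
probabilities = model.predict_proba(X_test)[:, 1]
print(f"ROC AUC on held-out data: {roc_auc_score(y_test, probabilities):.3f}")
```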
Company Name
LACO
Job Title
Analytics Engineer
Job Description
Job Title: Analytics Engineer

Role Summary:
Develop reliable data products by translating business requirements into structured analytics solutions. Focus on data modeling, pipelines, visualizations, and collaboration with stakeholders to drive actionable insights.

Expectations:
Deliver robust data models and pipelines; create dashboards; uphold governance standards; act as a bridge between technical and non-technical teams.

Key Responsibilities:
- Translate business needs into data solutions and models.
- Design and maintain analytical data models (dimensional models, data marts); see the star-schema sketch after this listing.
- Develop and optimize ETL/ELT processes using SQL and Python.
- Produce dashboards and reports via BI tools (Power BI, Tableau).
- Collaborate with stakeholders to align solutions with business goals.
- Ensure governance, documentation, and quality of data assets.

Required Skills:
- Advanced SQL proficiency.
- Data modeling experience (dimensional modeling, star schemas).
- Python for data transformation.
- BI tool expertise (Power BI, Tableau).
- Cloud data platforms (AWS, GCP, Azure).
- Business requirement translation and problem-solving.
- Clear communication for cross-team collaboration.

Required Education & Certifications:
Bachelor's or Master's in Computer Science, Data Analytics, or a related field.
Flemish region, Belgium
Hybrid
14-01-2026
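The posting above mentions dimensional modeling (star schemas) and ETL/ELT in SQL and Python. Below is a minimal pandas sketch of splitting a flat extract into a dimension and a fact table; in practice this would typically be done in SQL or a transformation framework on the cloud platform in use, and the column names and sample rows are illustrative assumptions.

```python
import pandas as pd

# Flat source extract (illustrative columns, not an actual dataset).
orders = pd.DataFrame({
    "order_id":   [1, 2, 3],
    "order_date": ["2024-03-01", "2024-03-01", "2024-03-02"],
    "customer":   ["Acme", "Globex", "Acme"],
    "country":    ["BE", "NL", "BE"],
    "amount":     [120.0, 80.0, 200.0],
})

# Dimension: one row per customer, with a surrogate key derived from the row index.
dim_customer = (
    orders[["customer", "country"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("customer_key")
    .reset_index()
)

# Fact: measures plus a foreign key to the customer dimension (star schema).
fact_sales = (
    orders.merge(dim_customer, on=["customer", "country"], how="left")
          [["order_id", "order_date", "customer_key", "amount"]]
)

print(dim_customer)
print(fact_sales)
```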