Loop Earplugs

Senior Data Engineer I

Hybrid

Antwerp, Belgium

Senior

Full Time

18-01-2026


Skills

Communication Leadership Python SQL Data Engineering GitHub CI/CD DevOps Docker Kubernetes Monitoring Decision-making Networking Azure Analytics GCP Snowflake Terraform Grafana CloudOps Infrastructure as Code GitHub Actions

Job Specifications

We are Loop

Loop Earplugs is reinventing how people experience the world by enabling them to live life at their own volume. 2024 was a pivotal year at Loop – we hit 14,000,000 earplugs sold, were named ‘The Best Earplugs for Concerts’ by The New York Times, and expanded our product offering to a complete range of solutions for day and night. Oh – and not to mention making history as the first company to win Deloitte’s Fast 50 Belgium twice.

But this is just the beginning. As we continue to develop disruptive products, we’re constantly on the lookout for the next big idea – and building a team who can make it happen.

Being a Looper means making a difference. Whatever your role, this is your stage to make a real impact and bring your ideas to life. Better yet, you’ll join a fast-growing, global team of 250+ people who are passionate, down-to-earth and eager to tackle challenges together.

5+ yrs data engineering experience | Hybrid | Antwerp or Amsterdam

In this role, you’ll work with a state-of-the-art data stack (DBT, dltHub, Jinja, Snowflake, and Dagster) and extend your impact into cloud operations across GCP/Azure. You’ll take direct ownership of building, deploying, and optimizing data pipelines and cloud infrastructure, ensuring high reliability, scalability, and security. You’ll collaborate closely with Data Engineers, AI Engineers, and Data Analysts to design robust, cloud-native data solutions that drive actionable insights and empower the business.
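To make that stack concrete, here is a minimal sketch of a dltHub pipeline loading into Snowflake. The source data, pipeline name, and dataset name are illustrative assumptions rather than Loop’s actual setup, and Snowflake credentials would live in dlt’s secrets configuration rather than in code:

import dlt

# Hypothetical source: in practice this would pull from an API or operational database.
@dlt.resource(table_name="orders", write_disposition="append")
def orders():
    yield [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": 13.5}]

# Pipeline targeting Snowflake; connection details come from dlt's config/secrets files.
pipeline = dlt.pipeline(
    pipeline_name="orders_pipeline",
    destination="snowflake",
    dataset_name="raw",
)

if __name__ == "__main__":
    load_info = pipeline.run(orders())
    print(load_info)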

Drive scalable cloud-native data infrastructure and empower decision-making

As a Senior Data Engineer I, you’ll ensure data and infrastructure flow seamlessly across GCP/Azure environments. You’ll automate, monitor, and optimize cloud-based data services—enabling the Analytics team and AI team to make confident, data-driven decisions. You’ll take ownership of both data pipelines and their cloud foundations, ensuring cost efficiency, resilience, and compliance as our operations scale rapidly across multiple regions. As a senior technical contributor, you’ll influence architectural decisions, define engineering standards, and mentor peers while fostering a strong culture of knowledge sharing within the data and AI community at Loop. Your work will ensure that our data platform remains secure, high-performing, and future-proof—a critical enabler of business success.

Your key responsibilities:

Design, develop, and maintain ETL/ELT pipelines using Python; experience with dltHub is a plus.
Optimize data storage and performance within Snowflake.
Implement workflow orchestration with Dagster for reliable data processing (a minimal sketch follows this list).
Automate infrastructure and deployments, applying IaC and DevOps best practices.
Monitor, secure, and optimize cloud environments, ensuring uptime, cost efficiency, and compliance.
Collaborate with AI engineers, data analysts and business stakeholders to align data solutions with strategic goals, ensuring high data quality and accessibility.
Establish engineering standards, document best practices, and mentor junior engineers.
Contribute to knowledge-sharing initiatives that elevate data literacy and technical capability across teams.
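Building on the hypothetical pipeline above, a Dagster asset wrapping that dlt run might look like the sketch below; the asset, job, and schedule names and the cron cadence are assumptions for illustration, not Loop’s actual configuration:

import dlt
from dagster import Definitions, ScheduleDefinition, asset, define_asset_job

@asset
def raw_orders() -> None:
    # Re-run the hypothetical dlt pipeline from the earlier sketch as a Dagster asset.
    pipeline = dlt.pipeline(
        pipeline_name="orders_pipeline",
        destination="snowflake",
        dataset_name="raw",
    )
    pipeline.run([{"order_id": 1, "amount": 42.0}], table_name="orders")

# Materialize the asset nightly; the schedule is an assumed cadence for illustration.
nightly_ingest = define_asset_job("nightly_ingest", selection="raw_orders")

defs = Definitions(
    assets=[raw_orders],
    schedules=[ScheduleDefinition(job=nightly_ingest, cron_schedule="0 2 * * *")],
)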

How you’ll succeed:

You combine deep data engineering expertise with a cloud operations mindset. You’re passionate about building robust systems that scale, perform, and stay resilient. You enjoy collaborating with multidisciplinary teams and have a proactive approach to automation and optimization. Your curiosity drives innovation, your mentorship elevates peers, and your technical leadership ensures that Loop’s data foundation is strong, flexible, and future-ready.

These key competences will set you up for success:

5+ years of experience building secure and scalable cloud data infrastructure; a degree in Computer Science or a related field is a plus.
Proficiency in Python and SQL for data manipulation, automation, and orchestration (see the sketch after this list).
Hands-on experience with Snowflake for data modeling, performance tuning, and integration with cloud ecosystems.
Strong CloudOps expertise in GCP and/or Azure, including compute, storage, IAM, and networking.
Proven experience with DBT for data transformation, testing, and documentation.
Familiarity with workflow orchestration tools such as Dagster.
Experience implementing Infrastructure as Code (IaC) using Terraform.
Experience with DevOps practices, including CI/CD and container tooling such as Docker, Kubernetes, and GitHub Actions.
Familiarity with implementing monitoring and observability tools (e.g., Grafana, ELK).
Strong communication and collaboration skills, working effectively with cross-functional and non-technical stakeholders.
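As one small illustration of the Python-and-SQL competence above, querying Snowflake with the official Python connector might look like the following sketch; the account, credentials, table, and query are all placeholders:

import snowflake.connector

# All connection parameters are placeholders; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.eu-west-1",
    user="DATA_ENGINEER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Parameterized query: the connector binds pyformat-style parameters safely.
    cur.execute(
        "SELECT order_id, amount FROM orders WHERE amount > %(min_amount)s",
        {"min_amount": 10},
    )
    for order_id, amount in cur.fetchall():
        print(order_id, amount)
finally:
    conn.close()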

Job evolution

Within your first month, you’ll immerse yourself in Loop’s data and cloud ecosystem—learning how analytics, AI, and business teams interact with our data platform. You’ll help refine development practices, strengthen observability, and contribute to establishing standards for data quality, orchestration, and infrastructure management.

About the Company

Our story began when Loop's founders, Maarten and Dimitri, had a common problem: ringing ears after a night out. They discovered that they were not alone, as 1 in 4 adults suffer from hearing damage. But why weren't they wearing earplugs? This question sparked the inception of Loop. At Loop, we are dedicated to revolutionizing the way earplugs look, feel, and sound. Whether you need hearing protection or simply want to ensure comfort for your ears, Loop allows you to live life at your own volume. Based in Antwerp with offi...