
Lighthouse

www.mylighthouse.com

3 Jobs

847 Employees

About the Company

Lighthouse is the leading commercial platform for the travel & hospitality industry. We transform complexity into confidence by providing actionable market insights, business intelligence, and pricing tools that maximize revenue growth.

Trusted by over 70,000 hotels in 185 countries, and supported by 700+ employees, Lighthouse is the only solution that provides real-time hotel and short-term rental data in a single platform. We strive to deliver the best possible experience with unmatched customer service. We consider our clients as true partners — their success is our success.

How can Lighthouse help you?

Rate Insight
Track competitor pricing to maximize revenue

Market Insight
Accurately predict future market demand

Parity Insight
Identify and solve parity issues portfolio wide

Benchmark Insight
Effectively navigate competitive performance trends

Distribution Insight
Optimize distribution channel performance

Destination Insight
Enrich your strategy with short-term rental insights

Business Intelligence
Monitor, optimize, and report on your performance

Pricing Manager
Boost revenue with AI-driven room price recommendations

Channel Manager
AI-powered channel management that transforms pricing and distribution for independent hotels

Hotel Data Solutions
Bespoke data sets to elevate your commercial strategy

Short-Term Rental Data Solutions
Power your short-term rental strategy with custom insights

Revenue Management Services
Master commercial strategy with expert revenue support

Listed Jobs

Company Name
Lighthouse
Job Title
Junior Data Engineer
Job Description
**Job Title:** Junior Data Engineer

**Role Summary:** Support the stability and reliability of data integrations by maintaining existing pipelines, debugging issues, and collaborating with cross-functional teams to ensure high-quality data flow.

**Expectations:**
- Deliver stable integrations and data quality.
- Resolve technical problems promptly and communicate solutions.
- Contribute to process, tooling, and documentation improvements.

**Key Responsibilities:**
- Maintain and enhance existing data integrations to ensure reliability and data quality.
- Investigate, debug, and resolve technical issues, performing root-cause analysis.
- Communicate clearly with internal stakeholders (e.g., Customer Care) about integration status and resolutions.
- Collaborate with the data engineering team on pipeline development, optimization, and scalability.
- Identify and implement improvements in processes, tooling, and documentation to increase efficiency.

**Required Skills:**
- Proficient in Python and SQL.
- Strong debugging and problem-solving skills in data pipelines.
- Knowledge of cloud data services (Google Cloud Pub/Sub, BigQuery, Bigtable, Dataflow) preferred.
- Familiarity with Kubernetes and container orchestration.
- Excellent written and verbal communication in English.
- Collaborative, ownership-minded, with meticulous attention to detail.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
Ghent, Belgium
Hybrid
Junior
04-02-2026
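The posting above centers on keeping integrations healthy through Python-based data-quality checks. As a hedged illustration only (the field names, validation rules, and helper functions below are hypothetical assumptions, not Lighthouse's actual schema or code), a minimal record-validation pass of the kind this role describes might look like:

```python
# Hypothetical sketch of an integration data-quality check.
# All field names and rules are illustrative assumptions.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems found in one record."""
    problems = []
    if not record.get("hotel_id"):
        problems.append("missing hotel_id")
    rate = record.get("rate")
    if not isinstance(rate, (int, float)) or rate < 0:
        problems.append("invalid rate")
    if record.get("currency") not in {"EUR", "USD", "GBP"}:
        problems.append("unknown currency")
    return problems

def split_batch(records: list[dict]):
    """Separate clean records from rejects, keeping reasons for triage."""
    clean, rejects = [], []
    for record in records:
        problems = validate_record(record)
        if problems:
            rejects.append((record, problems))
        else:
            clean.append(record)
    return clean, rejects

batch = [
    {"hotel_id": "H1", "rate": 129.0, "currency": "EUR"},
    {"hotel_id": "", "rate": -5, "currency": "XXX"},
]
clean, rejects = split_batch(batch)
print(len(clean), len(rejects))  # 1 1
```

Keeping the rejection reasons alongside the rejected records is what makes the root-cause analysis and stakeholder communication mentioned in the posting tractable.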
Company Name
Lighthouse
Job Title
Data Engineer
Job Description
**Job Title:** Data Engineer

**Role Summary:** Design, build, deploy, and maintain large-scale data pipelines ingesting ~100 TB/day from diverse sources (APIs, webhooks, SFTP, cloud storage). Architect modern, scalable, and reliable data architecture for the hospitality domain, ensuring high-fidelity data products and operational excellence.

**Expectations:**
- Deliver end-to-end pipeline solutions for a 3-trillion-record hotel dataset.
- Transition legacy processes to modern, scalable architectures.
- Apply AI, automation, and observability to improve pipeline reliability and speed.
- Collaborate cross-functionally with product, engineering, support, business, and external partners.

**Key Responsibilities:**
1. **Pipeline Engineering:**
   - Design and implement scalable ingestion, transformation, and storage pipelines.
   - Integrate polyglot data sources (APIs, webhooks, SFTP, cloud storage).
2. **Domain Modeling & Transformation:**
   - Build complex domain models to unify heterogeneous data into high-quality data products.
3. **Architectural Improvements:**
   - Modernize legacy processes, adopt cloud-native data services, and optimize cost and throughput.
4. **Operational Excellence:**
   - Develop advanced observability, automated testing, and self-healing mechanisms.
   - Conduct deep-dive root-cause analysis of production incidents.
5. **Team Collaboration & Leadership:**
   - Partner with product, engineering, support, business, and external stakeholders.
   - Mentor junior engineers; influence engineering velocity and quality.

**Required Skills:**
- **Programming:** Python (large-scale data processing).
- **Data Streaming & Messaging:** Kafka, Google Cloud Pub/Sub.
- **Data Warehousing & Cloud Databases:** BigQuery, Snowflake, Databricks, Spanner.
- **Cloud Platforms:** Proficient in GCP, AWS, or Azure; experience with Kubernetes.
- **Data Architecture:** Design, deployment, testing, and monitoring of complex pipelines.
- **Observability & Automation:** Logging, metrics, automated testing, self-healing.
- **AI & Automation:** Use of AI-assisted tools for code quality and debugging.
- **Communication:** Fluent English, stakeholder management, technical storytelling.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- Certifications in cloud platforms (e.g., GCP Professional Data Engineer, AWS Big Data Specialty) preferred but not mandatory.
Ghent, Belgium
Hybrid
06-03-2026
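This role involves streaming ingestion over Kafka and Pub/Sub, where messages are typically delivered at-least-once and replays must be handled downstream. As a hedged, self-contained sketch (no real broker involved; event names and the `event_id` key are illustrative assumptions, not Lighthouse's data model), a deduplicating transform stage of that kind could look like:

```python
# Hypothetical sketch: dropping replayed messages in an at-least-once stream.
# Names are illustrative assumptions; no Kafka/Pub/Sub client is used here.
from typing import Iterable, Iterator

def dedupe_by_key(events: Iterable[dict], key: str = "event_id") -> Iterator[dict]:
    """Yield each event once, skipping duplicate deliveries by id."""
    seen: set = set()
    for event in events:
        event_id = event.get(key)
        if event_id in seen:
            continue  # replayed delivery; drop it
        seen.add(event_id)
        yield event

stream = [
    {"event_id": "a", "rate": 100},
    {"event_id": "b", "rate": 110},
    {"event_id": "a", "rate": 100},  # replayed message
]
unique = list(dedupe_by_key(stream))
print([e["event_id"] for e in unique])  # ['a', 'b']
```

In production such state would live in an external store rather than in-process memory, but the idempotency principle is the same one the posting's "operational excellence" bullet points depend on.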
Company Name
Lighthouse
Job Title
DevOps Engineer
Job Description
**Job Title:** DevOps Engineer

**Role Summary:** Design, automate, and maintain scalable cloud infrastructure and CI/CD pipelines for a data-centric commercial platform. Work cross-functionally with product, integration, and data teams to ensure reliable, secure, and performant deployments at scale.

**Expectations:**
- Deliver end-to-end GitOps solutions that reduce deployment errors and increase engineer productivity.
- Own cloud environments, scaling, and disaster-recovery strategies across multiple regions.
- Continuously research, evaluate, and adopt emerging DevOps technologies to improve platform performance.
- Enforce security best practices across all infrastructure and deployments.

**Key Responsibilities:**
1. Automate deployment, operation, and monitoring of applications using GitOps principles (ArgoCD, Terraform, cdk8s).
2. Optimize CI/CD pipelines to support 150+ engineers, reducing cycle time and improving code quality.
3. Manage cloud infrastructure (GCP, Azure, or AWS), including provisioning, scaling, and cost optimization.
4. Design and implement disaster-recovery and high-availability strategies.
5. Build and maintain the observability stack (Prometheus, Grafana, Mimir), alerting, and incident-response workflows.
6. Collaborate with security teams to ensure compliance with security policies and standards.
7. Drive technology evaluation and adoption (Kubernetes, Istio, Helm, FoundationDB, AI/LLM tooling such as Claude and Gemini).
8. Coach and support engineering peers on DevOps best practices and tooling.

**Required Skills:**
- Proven DevOps experience with end-to-end pipeline ownership.
- Strong proficiency in Kubernetes and its ecosystem (Istio, Helm).
- Expertise in IaC tools (Terraform, cdk8s).
- Experience with GitOps tools (ArgoCD, Flux).
- Competence in GCP and familiarity with other cloud platforms (Azure, AWS).
- Knowledge of monitoring/observability tools (Prometheus, Grafana).
- Programming/scripting: Python, Go, TypeScript.
- Familiarity with database technologies (FoundationDB).
- Experience with or interest in AI/LLM integration for development workflows (Claude, Gemini).
- Excellent written and verbal communication in English.
- Proactive, quality-focused mindset with strong problem-solving skills.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Information Technology, or a related field, *or* equivalent practical experience.
- Certifications in cloud (e.g., GCP Associate Cloud Engineer, AWS Certified Solutions Architect) and/or Kubernetes (CKAD) are desirable.
Ghent, Belgium
Hybrid
16-03-2026
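Tools such as cdk8s, named in the posting above, generate Kubernetes manifests from ordinary code instead of hand-written YAML. As a hedged sketch of that idea only (plain Python dicts, not the real cdk8s API; the app name and image are made up), programmatic manifest generation can be reduced to:

```python
# Hypothetical sketch of manifests-as-code, in the spirit of cdk8s/IaC.
# Plain dicts only; no cluster, kubectl, or cdk8s library is involved.
import json

def deployment_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment object as a dict."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

# Illustrative names; not a real service or registry.
manifest = deployment_manifest("rate-api", "registry.example.com/rate-api:1.0")
print(manifest["metadata"]["name"])  # rate-api
print(json.dumps(manifest, indent=2)[:60])
```

Because the selector labels and pod-template labels come from one `labels` dict, they cannot drift apart, which is a common failure mode of hand-edited YAML and one reason GitOps stacks favor generated manifests.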