Viridien

www.viridiengroup.com

3 Jobs

694 Employees

About the Company

Viridien is an advanced technology, digital and Earth data company that pushes the boundaries of science for a more prosperous and sustainable future. With our ingenuity, drive, and deep curiosity, we discover new insights, innovations, and solutions that efficiently and responsibly resolve complex natural resource, digital, energy transition, and infrastructure challenges.

Building on our achievements and track record of innovation, Viridien continues to serve its clients in the energy industry with a unique portfolio of solutions across its core businesses of Geoscience, Earth Data and Sensing & Monitoring. As it accelerates its growth as an advanced technology company, Viridien has new offerings in the Low-Carbon markets of Minerals & Mining and Carbon Capture and Storage (CCS), and markets beyond energy in High-Performance Computing (HPC) and Infrastructure Monitoring.

Listed Jobs

Company Name
Viridien
Job Title
DevOps Engineer
Job Description
Job Title: DevOps Engineer

Role Summary:
Design, build, and maintain infrastructure and CI/CD pipelines for a data transformation and ML-driven platform. The role supports both on-premise HPC and public cloud (Azure) deployments, ensuring high scalability, cost efficiency, and security across Kubernetes clusters, and drives DevOps best practices, automation, and observability to enable reliable software delivery.

Expectations:
- Deliver robust, platform-agnostic infrastructure and deployment solutions.
- Collaborate with software, database, and cloud teams to resolve issues promptly.
- Continuously improve processes, tooling, and documentation.
- Maintain a strong security posture through image scanning and vulnerability remediation.
- Own end-to-end deployment pipelines and ensure adherence to performance, reliability, and cost targets.

Key Responsibilities:
- Develop a deep understanding of the data system stack and construct IaC assets (Terraform, Ansible).
- Build, support, and evolve CI/CD pipelines using GitOps (Argo CD), GitLab CI, or Jenkins.
- Deploy and manage application components in Kubernetes, including cluster and deployment strategy governance.
- Integrate with Azure DevOps for cloud-based deployments and collaborate with the cloud team.
- Interface with on-prem HPC, client IT, and cloud teams to diagnose and resolve deployment issues.
- Enhance observability tooling (Prometheus, Grafana, etc.) across environments.
- Promote and document DevOps best practices, auditing processes, and standard operating procedures.
- Coordinate with external suppliers and internal practice owners to drive cross-company improvements.
- Perform general development and maintenance tasks as required.

Required Skills:
- GitOps (Argo CD) and CI/CD pipeline development (GitLab CI, Jenkins).
- Infrastructure as Code: Terraform, Ansible.
- Containerization: Docker; vulnerability scanning, image size and build optimization.
- Kubernetes: management of multiple clusters, deployment strategies, Helm or Kustomize templating.
- Cloud project management (Azure, on-prem HPC).
- Observability tools (Prometheus, Grafana).
- Python support and packaging for service deployment.
- Shell scripting for automation.
- Linux administration (Debian/Alpine).

Required Education & Certifications:
- Bachelor-level education or equivalent experience in Computer Science, Software Engineering, or a related field.
- Relevant technical certifications: Certified Kubernetes Administrator (CKA).
- Azure foundational certification (AZ-900) or higher.
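The Helm/Kustomize templating named in the skills list amounts to substituting environment-specific values into a manifest skeleton before it is applied to a cluster. A minimal Python sketch of that idea, using only the standard library (all resource names, image references, and values here are hypothetical, not Viridien's actual configuration):

```python
from string import Template

# Skeleton of a Kubernetes Deployment manifest with placeholders,
# in the spirit of Helm/Kustomize templating. Illustrative only.
DEPLOYMENT_TEMPLATE = Template("""\
apiVersion: apps/v1
kind: Deployment
metadata:
  name: $name
spec:
  replicas: $replicas
  template:
    spec:
      containers:
      - name: $name
        image: $image
""")


def render_deployment(name: str, image: str, replicas: int = 1) -> str:
    """Substitute environment-specific values into the manifest skeleton."""
    return DEPLOYMENT_TEMPLATE.substitute(
        name=name, image=image, replicas=replicas
    )


if __name__ == "__main__":
    # Render a per-environment variant of the same manifest.
    print(render_deployment(
        "etl-api", "registry.example.com/etl-api:1.2.0", replicas=3
    ))
```

Real deployments would use Helm values files or Kustomize overlays rather than hand-rolled templating, but the substitution step they perform is essentially this one.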
Crawley, United Kingdom
Hybrid
06-01-2026
Company Name
Viridien
Job Title
Data Engineer
Job Description
Job Title: Data Engineer

Role Summary:
Design, build, and maintain robust, secure, and scalable data pipelines and integration solutions that connect internal data platforms to external client systems. Lead the development of modular transformation frameworks and support deployment, monitoring, and performance optimization across the data lifecycle.

Expectations:
- Deliver high-quality, production-ready data integrations and transformations on schedule.
- Collaborate with data scientists, analysts, and domain experts to align platform capabilities with business needs.
- Uphold best practices in code quality, security, and performance, continually reducing technical debt.

Key Responsibilities:
- Plan, develop, deploy, and maintain connectors and integrations between the data system and client systems (record, downstream).
- Contribute to architecture and infrastructure, including orchestration, processing logic, and component interactions.
- Build and maintain end-to-end, metadata-driven data pipelines with a focus on monitoring, access control, and maintainability.
- Create a modular framework that enables rapid addition of new transformation logic.
- Influence technology choices and produce clear architectural diagrams and documentation for technical and non-technical stakeholders.
- Partner with end-users to understand requirements and ensure data accessibility and reliability.
- Share best practices, mentor junior engineers, and align engineering efforts with growth objectives.

Required Skills:
- Proficiency in Python and SQL; ability to write secure, performant code and optimize queries.
- Experience with orchestration and ETL tools (e.g., Airflow).
- Strong RDBMS background (PostgreSQL, Oracle); knowledge of NoSQL (Neo4j, Elastic) and vector databases beneficial.
- Data architecture expertise: data modelling, 3NF, dimensional modelling, medallion architecture.
- Containerization (Docker), version control (Git/GitLab), and CI/CD fundamentals.
- Familiarity with API design (RESTful, GraphQL), networking, and security.
- Proven experience in geoscience, oil & gas, or mining data integration and transformation.
- Ability to communicate complex technical ideas to diverse audiences.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Software Engineering, or a related discipline.
- No mandatory certifications; cloud or data-engineering certifications (e.g., Azure Data Engineer Associate) are a plus.
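The "modular framework that enables rapid addition of new transformation logic" described above is commonly built as a registry of named transformations driven by pipeline metadata. A minimal Python sketch of that pattern (the step names and row shapes are invented for illustration, not part of any actual Viridien system):

```python
from typing import Callable

# Registry mapping metadata step names to transformation functions.
TRANSFORMS: dict[str, Callable[[list[dict]], list[dict]]] = {}


def transform(name: str):
    """Decorator registering a transformation under a metadata key."""
    def register(fn):
        TRANSFORMS[name] = fn
        return fn
    return register


@transform("drop_nulls")
def drop_nulls(rows: list[dict]) -> list[dict]:
    """Remove rows containing any null value."""
    return [r for r in rows if all(v is not None for v in r.values())]


@transform("uppercase_ids")
def uppercase_ids(rows: list[dict]) -> list[dict]:
    """Normalize the 'id' column to upper case."""
    return [{**r, "id": r["id"].upper()} for r in rows]


def run_pipeline(rows: list[dict], steps: list[str]) -> list[dict]:
    """Apply the transformations named in the pipeline metadata, in order."""
    for step in steps:
        rows = TRANSFORMS[step](rows)
    return rows
```

Adding a new transformation then means writing one decorated function; the pipeline definition stays a plain list of step names that can live in configuration or a metadata store.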
Crawley, United Kingdom
Remote
21-02-2026
Company Name
Viridien
Job Title
Analytics Engineer Intern
Job Description
**Job Title:** Analytics Engineer Intern

**Role Summary:**
Support the enterprise analytics platform for HR, Finance, Supply Chain, and HSE by designing semantic models in Microsoft Fabric, building Power BI dashboards, and delivering data-driven insights. Work within structured project frameworks, collaborate with business stakeholders, and contribute to continuous improvement of data management processes.

**Expectations:**
- Demonstrate curiosity and a strong analytical mindset.
- Work autonomously and with rigor while collaborating effectively in a team.
- Handle incomplete or imperfect data and ensure data quality.
- Communicate insights clearly to non-technical stakeholders.

**Key Responsibilities:**
- Collect, organize, and structure field and bibliographic data.
- Build and maintain reliable data flows and pipelines.
- Analyze and process data using Microsoft Fabric.
- Design, develop, and optimize Power BI dashboards and interactive reports.
- Explore complex datasets to identify trends, opportunities, and actionable insights.
- Collaborate with business teams to define KPI requirements and deliver analyses.
- Monitor business performance and contribute to KPI definition.
- Support continuous improvement of data management and analytics processes.

**Required Skills:**
- Proficiency in Power BI (data modeling, Power Query, report development) and advanced DAX.
- Strong SQL skills and experience with relational databases.
- Ability to consolidate and harmonize multi-source data.
- Familiarity with data structuring, modeling, and data quality concepts.
- Strong analytical and problem-solving abilities; autonomous work style.

**Preferred Skills:**
- Experience with Snowflake or similar cloud analytics platforms.
- Basic PySpark programming skills.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Data Science, Data Analytics, Business Intelligence, Computer Science (data specialization), or Database Management.
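Consolidating multi-source data into a KPI, as this internship describes, typically reduces to joining tables from different business domains in SQL. A small sketch using an in-memory SQLite database as a stand-in for the analytics platform (table names, columns, and figures are all invented for illustration):

```python
import sqlite3

# Two hypothetical sources: headcount from HR and cost from Finance.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hr_headcount (dept TEXT, employees INTEGER);
CREATE TABLE finance_cost (dept TEXT, cost_musd REAL);
INSERT INTO hr_headcount VALUES ('Geoscience', 120), ('HPC', 45);
INSERT INTO finance_cost  VALUES ('Geoscience', 9.6), ('HPC', 5.4);
""")

# Join the sources on the shared key and derive a simple KPI:
# cost per employee, rounded for reporting.
kpi = conn.execute("""
    SELECT h.dept,
           ROUND(f.cost_musd / h.employees, 2) AS cost_per_employee_musd
    FROM hr_headcount h
    JOIN finance_cost f ON f.dept = h.dept
    ORDER BY h.dept
""").fetchall()

for dept, value in kpi:
    print(dept, value)
```

In a Power BI/Fabric workflow the same join and ratio would usually be expressed as a semantic-model relationship plus a DAX measure, but the underlying consolidation logic is identical.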
Massy, France
On site
Fresher
02-03-2026