Cognite

hubs.ly

2 Jobs

760 Employees

About the Company

Cognite is a global industrial Software-as-a-Service (SaaS) leader, with an eye on the future and a drive to digitalize the industrial world.

We've created a new class of industrial software which allows asset-intensive industries to operate more sustainably, securely, and efficiently.

Our core software product is Cognite Data Fusion (CDF), designed to quickly contextualize OT/IT data to develop and scale company solutions. We use technology like hybrid AI, big data, machine learning, and 3D modelling to get there.

We serve oil and gas, power and utilities, renewable energy, manufacturing, and other heavy-asset industries. Our technology helps them operate sustainably and at scale through industry transitions, without sacrificing their bottom lines. We believe data must be made accessible, insightful, and open.

In other words, we help our customers make data do more, and in doing so pave the way for a full-scale digital transformation of heavy industry.

Founded in 2016, we now number 700 strong, including some of the best software developers, data scientists, designers, and 3D specialists in the field. Hailing from 50 nationalities, we're talented, curious, and fun to be around.

Get to know us better:
Life at Cognite: linkedin.com/company/cognitedata/life/
Make data do more: makedatadomore.cognite.com/
Twitter: @CogniteData
Facebook: @CogniteData
IG: @CogniteData
YouTube: /c/cognite

Listed Jobs

Company Name
Cognite
Job Title
Data Engineer
Job Description
Job Title: Data Engineer

Role Summary: Design, implement, and optimize scalable data engineering solutions using Cognite Data Fusion and industry-standard tools. Collaborate with cross-functional teams to enhance data pipelines and support industrial data operations.

Expectations:
- Demonstrate ownership and initiative in solving complex data challenges.
- Work collaboratively in cross-functional teams.
- Independently investigate and resolve technical issues.
- Maintain a DevOps mindset with proficiency in Git, CI/CD, and deployment environments.
- Foster a culture of knowledge sharing and continuous learning.

Key Responsibilities:
- Lead the design and implementation of scalable data engineering solutions.
- Manage data integrations, extractions, modeling, and analysis.
- Develop custom data models for discovery, mapping, and cleansing.
- Collaborate with data scientists, architects, and project managers on project deliveries.
- Perform code reviews and enforce best practices.
- Support customers and partners with data engineering tasks.
- Contribute to the development of official tools and SDKs.

Required Skills:
- 3-5+ years in data engineering.
- Proficiency in Python, SQL, and REST APIs.
- Experience with distributed computing (Kubernetes) and cloud platforms (GCP, Azure).

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science or a related field; equivalent relevant experience acceptable.
Lille, France
Remote
Junior
23-12-2025
Company Name
Cognite
Job Title
Senior Backend Software Engineer, Atlas AI
Job Description
Job Title: Senior Backend Software Engineer – Atlas AI

Role Summary: Design, build, and maintain high-performance, scalable backend services that power AI agents and industrial data workloads. Lead architectural decisions for microservices, APIs, and AI infrastructure, ensuring reliability, security, and adherence to strict SLOs in a multi-cloud SaaS environment.

Expectations:
- Deliver robust, performant, and secure backend components for large-scale industrial AI applications.
- Own the end-to-end service lifecycle: design, implementation, testing, deployment, monitoring, and on-call support.
- Mentor junior engineers and influence engineering culture across cross-functional teams.

Key Responsibilities:
- Develop and maintain REST/GraphQL APIs in Python to serve industrial time-series and relational data, ensuring high code quality and test coverage.
- Architect and deploy containerized microservices on Kubernetes (Azure, AWS, GCP) using Docker, Terraform, and CI/CD pipelines (Jenkins / GitHub Actions).
- Evaluate, benchmark, and integrate LLMs (OpenAI, Anthropic, LangChain) and generative AI pipelines (RAG, agentic workflows) into production systems.
- Design data schemas, indexing strategies, and access patterns for SQL, NoSQL, or graph databases (e.g., Neo4j) to support industrial protocols (OPC-UA, MQTT) and time-series workloads.
- Implement telemetry, automated testing, and observability to meet SaaS SLOs, including structured logging, metrics, and alerting.
- Participate in on-call rotations, root-cause analysis, and service availability improvement initiatives.

Required Skills:
- 5+ years of professional backend development experience; strong in the Python ecosystem (asyncio, FastAPI, Pydantic).
- Proficient with Kubernetes, Docker, Terraform, and multi-cloud orchestration.
- Proven ability to design scalable APIs and microservices interacting with complex data stores (SQL, NoSQL, graph).
- Experience building and deploying production-grade applications using LLMs and AI frameworks.
- Solid understanding of CI/CD, automated testing, and DevOps tooling (Jenkins, GitHub Actions).
- Excellent communication skills (English) to explain architectural trade-offs to technical and non-technical stakeholders.
- Preferred: familiarity with GraphQL/graph databases, industrial protocols, time-series databases, and AI/ML model plumbing.

Required Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
- Preferred certifications: AWS Certified Solutions Architect, Google Cloud Professional Architect, Azure Solutions Architect, or Certified Kubernetes Administrator.
Phoenix, United States
On site
Senior
14-03-2026