
evoteo

www.evoteo.fr

4 Jobs

20 Employees

About the Company

evoteo is a company specialised in data engineering, data governance, and artificial intelligence. We help organisations structure, exploit, and derive value from their data by building high-performance, scalable architectures.

In the age of artificial intelligence, data is more than ever a strategic asset. At evoteo, we combine expertise in data engineering, data governance, and AI modelling to deliver innovative, secure solutions to our clients. We work across the entire data value chain, from collection and integration to advanced analytics and AI-based predictive models.

Our teams of data engineers, data stewards, data scientists, and specialised project managers support you in optimising your data & AI strategy while ensuring compliance with regulations (GDPR, ISO 27001). With evoteo, turn your data into an engine of growth and innovation.

evoteo covers every aspect of data management, supporting our clients from the emergence of new ideas through to production. Our experts assist you at every stage: design, architecture, steering, deployment, optimisation, and analysis.

To contact us: contact@evoteo.com

Listed Jobs

Company Name
evoteo
Job Title
Experienced DevOps Engineer (F/M)
Job Description
Job title: Senior DevOps Engineer (F/M)

Role Summary: Manage the deployment, monitoring, and continuous improvement of the infrastructure of a strategic Business Intelligence application in a dynamic environment. Collaborate with developers, BI teams, and Product Owners to ensure stable, high-performance production.

Expectations: Minimum 3 years of DevOps experience, strong scripting (Python), containerization, knowledge of distributed architectures, and proficiency in monitoring, metrics, and security (TLS). Experienced with Agile methodologies and testing tools.

Key Responsibilities:
- Maintain production applications and environments.
- Continuously improve DevOps infrastructure and processes.
- Build and maintain CI/CD pipelines using GitLab.
- Monitor production quality with Grafana and other tools.
- Manage IaC with Terraform for automated deployments.
- Optimize application metrics and supervision dashboards.
- Estimate task effort, analyze impacts, and report alerts.
- Write and update technical documentation.
- Conduct performance analysis and optimization scenarios.
- Facilitate technical meetings and collaborate on security and operations issues.

Required Skills:
- Mandatory: Ansible, GitLab, Terraform.
- Proficient: Grafana, Kafka, Jenkins, Vault.
- Desirable: Spring Boot, Java, Informatica, SQL.
- Scripting: Python.
- Containerization: Docker, Kubernetes.
- Monitoring & metrics: Prometheus, Grafana, alerting.
- Security: TLS, secrets management (Vault), secure deployments.
- Agile knowledge: Scrum, Kanban, testing frameworks.

Required Education & Certifications:
- Bachelor's (BAC+3) minimum; Master's (BAC+5), DEA, or equivalent.
- 3+ years of professional DevOps experience (excluding internships and apprenticeships).
Lyon, France
Hybrid
Junior
04-11-2025
Company Name
evoteo
Job Title
Data Project Manager (M/F)
Job Description
Job title: Data Project Manager (M/F)

Role Summary: Lead end-to-end data and billing process improvement initiatives. Align sales, middle office, IT, and finance teams to deliver missing invoices, resolve rejections, and eliminate inconsistencies. Drive project scope, schedule, budget, and stakeholder communication while applying agile best practices.

Expectations:
- Deliver the defined MVP and long-term roadmap within agreed time, cost, and quality targets.
- Ensure clear, actionable documentation and regular status updates to all stakeholders.
- Maintain high compliance with technical and functional specifications, ensuring deliverables meet KPI and risk thresholds.

Key Responsibilities:
- Identify and map required data treatments across cross-functional services.
- Conduct workshops on business scope, ROI, budget, and architecture approval.
- Translate business needs into epics, features, and user stories; prioritize the release backlog.
- Define project resources, timeline, and budget trajectory.
- Establish and monitor KPIs (case volume, recurrence, fix effectiveness).
- Coordinate with IT teams to specify system requirements and oversee development progress.
- Validate deliverables against commitments, working with delivery managers and QA.
- Manage risks, escalating issues and recommending mitigation plans.
- Communicate project status, risks, and changes to all stakeholders.
- Organize and support business acceptance testing.
- Facilitate change adoption within business units.

Required Skills:
- Minimum 4 years of senior project management experience on data initiatives (excluding internships or apprenticeships).
- Strong knowledge of agile methodologies (Scrum/Kanban) and related tooling (Jira, Confluence).
- Experience in data reconciliation, invoice management, and financial processes.
- Excellent stakeholder management, facilitation, and documentation skills.
- Ability to translate business requirements into technical specifications and user stories.
- Proficient in KPI definition, reporting, and continuous improvement practices.
- Risk and change management, with clear communication across technical and non-technical audiences.

Required Education & Certifications:
- Master's level (BAC+5, e.g., Master 2, DESS, DEA) or equivalent advanced degree.
- Project management certification preferred (PMP, PRINCE2, or equivalent).
- Data governance or analytics certification is a plus.
Lyon, France
Hybrid
Junior
04-11-2025
Company Name
evoteo
Job Title
Lead AI / Lead Generative AI (M/F)
Job Description
Job title: Lead AI / Lead Generative AI (M/F)

Role Summary: Strategic technical lead for end-to-end AI solution delivery: architecture, development, industrialisation, and mentorship across data, ML, and product teams.

Expectations:
- Drive the AI strategy and deliver high-quality, scalable generative models.
- Champion best practices in ethics, governance, and explainability.
- Operate autonomously while coordinating cross-functional stakeholders.

Key Responsibilities:
- Analyse functional and technical needs; document requirements and architecture.
- Design, develop, and fine-tune generative models (LLM, diffusion, multimodal).
- Build end-to-end pipelines: training, validation, monitoring, deployment.
- Develop APIs and microservices for text, image, audio, and video.
- Ensure model quality, robustness, security, latency, and cost efficiency.
- Implement MLOps governance, CI/CD, and scalable infrastructure.
- Mentor Data, AI, and Dev teams; conduct code reviews and agile ceremonies.
- Collaborate with Product, DevOps, and business leaders on roll-outs.

Required Skills:
- Programming: Python (mandatory); Rust/C++/Go a plus.
- ML frameworks: PyTorch, TensorFlow, JAX.
- Generative AI: GPT, LLaMA, Mistral, Claude, Gemini, diffusion models.
- Tooling: LangChain, LlamaIndex, Hugging Face Transformers.
- Application: FastAPI, Gradio, Streamlit.
- Vector stores: FAISS, Pinecone, Weaviate, Chroma.
- Environments: Docker, Kubernetes, CI/CD, MLOps best practices.
- Methodologies: Agile (Scrum/Kanban).
- Soft skills: clear communication, teaching, autonomy, organization, cross-functional collaboration.

Required Education & Certifications:
- Master's (BAC+5) or equivalent in Computer Science, AI, Applied Mathematics, Data Science, or a related field.
- Minimum 5 years of AI/ML experience, including significant generative AI work.
- Prior technical lead, mentoring, or supervisory experience preferred.
Lyon, France
Hybrid
Senior
05-12-2025
Company Name
evoteo
Job Title
Data Engineer (F/M)
Job Description
Job title: Data Engineer (F/M)

Role Summary: Design, develop, and industrialize large-scale data processing solutions. Focus on Hadoop, Spark, and Scala, with a progressive transition to cloud platforms (GCP, Azure).

Expectations:
- Deliver scalable, performant, and robust Big Data pipelines.
- Produce complete technical documentation and automated unit tests.
- Collaborate in Agile Scrum teams and adhere to CI/CD best practices.

Key Responsibilities:
- Build and refine distributed Big Data workflows (Spark, Hive, Oozie).
- Integrate new features into existing processing chains.
- Optimize performance and increase the resilience of data pipelines.
- Maintain and evolve solutions in production environments.
- Write and update detailed technical design, deployment, and operational documentation.
- Create automated unit tests and participate in code reviews.
- Submit clear progress and activity reports.
- Orchestrate executions using tools such as Oozie and Spark.

Required Skills:
- Programming: Scala (expert), SQL (proficient), Java (solid).
- Version control: Git (proficient).
- Big Data technologies: Spark, Hadoop, Hive (advanced to expert), Oozie (proficient).
- Messaging & search: Kafka, ElasticSearch, Kibana.
- Operating systems & containers: Linux, Docker.
- Cloud: GCP (intermediate); Azure (desired).
- CI/CD: GitLab CI/CD.
- Agile/Scrum methodology.
- Languages: French (native), technical English (read/write).

Required Education & Certifications:
- Master's degree (BAC+5) in Computer Science, Data Engineering, or equivalent.
- Minimum 5 years of professional experience in Big Data or Data Engineering environments.
- Demonstrated autonomy, rigor, and initiative in complex data projects.
- Commitment to quality, documentation standards, security, and confidentiality practices.
Lyon, France
On site
Mid level
16-01-2026