LINCOLN

www.lincoln.fr

3 Jobs

365 Employees

About the Company

LINCOLN is a consulting firm that helps companies derive value from their data. We combine technical and functional skills to address our clients' challenges. Our 30 years of experience allow us today to offer tailor-made solutions to more than 85 clients: consulting on architecture and data-governance issues, building data literacy, or simply providing expertise on a tool or a data-driven topic. With 48% growth and more than 650 projects delivered in 2023, we keep growing thanks to our 400 employees. Expert, passionate, and committed to client projects, our consultants are at the heart of our development and bring a great dynamic to our day-to-day work. Thank you to our consultants and our clients for their trust! Light Up Your Data

Listed Jobs

Company Name
LINCOLN
Job Title
Agentic AI Internship (F/H)
Job Description
**Job Title:** Agentic AI Internship (F/H)

**Role Summary:** Support Lincoln's generative AI R&D by researching agentic AI methods, identifying use cases, designing and implementing prototypes in an Azure cloud environment, and deploying user-facing interfaces, evaluation tools, and monitoring systems. Contribute to the company's innovation community and future offerings.

**Expectations:**
- Full-time internship starting in March.
- Final-year engineering or computer science student (or equivalent).
- Strong scientific background and motivation for state-of-the-art AI technologies.

**Key Responsibilities:**
- Conduct a technology watch and literature review on agentic AI.
- Identify and scope potential business use cases.
- Define project scope and methodology to meet objectives.
- Develop the project following coding standards (Python, Azure services).
- Create a user interface or CLI for easy use.
- Implement an evaluation framework and performance monitoring.
- Train end users on system usage.
- Share technical findings with Lincoln teams and support other innovation initiatives.
- Contribute to community activities and concept design for new service offerings.

**Required Skills:**
- Proficient in Python programming.
- Familiarity with Git version control.
- Experience deploying solutions on Microsoft Azure (cloud services, containers, or serverless).
- Ability to read and interpret scientific/technical literature.
- Strong English reading and written communication.
- Autonomous, analytical, and rigorous work ethic.

**Required Education & Certifications:**
- Engineering or Computer Science degree, final year or equivalent.
- Knowledge of AI, machine learning, and data analytics fundamentals.
- No formal certification required; relevant coursework in AI/ML/Data Science preferred.
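For orientation, the agentic pattern this internship revolves around can be sketched in a few lines: a model repeatedly chooses an action, the loop executes the matching tool, and the observation is fed back until the model answers. This is a minimal illustrative sketch, not Lincoln's actual stack: `stub_model` stands in for an LLM call (e.g. via an Azure-hosted model), and the `calculator` tool is hypothetical.

```python
# Minimal agent loop sketch: a stubbed "model" picks a tool, the loop
# executes it and feeds the observation back, until the model answers.
# In a real prototype, stub_model would be an LLM call on Azure.

def stub_model(question: str, observations: list[str]) -> dict:
    """Hypothetical policy: call the calculator once, then answer."""
    if not observations:
        return {"action": "calculator", "input": question}
    return {"action": "final_answer", "input": observations[-1]}

TOOLS = {
    # Toy arithmetic tool; builtins disabled to keep eval harmless here.
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
}

def run_agent(question: str, max_steps: int = 5) -> str:
    observations: list[str] = []
    for _ in range(max_steps):
        step = stub_model(question, observations)
        if step["action"] == "final_answer":
            return step["input"]
        observations.append(TOOLS[step["action"]](step["input"]))
    return "no answer within step budget"

print(run_agent("2 + 3 * 4"))  # → 14
```

The evaluation and monitoring responsibilities listed above would wrap this loop: logging each `step`, counting tool calls, and scoring final answers against a reference set.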
Boulogne-Billancourt, France
On site
22-12-2025
Company Name
LINCOLN
Job Title
Data Engineer GCP/AWS
Job Description
**Job Title:** Data Engineer GCP/AWS

**Role Summary:** Design, build, and maintain scalable data pipelines for large enterprise cloud environments (GCP or AWS). Optimize batch and real-time processing, model data for quality and performance, and industrialize data flows in a cloud context while collaborating cross-functionally with Data, DevOps, and Business teams.

**Expectations:**
- Senior-level expertise in a cloud platform (GCP or AWS).
- Proficient in SQL and Python for data transformation and automation.
- Solid grasp of cloud architectures, DataOps practices, and performance tuning.
- Experience architecting and deploying production-ready pipelines at scale.
- Ability to navigate complex, high-performance environments and deliver continuous improvement.

**Key Responsibilities:**
- Design, develop, and maintain robust, scalable data pipelines across GCP (BigQuery, Dataflow, Composer) or AWS (Glue, Redshift, EMR).
- Optimize batch and streaming data workflows for performance and cost efficiency.
- Model data schemas, enforce data quality, and implement governance practices.
- Industrialize data flows into automated, repeatable processes.
- Collaborate with Data, DevOps, and business stakeholders to define requirements and deliver solutions.
- Continuously improve Data Engineering practices, tooling, and documentation.

**Required Skills:**
- Deep knowledge of GCP services (BigQuery, Dataflow, Composer) or AWS services (Glue, Redshift, EMR).
- Strong SQL (analytic queries, schema design).
- Python programming for data processing, orchestration, and automation.
- Experience with cloud-native orchestration, scheduling, and monitoring (e.g., Cloud Composer, Airflow).
- Familiarity with DataOps principles, CI/CD for data pipelines, and infrastructure as code.
- Performance tuning, capacity planning, and cost optimization in cloud environments.

**Required Education & Certifications:**
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or related field.
- Relevant cloud certifications (e.g., Google Professional Data Engineer, AWS Big Data – Specialty).
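The core of the pipeline work described above (batch transformation, data-quality enforcement, idempotent loads) can be illustrated locally. This is a hedged sketch using the stdlib `sqlite3` module in place of BigQuery/Redshift; the `raw_events` and `daily_revenue` tables and their columns are invented for the example.

```python
import sqlite3

# Toy batch pipeline: extract raw events, apply a data-quality filter,
# load an aggregated table. In production the same logic would live in
# BigQuery SQL / Dataflow (GCP) or Glue / Redshift (AWS), scheduled by
# an orchestrator such as Cloud Composer / Airflow.

def run_pipeline(conn: sqlite3.Connection) -> None:
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS daily_revenue (day TEXT, total REAL)")
    cur.execute("DELETE FROM daily_revenue")  # truncate-and-reload: re-runs are idempotent
    cur.execute("""
        INSERT INTO daily_revenue (day, total)
        SELECT day, SUM(amount)
        FROM raw_events
        WHERE amount IS NOT NULL AND amount >= 0   -- data-quality rule
        GROUP BY day
    """)
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (day TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?)",
    [("2024-01-01", 10.0), ("2024-01-01", 5.0), ("2024-01-02", None), ("2024-01-02", 7.5)],
)
run_pipeline(conn)
print(conn.execute("SELECT day, total FROM daily_revenue ORDER BY day").fetchall())
# → [('2024-01-01', 15.0), ('2024-01-02', 7.5)]
```

The truncate-and-reload step is one simple way to make the flow "automated and repeatable" as the responsibilities require; incremental/merge loads are the usual alternative at scale.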
Boulogne-Billancourt, France
On site
27-02-2026
Company Name
LINCOLN
Job Title
Senior Data Scientist
Job Description
**Job Title:** Senior Data Scientist – Generative AI (GCP)

**Role Summary:** Design, develop, and industrialize generative AI solutions on Google Cloud Platform (GCP) for large enterprise clients, focusing on large language models (LLMs), natural language processing (NLP), and Retrieval-Augmented Generation (RAG).

**Expectations:**
- Senior-level expertise in generative AI with proven delivery of production-grade models.
- Ability to work independently in high-pressure, structured environments such as CAC 40 accounts.
- Strong cross-functional collaboration with Data, IT, and business stakeholders.

**Key Responsibilities:**
- Develop LLM, NLP, and RAG-based solutions tailored to client use cases.
- Implement, deploy, and scale models using Vertex AI on GCP.
- Build proof-of-concepts (POCs), evaluate performance, and transition successful prototypes to production.
- Optimize model accuracy, latency, and resource usage for enterprise deployment.
- Collaborate with Data Engineering, IT Operations, and domain experts to integrate solutions into business workflows.

**Required Skills:**
- Advanced knowledge of generative AI techniques and large language models.
- Proficiency in Vertex AI, BigQuery, and Python (TensorFlow/PyTorch).
- Experience with cleaning, preprocessing, and feature engineering for NLP pipelines.
- Strong understanding of model versioning, monitoring, and CI/CD for AI workflows.
- Excellent communication skills in English; French a plus.

**Required Education & Certifications:**
- Bachelor's degree in Computer Science, Data Science, Artificial Intelligence, or related field (Master's preferred).
- Relevant certifications: Google Cloud Professional Data Engineer or Professional AI Engineer (or equivalent).
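The RAG pattern named above reduces to two steps: retrieve documents relevant to the query, then generate an answer grounded in them. A minimal sketch under stated assumptions: retrieval here is keyword overlap over a tiny invented corpus, and the generation step is only a prompt assembly. A production system of the kind described would use an embedding model plus a vector index and call an LLM on Vertex AI.

```python
import re

# Minimal RAG sketch: pick the corpus document with the most word overlap
# with the query, then assemble a grounded prompt for a (here absent) LLM.
# CORPUS and all function names are illustrative.

CORPUS = [
    "Vertex AI is Google Cloud's managed machine-learning platform.",
    "BigQuery is a serverless data warehouse on Google Cloud.",
    "Retrieval-Augmented Generation grounds LLM answers in retrieved documents.",
]

def tokens(text: str) -> set[str]:
    """Lowercased alphanumeric tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(query: str, corpus: list[str]) -> str:
    q = tokens(query)
    return max(corpus, key=lambda doc: len(q & tokens(doc)))

def build_prompt(query: str) -> str:
    context = retrieve(query, CORPUS)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What is BigQuery?"))
```

Swapping `retrieve` for a dense retriever and sending `build_prompt`'s output to an LLM is exactly the POC-to-production path the responsibilities describe.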
Île-de-France, France
On site
Senior
27-02-2026