Ippon Technologies

fr.ippon.tech

4 Jobs

540 Employees

About the Company

USING OUR COLLECTIVE ENERGY FOR POSITIVE TECHNOLOGY
Ippon is a consulting and expertise firm convinced that technology is a source of progress for society. We help our clients leverage their digital assets, design an appropriate strategy, and deploy their transformation roadmap at scale.

Product · Data · Cloud & DevOps · Software Engineering

STRATEGY
Design your organization's strategy and deploy your roadmap at scale to deliver the value the market expects.

TECHNOLOGY
We rely on powerful technology platforms to drive innovation and modernization.

TRANSFORMATION
Accelerate your organization's digital transformation by applying tailored support methodologies.

Listed Jobs

Company Name
Ippon Technologies
Job Title
Full Stack Engineer (M/F)
Job Description
**Job Title:** Full Stack Engineer

**Role Summary:** Deliver end-to-end software solutions using cloud-native and web technologies. Work within Agile frameworks, influence architectural decisions, enforce coding standards, and mentor peers in a collaborative engineering culture.

**Expectations**
* Apply Agile practices (Scrum, XP, SAFe) in day-to-day delivery.
* Lead technical choices of frameworks, libraries, and CI/CD pipelines.
* Write maintainable, well-tested code following TDD/BDD principles.
* Share knowledge through code reviews, pair programming, and internal sessions.

**Key Responsibilities**
* Design, develop, and maintain scalable back-end services in Java/Spring, Node.js, or Python.
* Build interactive front-end interfaces with Angular, Vue, or React.
* Integrate front-end and back-end components via REST/GraphQL, ensuring secure and performant data flow.
* Create and manage CI/CD pipelines (Git, Docker, Kubernetes, Jenkins, GitHub Actions).
* Collaborate with cross-functional teams (product, design, QA) to deliver high-quality releases on schedule.
* Participate in architectural reviews, propose optimizations, and keep technical debt under control.
* Mentor junior developers and promote a culture of craftsmanship and continuous improvement.

**Required Skills**
* Proficiency in at least one modern back-end language: Java (Spring), Node.js, or Python.
* Strong front-end skills in Angular, Vue, or React, and experience with state-management libraries.
* Hands-on experience with REST/GraphQL APIs, WebSockets, and microservice patterns.
* Solid understanding of CI/CD tooling and container orchestration.
* Familiarity with Agile methodologies (Scrum, XP, SAFe) and modern project-management tools (Jira, Trello).
* Strong test-driven development mindset (unit, integration, BDD).
* Excellent communication, collaboration, and problem-solving skills.

**Required Education & Certifications**
* Bachelor's degree in Computer Science, Software Engineering, or a related discipline.
* Optional certifications: Java SE/EE certification, Scrum Master, or cloud platform credentials (AWS, Azure, GCP).
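To illustrate the kind of back-end work this role describes, here is a minimal sketch of a tested REST service in Python with FastAPI, one of the stack options the posting lists. The `Item` resource, routes, and in-memory store are illustrative, not taken from the posting.

```python
# Minimal REST back-end sketch in Python/FastAPI (one of the stacks the
# posting lists). The Item resource and in-memory store are illustrative.
from fastapi import FastAPI, HTTPException
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    id: int
    name: str

_items: dict[int, Item] = {}  # stand-in for a real persistence layer

@app.post("/items", status_code=201)
def create_item(item: Item) -> Item:
    if item.id in _items:
        raise HTTPException(status_code=409, detail="item already exists")
    _items[item.id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="item not found")
    return _items[item_id]

# A unit test in the TDD spirit the posting calls for.
def test_create_then_read():
    client = TestClient(app)
    assert client.post("/items", json={"id": 1, "name": "demo"}).status_code == 201
    assert client.get("/items/1").json()["name"] == "demo"
```

In Java/Spring, the same contract would map onto a `@RestController` with `@PostMapping`/`@GetMapping` handlers and a matching MockMvc test.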
Nantes, France
On site
17-11-2025
Company Name
Ippon Technologies
Job Title
Data Architect (Snowflake) – Richmond / Charlotte / Washington
Job Description
**Job Title:** Data Architect – Snowflake

**Role Summary:** Lead end-to-end Snowflake migrations for highly regulated clients, design scalable data architectures, drive pre-sales engagements, and provide strategic consulting on data strategy, governance, and optimization.

**Expectations**
- 10+ years building cloud data architectures, including 5+ years specializing in Snowflake.
- Strong client-oriented advisory skills with proven pre-sales success.
- SnowPro Core certified (SnowPro Advanced preferred).
- Bachelor's degree in Computer Science or equivalent experience.

**Key Responsibilities**
1. Lead complex Snowflake migrations in regulated environments.
2. Conduct client data assessments, identify optimization opportunities, and develop actionable data roadmaps.
3. Design and implement scalable Snowflake architectures, virtual warehouses, data sharing, and security configurations.
4. Build and optimize ETL/ELT pipelines (Airflow, dbt) for ingestion, transformation, and loading.
5. Implement data governance, quality checks, and compliance controls.
6. Deliver pre-sales presentations demonstrating Snowflake's value.
7. Mentor junior engineers and collaborate cross-functionally to ensure on-time, on-budget delivery.
8. Stay current on Snowflake and cloud data platform trends; share insights internally and with clients.

**Required Skills**
- Snowflake: architecture, performance tuning, cost optimization, Snowpipe, data sharing.
- ETL/ELT tooling: Airflow, dbt, or equivalent.
- Data modeling: advanced relational and dimensional design.
- Data governance, security, and compliance in regulated settings.
- Strong verbal and written communication; ability to explain technical concepts to non-technical stakeholders.
- Project management, time management, problem-solving, and meticulous attention to detail.

**Required Education & Certifications**
- Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
- SnowPro Core Certification (mandatory).
- SnowPro Advanced: Data Engineering Certification (preferred).
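As a sketch of the warehouse and cost-control configuration this role designs, the snippet below uses the snowflake-connector-python package to create an auto-suspending virtual warehouse capped by a resource monitor. Account credentials, the warehouse name, and the quota values are placeholders, not client specifics.

```python
# A sketch of Snowflake warehouse and cost controls via snowflake-connector-python.
# Account, credentials, and object names are placeholders, not client specifics.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="my_user",         # placeholder
    password="***",         # use a secrets manager in practice
    role="ACCOUNTADMIN",    # resource monitors require ACCOUNTADMIN
)

statements = [
    # Auto-suspend/auto-resume keep idle compute from burning credits:
    # one of the cost-optimization levers the posting mentions.
    """CREATE WAREHOUSE IF NOT EXISTS reporting_wh
         WITH WAREHOUSE_SIZE = 'XSMALL'
              AUTO_SUSPEND = 60
              AUTO_RESUME = TRUE""",
    # A resource monitor caps monthly credit spend, a common control
    # in the regulated environments the role targets.
    """CREATE OR REPLACE RESOURCE MONITOR monthly_cap
         WITH CREDIT_QUOTA = 100
         TRIGGERS ON 100 PERCENT DO SUSPEND""",
    "ALTER WAREHOUSE reporting_wh SET RESOURCE_MONITOR = monthly_cap",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```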
France
Remote
Senior
20-11-2025
Company Name
Ippon Technologies
Job Title
ML Engineer (M/F)
Job Description
**Job Title:** ML Engineer (Data & AI)

**Role Summary:** Design, develop, and deploy scalable AI/ML solutions on public cloud platforms. Integrate models into existing applications, industrialize workflows with MLOps, and implement generative AI services including LLMs, retrieval-augmented generation (RAG), and vector databases.

**Expectations**
- Proven experience in ML/AI engineering and production deployment.
- Strong knowledge of cloud services (AWS, GCP, Azure) and associated AI platforms (SageMaker, Vertex AI, Bedrock).
- Hands-on expertise with MLOps tools (MLflow, Kubeflow, Airflow, Terraform, GitLab).
- Ability to design end-to-end data pipelines and model deployment strategies.
- Solid programming skills in Python and experience with web frameworks (FastAPI, Flask) and generative AI libraries (LangChain, LlamaIndex).
- Understanding of vector search technologies (PostgreSQL pgvector, OpenSearch, Vertex AI Vector Search).
- Familiarity with CI/CD, containerization (Docker, Kubernetes), and serverless deployments (ECS, EKS, Lambda, Cloud Run).
- Strong collaboration skills with data science, data engineering, and DevOps teams, plus awareness of ethical and legal AI considerations.
- Continuous learner with a passion for emerging AI technologies.

**Key Responsibilities**
- Architect and develop AI/ML systems that scale on public cloud infrastructure.
- Build and maintain MLOps pipelines for training, evaluation, and production deployment.
- Integrate ML models into enterprise applications via APIs or embedded services.
- Deploy both custom and managed models across compute services (ECS, EKS, Lambda, Cloud Functions).
- Implement generative AI solutions: large language models, RAG workflows, and vector database integration.
- Perform model versioning, monitoring, and performance tuning.
- Lead code reviews and unit testing, following software craftsmanship practices.
- Advise stakeholders on AI ethics, legal compliance, and best practices.
- Contribute to technical blog posts and continuous learning activities.

**Required Skills**
- Python programming (deep learning frameworks, data pipelines).
- Cloud platforms: AWS, GCP, Azure (IAM, VPC, compute services).
- AI/ML services: SageMaker, Vertex AI, Bedrock, LlamaIndex, LangChain.
- MLOps: MLflow, Kubeflow, Airflow, Step Functions, Cloud Composer.
- CI/CD & IaC: GitLab CI, Terraform, Docker, Kubernetes.
- Serverless/containerized deployment: Lambda, ECS, EKS, Cloud Functions.
- Databases & vector search: PostgreSQL pgvector, OpenSearch, Vertex AI Vector Search.
- Agile development, test-driven development, and DevOps culture.
- Strong written and verbal communication for cross-functional collaboration.

**Required Education & Certifications**
- Bachelor's or Master's degree in Computer Science, Electrical Engineering, Applied Mathematics, or a related field.
- Preferred certifications: AWS Certified Machine Learning – Specialty, GCP Professional Data Engineer/ML Engineer, Azure AI Engineer Associate.
- Experience with professional training programs (e.g., BlackBelt) is a plus.
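Below is a minimal sketch of the RAG retrieval step the posting names, using PostgreSQL with the pgvector extension. The `embed()` stub stands in for a real embedding model (e.g., one served from SageMaker, Vertex AI, or Bedrock); the DSN, table name, and toy 3-dimensional vectors are illustrative only.

```python
# A minimal retrieval sketch for the RAG pattern named above, using PostgreSQL
# with the pgvector extension. embed() is a stub standing in for a real
# embedding model; the DSN, table, and 3-dim vectors are illustrative.
import psycopg2

def embed(text: str) -> str:
    # Placeholder embedding: a real system would call a model (e.g. via
    # SageMaker, Vertex AI, or Bedrock) and return hundreds of dimensions.
    v = [len(text) % 7, len(text) % 5, len(text) % 3]
    return "[" + ",".join(str(float(x)) for x in v) + "]"  # pgvector literal

conn = psycopg2.connect("dbname=rag_demo")  # placeholder DSN
cur = conn.cursor()
cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""CREATE TABLE IF NOT EXISTS docs (
                 id serial PRIMARY KEY,
                 content text,
                 embedding vector(3))""")

for doc in ["intro to snowpipe", "fastapi deployment notes", "kafka basics"]:
    cur.execute("INSERT INTO docs (content, embedding) VALUES (%s, %s)",
                (doc, embed(doc)))
conn.commit()

# Nearest-neighbour search: '<->' is pgvector's L2 distance operator.
cur.execute("SELECT content FROM docs ORDER BY embedding <-> %s::vector LIMIT 2",
            (embed("how do I deploy fastapi?"),))
context = [row[0] for row in cur.fetchall()]
# 'context' would then be placed into the LLM prompt to ground its answer.
print(context)
```

In a LangChain- or LlamaIndex-based setup, the store, retriever, and prompt assembly shown here would be handled by those libraries' abstractions.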
Lyon, France
Hybrid
24-12-2025
Company Name
Ippon Technologies
Job Title
Data Engineer (M/F)
Job Description
**Job Title:** Data Engineer (F/M)

**Role Summary:** Design, build, and industrialize data pipelines for clients across cloud platforms. Collaborate with business users and data scientists to support end-to-end data workflows, from ingestion through processing to exposure, while ensuring scalable, high-quality delivery.

**Expectations:**
- Deliver robust, reusable data pipelines in a fast-moving agile environment.
- Provide cloud infrastructure as code and enable data science workflows.
- Share knowledge through community engagement and internal documentation.

**Key Responsibilities:**
- Develop new data pipelines (ingestion, transformation, exposure) on client data platforms.
- Support data scientists in industrializing models (testing, CI/CD, scalability, craftsmanship).
- Deploy full cloud infrastructure using IaC (Terraform, CloudFormation).
- Participate in internal and external data community events.
- Author technical articles, retrospectives, and internal blog posts to disseminate best practices.

**Required Skills:**
- Strong data engineering background covering pipelines, batch, and stream processing.
- Experience with distributed compute frameworks (Spark, Storm, Flink).
- Proficiency in SQL and data storage systems (SQL and NoSQL); familiarity with Snowflake is an asset.
- Knowledge of streaming technologies (Kafka, Amazon Kinesis).
- Hands-on exposure to cloud services (AWS, GCP, Azure) and IaC tools (Terraform, CloudFormation).
- Experience with dbt preferred.
- Agile delivery mindset and commitment to code quality.
- Excellent communication and collaboration skills; ability to share and learn within a community.

**Required Education & Certifications:**
- Master's-level degree (Bac+5) in Computer Science, Engineering, Data Engineering, or a related field.
- Professional certifications (e.g., cloud, data engineering) are a plus but not mandatory.
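As an illustration of the ingestion-transformation-exposure flow this role describes, here is a minimal batch-pipeline sketch in PySpark, one of the distributed frameworks the posting lists. The bucket paths, column names, and aggregation are assumptions for the example, not client details.

```python
# A minimal batch-pipeline sketch with PySpark, one of the distributed
# frameworks the posting lists. Paths and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Ingestion: read raw events (the path is a placeholder).
orders = spark.read.json("s3://example-bucket/raw/orders/")

# Transformation: basic cleansing plus a daily revenue aggregate.
daily_revenue = (
    orders
    .filter(F.col("status") == "completed")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Exposure: write a partitioned, queryable dataset for downstream users.
(daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_revenue/"))

spark.stop()
```

The same ingest/transform/expose shape carries over to streaming (Kafka or Kinesis sources) and to dbt models on Snowflake, which the posting lists as related skills.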
Toulouse, France
Hybrid
08-01-2026