WHIZE

whize.fr

3 Jobs

21 Employees

About the Company

Founded in 2022, WHIZE is a subsidiary of the Neurones group, specializing in:
Custom software development on serverless architectures (Azure, AWS, GCP)
Deployment of business-intelligence solutions through our Data Intelligence offering

At WHIZE, our employees are our most valuable resource and the key to the success of our clients' projects. Our HR policy is built on attentive listening, constant care, and personalized follow-up, reflecting our core values.

Listed Jobs

Company Name
WHIZE
Job Title
Consultant DevOps Docker/Kubernetes
Job Description
Role Summary:
Enable, deploy, and maintain containerized applications on Kubernetes across on-premises and cloud environments, ensuring production readiness, monitoring, and automation while guiding development and infrastructure teams.

Expectations:
Deliver high-availability, secure, and scalable solutions; demonstrate strong DevOps practices; collaborate across development, security, and operations; continuously improve deployment pipelines and platform operations.

Key Responsibilities:
- Onboard and migrate applications to the Kubernetes platform.
- Deploy new technology solutions and manage existing applications on Kubernetes.
- Implement production-ready environments with comprehensive monitoring and logging.
- Provide consulting, training, and support to development and infrastructure teams.
- Expand on-prem deployments to cloud providers (AWS, Azure, GCP).
- Automate and review existing processes and pipelines.
- Integrate new platforms during mergers, projects, or expansions.
- Maintain the operational health of applications and the underlying platform.

Required Skills:
- Proficiency with Docker, Kubernetes, Rancher, and ArgoCD.
- Experience with CI/CD platforms (Jenkins, GitLab CI, etc.).
- Knowledge of monitoring and observability tools (Grafana, Prometheus, Sysdig).
- Familiarity with SSO and application security tools.
- Understanding of middleware management in both traditional and cloud environments.
- Programming exposure in Java, .NET, Node.js, or Python.
- Strong scripting and automation skills.

Required Education & Certifications:
- Minimum bachelor's degree in Computer Science, Software Engineering, or a related field.
- Kubernetes (CKA, CKAD) and Docker (DCA) certifications, or equivalent DevOps credentials, are highly desirable.
Neuilly-sur-Seine, France
Hybrid
31-12-2025
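The role above centers on getting applications production-ready on Kubernetes. As a minimal illustration of the kind of Deployment manifest involved, here is a sketch that builds one as a plain Python dict; the application name, image, port, and probe path are all illustrative placeholders, not details from the listing:

```python
import json

def make_deployment(name: str, image: str, replicas: int = 2) -> dict:
    """Build a minimal Kubernetes Deployment manifest as a dict.
    All names, ports, and paths here are illustrative placeholders."""
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": labels},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 8080}],
                        # A liveness probe is part of the "production-ready"
                        # posture the listing asks for.
                        "livenessProbe": {
                            "httpGet": {"path": "/healthz", "port": 8080},
                            "initialDelaySeconds": 5,
                        },
                    }]
                },
            },
        },
    }

manifest = make_deployment("demo-app", "registry.example.com/demo-app:1.0")
print(json.dumps(manifest, indent=2))  # kubectl accepts JSON as well as YAML
```

In practice such manifests would live in Git and be synced by ArgoCD rather than generated in code; this sketch only shows the required structure.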
Company Name
WHIZE
Job Title
Consultant Big Data / PySpark
Job Description
Role Summary:
Serve as a senior data engineer responsible for designing, developing, and operating scalable distributed data pipelines in a serverless cloud environment. Deliver automated data availability on a Big Data platform, optimize batch and streaming processes, and adapt data models to evolving business requirements.

Expectations:
Deliver reliable, high-performance data pipelines using PySpark; ensure data quality, scalability, and governance; maintain platform availability and performance; collaborate across technical and business domains to translate requirements into data solutions.

Key Responsibilities:
- Develop and deploy distributed data pipelines with PySpark on cloud-native platforms (Azure, AWS, GCP).
- Automate batch and streaming data ingestion, transformation, and delivery workflows.
- Optimize existing pipelines for performance, cost, and resource utilization.
- Refactor and evolve data schemas, ETL logic, and metadata to align with business needs and Big Data best practices.
- Monitor platform health, data quality, and resource consumption; implement alerting and remediation strategies.
- Evaluate, test, and recommend new tools, frameworks, and technologies to enhance performance, security, and governance.
- Participate in platform migration or upgrade initiatives and support version transitions.

Required Skills:
- Proficiency in PySpark for large-scale distributed data processing.
- Strong Python and SQL programming skills.
- Experience with Big Data ecosystems (Hadoop, Spark, Data Lake, Data Warehouse).
- Knowledge of cloud data services (Azure Data Lake, AWS Redshift/Athena, GCP BigQuery).
- Ability to design and maintain scalable, fault-tolerant data pipelines.
- Familiarity with DevOps practices (CI/CD, infrastructure as code, monitoring).
- Good communication skills for collaborating with technical and business stakeholders.

Required Education & Certifications:
- Bachelor's degree in Computer Science, Information Technology, Data Engineering, or a related field.
- Optional certifications: Microsoft Certified: Azure Data Engineer Associate; AWS Certified Data Analytics – Specialty; Databricks Certified Associate Developer.
Neuilly-sur-Seine, France
Hybrid
30-01-2026
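A recurring task in the pipeline work described above is compacting batch ingests so only the latest record per key survives. The sketch below shows that logic in plain Python as a stand-in for the equivalent PySpark groupBy/window pattern (in PySpark it would operate on a DataFrame rather than a list of dicts); the field names `id`, `ts`, and `value` are illustrative:

```python
def dedupe_latest(rows):
    """Keep the most recent record per key - a plain-Python stand-in
    for the PySpark groupBy/max compaction used on batch ingests.
    Field names ('id', 'ts', 'value') are illustrative."""
    latest = {}
    for row in rows:
        key = row["id"]
        # ISO-8601 date strings compare correctly as plain strings
        if key not in latest or row["ts"] > latest[key]["ts"]:
            latest[key] = row
    return sorted(latest.values(), key=lambda r: r["id"])

events = [
    {"id": 1, "ts": "2026-01-01", "value": 10},
    {"id": 1, "ts": "2026-01-05", "value": 12},  # newer duplicate wins
    {"id": 2, "ts": "2026-01-03", "value": 7},
]
print(dedupe_latest(events))
```

On the actual platform this step would run distributed across executors; the single-machine version only illustrates the transformation's semantics.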
Company Name
WHIZE
Job Title
Technical Tableau Consultant
Job Description
Role Summary:
Support clients in building data visualization solutions with Tableau to derive actionable insights from their data. Focus on designing, developing, and optimizing dashboards and reports while ensuring data accuracy, performance, and security.

Expectations:
- Experience in business intelligence or data visualization projects
- Mastery of the Tableau toolset (Desktop, Server/Cloud, Prep) for data analysis and reporting
- Strong analytical skills to translate business needs into technical solutions

Key Responsibilities:
- Gather and analyze client requirements to design tailored BI solutions
- Develop and optimize interactive dashboards, reports, and data models in Tableau
- Prepare and clean data with Tableau Prep or ETL tools for analytical readiness
- Integrate diverse data sources (SQL databases, APIs, ERPs, etc.)
- Ensure performance optimization and data governance in Tableau environments
- Train end users on effective use of Tableau for decision-making
- Monitor and adopt advancements in Tableau technologies

Required Skills:
- Advanced proficiency in Tableau Desktop, Server/Cloud, and Prep
- Strong SQL skills and understanding of relational databases
- Data-storytelling abilities to communicate complex insights simply
- Familiarity with ETL concepts and multi-source data integration
- Analytical mindset with attention to detail and problem-solving

Required Education & Certifications:
- Bachelor's degree in computer science, business analytics, or a related field
- Professional certifications in Tableau or BI methodologies are advantageous but not mandatory
Île-de-France, France
Hybrid
30-01-2026
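The Tableau role above pairs dashboard work with SQL-side data preparation. As a minimal sketch of that preparation step, the snippet below aggregates raw rows into the tidy extract a dashboard would consume, using Python's built-in sqlite3 so it runs anywhere; the `sales` table and its columns are illustrative, not from the listing:

```python
import sqlite3

# Sketch of the SQL data preparation behind a Tableau dashboard:
# collapse raw transaction rows into one aggregated row per region.
# Table and column names are illustrative placeholders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("North", 120.0), ("North", 80.0), ("South", 200.0)],
)

# The aggregated extract a dashboard or Tableau data source would read
extract = conn.execute(
    "SELECT region, SUM(amount) AS total, COUNT(*) AS n "
    "FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(extract)  # [('North', 200.0, 2), ('South', 200.0, 1)]
conn.close()
```

In a client engagement the same GROUP BY shape would be expressed against the client's warehouse or in Tableau Prep; pre-aggregating like this keeps dashboards responsive.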