Company Name: JSAN Consulting Ltd
Job Title: Confluent Consulting Engineer
Job Description:
Role Summary:
Provide expert consulting on event‑driven architecture and Confluent Platform deployments. Design, implement, and optimize Kafka clusters on major cloud providers, integrate streaming services, and advise on best practices for data governance, observability, and scalability. Collaborate fully remotely with cross‑functional teams across Europe, with quarterly on‑site visits.
Expectations:
- Deliver end‑to‑end Kafka solutions for enterprise clients.
- Maintain high‑quality code, architecture, and production‑ready deployments.
- Share knowledge through documentation, demos, and team training.
Key Responsibilities:
1. Design, configure, and operate Apache Kafka clusters on AWS, GCP, or Azure.
2. Implement Confluent Platform components: Kafka Connect, Kafka Streams, ksqlDB, Schema Registry, and Confluent Control Center.
3. Build and maintain CI/CD pipelines, containerized deployments (Docker, Kubernetes), and infrastructure as code.
4. Monitor and tune streaming performance using Prometheus, Grafana, Splunk, or similar tools.
5. Implement data governance, RBAC, and lineage controls within Confluent environments.
6. Evaluate and integrate new streaming technologies (Apache Flink, ksqlDB Cloud, Confluent Cloud).
7. Mentor internal teams and client staff on best practices and architecture design.
Required Skills:
- 5+ years of experience with the Apache Kafka ecosystem.
- Strong programming skills in Java, Python, or Scala.
- Hands‑on experience deploying Kafka on AWS, GCP, or Azure.
- Proficient with Docker, Kubernetes, and CI/CD workflows.
- Deep understanding of event‑driven architecture, streaming patterns, and data pipelines.
- Familiarity with Kafka Connect, Kafka Streams, ksqlDB, and Confluent Schema Registry.
Bonus/Preferred Skills:
- Experience with Confluent Cloud (ksqlDB Cloud) or Apache Flink.
- Knowledge of stream governance, RBAC, and data lineage tools.
- Confluent certifications (Developer, Admin, Flink).
- Exposure to Prometheus, Grafana, or Splunk, and to data lake or data warehouse integration.
Required Education & Certifications:
- Bachelor’s degree in Computer Science, Software Engineering, or a related technical field.
- Confluent Certified Developer or Administrator (preferred).
---