We are looking for a Kafka Engineer to join a cross-functional team managing real-time data streaming architecture and operations. In this role, you will contribute to designing, implementing, and maintaining Kafka-based messaging solutions that support business-critical applications. You’ll collaborate closely with infrastructure, development, and operations teams to ensure the stability, scalability, and performance of the platform.
You will play a key part in transforming event-driven architecture and streaming services, while also acting as a reference for best practices and Kafka expertise across the team.
What we expect from you
- Proven experience working with Apache Kafka in production environments.
- Hands-on knowledge of Kafka ecosystem tools: Kafka Connect, Kafka Streams, Schema Registry, and Confluent components.
- Ability to design, deploy, monitor, and maintain Kafka clusters.
- Strong understanding of distributed systems, high availability, and data replication principles.
- Good scripting and automation skills (Bash, Python, or similar).
- Familiarity with DevOps practices, CI/CD pipelines, and monitoring tools (e.g., Prometheus, Grafana).
- Experience with Docker and Kubernetes is an advantage.
- Understanding of data security, message serialization formats (Avro, JSON, Protobuf), and data governance principles.
- Ability to troubleshoot complex production issues and ensure system resilience.