Distributed Systems Engineer & Consultant (m/f/d)

beON consult, Cologne, DE
30+ days ago
Employment type
  • Remote
  • Full-time
Job description

Responsibilities

  • As an Apache Kafka Consultant with profound distributed-systems experience, you will be responsible for administering and deploying customized, advanced event streaming platforms based on Apache Kafka, following current industry standards and using up-to-date tools and methods.
  • You work directly with customers and are responsible for the preparation, planning, migration, control, monitoring and implementation of highly scalable event streaming platforms and Kafka projects, as well as for comprehensive consulting on the current state of these technologies.
  • As a Consultant for Big Data Management and Stream Processing, your goal is to implement designs and architectures for streaming platforms and stream processing use cases using open-source and cloud tools.
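A typical stream processing use case of the kind described above is keyed aggregation over an unbounded event stream, of which word count is the canonical Kafka Streams example. The sketch below illustrates only the aggregation pattern (flatMap → groupBy → count) using plain Java collections rather than the Kafka Streams API, so it runs without a broker; the class and method names are illustrative, not part of any Kafka library:

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Minimal sketch of the classic "word count" stream processing use case.
// In a real deployment the same flatMap -> groupBy -> count logic would
// run as a KStream topology against Kafka topics; here it is modeled with
// plain Java collections to stay self-contained.
public class WordCountSketch {

    // Consume a batch of text events and aggregate counts per word,
    // mirroring KStream#flatMapValues -> groupBy -> count.
    static Map<String, Long> count(List<String> events) {
        Map<String, Long> counts = new TreeMap<>();
        for (String event : events) {
            for (String word : event.toLowerCase().split("\\s+")) {
                counts.merge(word, 1L, Long::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> events = List.of("kafka streams", "kafka is distributed");
        // Prints {distributed=1, is=1, kafka=2, streams=1}
        System.out.println(count(events));
    }
}
```

In the Kafka Streams version, the `TreeMap` would be replaced by a fault-tolerant state store and the counts would be emitted continuously as new events arrive, rather than computed once over a finished batch.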

Qualifications

  • Completed degree or comparable training with a technical background
  • Sound experience with and knowledge of Java
  • Experience with Apache Kafka or similar large-scale enterprise distributed data technologies (e.g. Spark, CockroachDB, HDFS, Hive)
  • Experience in software development and automation for running big data systems
  • Experience implementing complex solutions for Big Data and Data Analytics applications
  • Knowledge of system deployment and container technology: building, managing, deploying and release-managing containers and container images based on Docker, OpenShift and/or Kubernetes
  • Knowledge of developing resilient, scalable distributed systems and microservice architectures
  • Experience with at least one distributed technology (e.g. Kafka, Spark, CockroachDB, HDFS, Hive)
  • Experience with at least one stream processing framework (e.g. Kafka Streams, Spark Streaming, Flink, Storm)
  • Knowledge of Continuous Integration / Continuous Delivery (CI/CD) using Jenkins, Maven, Automake, Make, Grunt, Rake, Ant, Git, Subversion, Artifactory and Nexus
  • Understanding of SDLC processes (Agile, DevOps) and of Cloud Operations and Support (ITIL) service delivery
  • Knowledge of authentication mechanisms such as OAuth; knowledge of Vert.x and Spring Boot
  • Knowledge of SQL Azure and AWS development, and of cloud migration to AWS, Azure, Google Cloud Platform and/or hybrid/private clouds, as well as cloud-native end-to-end solutions, especially their key building blocks, workload types, migration patterns and tools
  • Experience with monitoring and logging tools such as New Relic, ELK, Splunk, Prometheus and Graylog
  • Ability to communicate technical ideas in a business-friendly language
  • Interest in modern organizational structures and an agile working environment (Scrum)
  • Customer orientation and enjoyment of working in an international environment in German and English