Eberhard Wolff about Kafka Patterns and Anti-Patterns

  • Published on Jun 29, 2024
  • Kafka has become one of the main drivers for event-oriented APIs and for systems and ecosystems using that style. Kafka is a reliable and highly scalable tool that, as an additional feature, can store all messages indefinitely. This makes it relatively easy to base patterns such as event sourcing on Kafka.
    Eberhard Wolff explains why this ability of Kafka (or similar message-oriented systems) may lead to unfortunate design decisions. It can turn the messaging system into the new centralized database, where all events are stored and managed centrally. Implementations that use it this way violate encapsulation, which says that components should not expose (parts of) their implementation to the outside.
    Instead, a better model is to treat messaging systems like Kafka not as part of the implementation of components, but strictly as communication infrastructure. At the very least, even when they are used as part of a component's implementation, the models these components store should be treated as private data rather than exposed implementation details. A small code sketch of this style of use follows below.
    The original video by Eberhard Wolff: • Kafka - The New Databa...
  • Science & Technology
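
As a rough, hypothetical illustration of the "communication infrastructure, not database" idea, the sketch below shows a component that keeps its full model in its own private store and publishes only a small notification event over Kafka. The OrderComponent class, the order-events topic, and the in-memory map standing in for the component's private database are all invented for illustration, not taken from the talk.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class OrderComponent {

    // The component's own model stays private, e.g. in its own database.
    // An in-memory map is used here only to keep the sketch self-contained.
    private final Map<String, String> privateOrderStore = new HashMap<>();

    private final KafkaProducer<String, String> producer;

    public OrderComponent(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        this.producer = new KafkaProducer<>(props);
    }

    public void placeOrder(String orderId, String internalModelJson) {
        // 1. Persist the full internal model privately.
        privateOrderStore.put(orderId, internalModelJson);

        // 2. Publish only a small notification event; Kafka is used as
        //    communication infrastructure, not as the component's database.
        String event = "{\"type\":\"OrderPlaced\",\"orderId\":\"" + orderId + "\"}";
        producer.send(new ProducerRecord<>("order-events", orderId, event));
    }

    public void close() {
        producer.close();
    }

    public static void main(String[] args) {
        OrderComponent orders = new OrderComponent("localhost:9092");
        orders.placeOrder("4711", "{\"customer\":\"c-42\",\"total\":99.90}");
        orders.close();
    }
}
```

The point of the sketch is that only the small notification event crosses the component boundary; the full internal model never travels over Kafka and stays private to the component.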

Comments • 6

  • @mindhoc • 1 year ago

    pure gold, i want to hear more

    • @ErikWilde • 1 year ago

      Thank you, I'll ask Eberhard! ;-)

  • @kevinfleischer2049 • 10 months ago

    It would be worth discussing the different strategies for event design: first, "what is an event?", and second, "what data should be inside a Kafka event?".
    Eberhard said that one customer just puts "everything they have" into an event.
    Alternatively, you could do the opposite: write only the type plus the key identifiers, and all attached consumers have to know which corresponding data is important for them and fetch it themselves.
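
To make the two strategies from the comment above concrete, here is a small, hypothetical sketch contrasting a "fat" event that carries everything the producer has with a "thin" event that carries only the type and the key identifier. The CustomerChanged type and all field names are invented for illustration.

```java
public class EventDesignExamples {

    // "Fat" event: the producer pushes its whole internal model into the event,
    // which exposes implementation details to every consumer.
    static final String FAT_EVENT =
        "{\"type\":\"CustomerChanged\"," +
        "\"customerId\":\"c-42\"," +
        "\"name\":\"Jane Doe\"," +
        "\"address\":\"Some Street 1\"," +
        "\"creditLimit\":5000}";

    // "Thin" event: only the type and the key; consumers that care fetch the
    // data they need from the owning component's API.
    static final String THIN_EVENT =
        "{\"type\":\"CustomerChanged\",\"customerId\":\"c-42\"}";

    public static void main(String[] args) {
        System.out.println("Fat event:  " + FAT_EVENT);
        System.out.println("Thin event: " + THIN_EVENT);
    }
}
```

The thin variant keeps the producer's model private but forces consumers to make an extra lookup; the fat variant avoids the lookup but couples every consumer to the producer's internal model.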

  • @julianfranco7689 • 1 year ago • +2

    Great conversation. Really eye-opening when thinking about the bigger picture and the dependency that gets created if Kafka is chosen as the source of truth. And that's without even going into the actual implementation overhead that comes from working with Kafka in some scenarios.
    I've started thinking of Kafka as a log more than a broker. The persistence is nice, but the added complexity on the consumer side due to the "dumb broker" model just didn't fit the bill for my use case.
    Nice shiny and powerful tool, but nothing is a silver bullet and everything comes at a cost. You just have to pick your poison in the end.

    • @ErikWilde • 1 year ago

      Exactly. It's a nice tool, but one that comes, like all tools, with side effects and constraints. And since it's so simple, you can use it in many different ways. I tend to ignore the persistence, but that's because I see it mostly as a broker.

  • @rahulsood81 • 1 year ago

    There's no context built up as part of the discussion. Everything seems so abstract that only these experts can understand it.