Why You Should Adopt Event-Driven Architecture
When an event is emitted, the consumers that subscribe to it decide how to react; the emitter does not need to know who they are. This decouples the two sides: applications that use the event can change without requiring changes in the emitter. For example, when you enter a room and an “entered room” event is generated, the light turns on in reaction to the event rather than in response to a command. A command-based approach, in contrast, would have you flip a light switch to turn on the light. Adopting an event-first mindset means going beyond the primary use cases and understanding processes and architecture at a deeper level.
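The decoupling above can be sketched with a minimal in-process event bus. This is an illustrative sketch, not a production pattern: the names `EventBus`, `entered_room`, and the payload fields are all assumptions made for the example.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Minimal in-process event bus: emitters publish facts and hold
    no references to the handlers that react to them."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event_type: str, handler: Callable) -> None:
        self._handlers[event_type].append(handler)

    def emit(self, event_type: str, payload: dict) -> None:
        # The emitter does not choose the reactions; it only states a fact.
        for handler in self._handlers[event_type]:
            handler(payload)

bus = EventBus()

# The light reacts to the event; the person entering the room issues
# no command and knows nothing about the light.
lights_turned_on = []
bus.subscribe("entered_room", lambda e: lights_turned_on.append(e["room"]))

bus.emit("entered_room", {"room": "kitchen"})
# lights_turned_on -> ["kitchen"]
```

New reactions (an alarm, an occupancy counter) can subscribe later without any change to the code that emits the event.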
The event-first process supports repeated evaluation and reprocessing of the same events, providing a “time machine.” It also offers horizontal scaling and a shared language with the business. In this paradigm shift, we discard traditional messaging and emit events without API coupling to a remote service: events are processed as reactions, without the emitter calling a specific function. This simplifies the architecture and allows for greater flexibility. At some point, we need to go back to basics, back to the first principles of system design, and start again.
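The “time machine” property follows from keeping events in an append-only log: state can always be rebuilt by replaying history, and a consumer added months later can reprocess every event from the beginning. A minimal sketch, with hypothetical event shapes:

```python
# An append-only log of shopping-cart events (field names are assumptions).
event_log = [
    {"type": "item_added", "item": "book"},
    {"type": "item_added", "item": "pen"},
    {"type": "item_removed", "item": "pen"},
]

def rebuild_cart(events):
    """Derive the current cart state by replaying every event in order.
    Any consumer, added at any time, can run this over the full log."""
    cart = set()
    for e in events:
        if e["type"] == "item_added":
            cart.add(e["item"])
        elif e["type"] == "item_removed":
            cart.discard(e["item"])
    return cart

# rebuild_cart(event_log) -> {"book"}
```

Because the log is the source of truth, fixing a bug in `rebuild_cart` and replaying the log yields corrected state with no change to the emitters.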
The common element of all these new-world problems is that they revolve around real-time events. It has taken the last 25 years to reach the point where the event is a core system design principle. Throughout that period, much of the industry was stuck thinking that remote object invocation was the right way of doing things; REST and web services began by exposing the API as the primary concern, accessed over a synchronous protocol.
The value of events is that a sequence of related events represents behavior: an item is added to and then removed from a shopping cart, an error recurs every 24 hours, users always click through a site in a particular order. Start with the *event* rather than the coupled concept of a command. This thinking underpins event-driven streaming systems: a stream of events captures temporal behavior. There are many considerations when evaluating event-driven architecture. Events start as atomic facts and drive reactionary callbacks (functions), which raises many questions. Is ordering important? Do I need transactionality? How do I trust the execution? What about security and lineage? Where did the event come from?
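A single event says little on its own; the ordered sequence is what reveals behavior. As a small sketch of the “users click through a site in a particular order” example, the check below tests whether a user's click events contain the expected funnel as an ordered subsequence (the funnel steps and page names are made up for illustration):

```python
# Hypothetical funnel: the pages a converting user visits, in order.
FUNNEL = ["home", "product", "checkout"]

def followed_funnel(clicks, funnel=FUNNEL):
    """Return True if the funnel steps appear in `clicks` in order
    (other clicks may be interleaved between them)."""
    it = iter(clicks)  # shared iterator preserves the temporal order
    return all(any(page == step for page in it) for step in funnel)

clicks = ["home", "search", "product", "reviews", "checkout"]
# followed_funnel(clicks) -> True
# followed_funnel(["checkout", "home"]) -> False (wrong order)
```

Note that answering this question requires the events to be ordered, one of the considerations raised above.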
The critical realization for adopting event-first thinking is that an event represents a fact: something happened, and that fact is immutable. This changes how we model our domain. The other consideration is that events do not exist in isolation; by their very nature, an event tends to be part of a flow of information, a stream. A stream is an unbounded sequence of related events tied together by an “eventKey.” Applications must run with 99.999% uptime and be superelastic, global, and cloud-native, and this is where event-driven architecture provides a solid foundation for today’s business demands.
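The role of the eventKey can be sketched by grouping an interleaved stream by key, so that each entity's events form their own ordered sub-stream. The key and field names here (an order ID) are assumptions for the example:

```python
from collections import defaultdict

# Interleaved events from many orders; "key" is the eventKey that
# ties each order's events together (field names are illustrative).
events = [
    {"key": "order-1", "type": "created"},
    {"key": "order-2", "type": "created"},
    {"key": "order-1", "type": "paid"},
    {"key": "order-1", "type": "shipped"},
]

# Partition the stream by eventKey, preserving per-key order.
by_key = defaultdict(list)
for e in events:
    by_key[e["key"]].append(e["type"])

# by_key["order-1"] -> ["created", "paid", "shipped"]
```

Streaming platforms apply the same idea continuously over an unbounded stream, typically using the key to route all of an entity's events to the same partition so their relative order is preserved.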