Paradigm Shift for Real-Time System Design
Event-first thinking changes how you think about what you are building. You capture facts and behavior, providing a representation of the real world; you model use cases as reactions to those facts; you support repeated evaluation and processing (a time machine); you gain horizontal scaling; and you speak the same language as the business. The approach is unique in that it processes each event as a reaction: the emitter doesn't call a specific function. The API has been removed; the emitter instead sends an event, does nothing else, and has no API coupling to a remote service.
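The "time machine" aspect is concrete: because events are immutable facts in an append-only log, state can be derived, and re-derived, by replaying them. A minimal sketch, assuming hypothetical Deposited/Withdrawn events on an in-memory log:

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # events are immutable facts
class Deposited:
    amount: int
    at: int  # logical timestamp

@dataclass(frozen=True)
class Withdrawn:
    amount: int
    at: int

# An append-only log of facts: nothing is ever updated in place.
log = [Deposited(100, at=1), Withdrawn(30, at=2), Deposited(50, at=3)]

def balance(events, as_of=None):
    """Replay the log to derive state; `as_of` turns it into a time machine."""
    total = 0
    for e in events:
        if as_of is not None and e.at > as_of:
            break
        total += e.amount if isinstance(e, Deposited) else -e.amount
    return total

print(balance(log))           # 120 -- current state
print(balance(log, as_of=2))  # 70  -- the state as it was at time 2
```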
The world is changing, and new problems need to be solved. Applications must be able to run 24×7 with five-nines availability (99.999% uptime) and be superelastic, global, and cloud-native. Enter event-driven architecture: a software architecture that ingests, processes, stores, and reacts to data, opening new capabilities in how businesses run. It was often said, and well proven throughout the NoSQL era, that systems are designed to suit particular workloads and use cases. Traditional architectures cannot meet these real-time challenges at mass scale; they have no mechanical sympathy for the workload and won't work. A time-series database doesn't handle relational use cases or modeling well; likewise, a document database is great at structure but not so good at analytics against those documents.
The emitter of the event doesn’t know which processors (or functions) will consume it, and the event becomes the API. This decoupling allows the consuming apps to change over time without any upstream changes required in the emitter.
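A minimal sketch of "the event becomes the API," using an in-memory bus as a stand-in for a real broker such as Kafka; the order_placed event name and the handlers here are illustrative assumptions:

```python
from collections import defaultdict

# A minimal in-memory event bus; a real system would use a broker.
subscribers = defaultdict(list)

def subscribe(event_type, handler):
    subscribers[event_type].append(handler)

def publish(event_type, payload):
    # The emitter knows only the event, never the consumers: the event is the API.
    for handler in subscribers[event_type]:
        handler(payload)

# Consumers register reactions; the emitter is unaware of them.
subscribe("order_placed", lambda o: print(f"billing: invoice {o['id']}"))
subscribe("order_placed", lambda o: print(f"shipping: pack {o['id']}"))

# Adding analytics later requires no change to the publishing code.
subscribe("order_placed", lambda o: print(f"analytics: count {o['id']}"))

publish("order_placed", {"id": 42})
```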
In the event-first analogy, I enter a room, an "entered room" event is generated, and the light turns on. The light turning on is a reaction to the event.
In the event-command analogy, I walk into a room, flip the light switch, and the light turns on. Flipping the switch is a command. In the event-first analogy, I do not know about and do not ask for the lights to be turned on; instead, a sensor detects my presence, and the responsibility for reacting lies with it.
The power of the event-first approach becomes clear when we try to enhance the system. In the event-command analogy, I am responsible for knowing how to turn on the light and making it happen; in the event-command pattern, the service knows the endpoints and APIs of the other services and calls them directly. At some point, we need to go back to basics, back to the first principles of system design, and start again.
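The two analogies side by side, as a sketch: the command version couples the caller to the light's API, while the event-first version lets us enhance the system by adding handlers without touching the emitter. All names here are illustrative:

```python
# Event-command: the caller knows the light and invokes it directly.
class Light:
    def turn_on(self):
        print("light on")

def walk_in_command(light):
    light.turn_on()  # caller is coupled to the light's API

# Event-first: the person only emits a fact; reactions live elsewhere.
handlers = []

def emit(event):
    for h in handlers:
        h(event)

def walk_in_event_first():
    emit("entered_room")  # no knowledge of what will react

# The sensor (handler) owns the responsibility of turning on the light.
light = Light()
handlers.append(lambda e: light.turn_on() if e == "entered_room" else None)
# Enhancing the system: add heating without touching walk_in_event_first.
handlers.append(lambda e: print("heating on") if e == "entered_room" else None)

walk_in_command(light)
walk_in_event_first()
```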
The common element of all these new-world problems is that they revolve around the notion of real-time events. It has taken the last 25 years to reach the point where the event is a core system design principle. Throughout that period, much of the world was stuck thinking that remote object calling was the right way of doing things. REST and web services continued the trend by exposing the API as the central concern, accessed via a synchronous protocol.
The value of events is that a sequence of related events represents behavior. For example, an item is added to and then removed from a shopping cart, an error recurs every 24 hours, or users always click through a site in a particular order. Start with the event rather than the coupled concept of a command: "event X has occurred," rather than "command Y should be executed." This thinking underpins event-driven streaming systems: a stream of events captures temporal behavior. There are many considerations when evaluating event-driven architecture. Events start as atomic and drive reactionary callbacks (functions), which raises many questions. Is ordering important? Do I need transactionality? How do I trust the execution? What about security and lineage, and where did the event come from?
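A small sketch of behavior recovered from an ordered sequence of events, here a hypothetical shopping-cart stream; note that ordering matters for the result to be meaningful:

```python
# A sequence of related events represents behavior: here we detect items
# that were added to a cart and later removed, which a single snapshot
# of the cart's final state could never reveal.
events = [
    ("add", "book"), ("add", "lamp"), ("remove", "lamp"), ("checkout", None),
]

added, churned = set(), []
for action, item in events:   # order matters: a remove before an add means nothing
    if action == "add":
        added.add(item)
    elif action == "remove" and item in added:
        added.discard(item)
        churned.append(item)

print(churned)  # ['lamp'] -- behavior recovered from the event stream
```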
The critical realization for adopting event-first thinking is that an event represents a fact: something happened, and it is immutable, which changes how we model our domain. The other consideration is that events do not exist in isolation. By its very nature, an event tends to be part of a flow of information, a stream.
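A minimal sketch of an event as part of a flow, assuming a hypothetical sensor source: the consumer reacts to each event as it arrives rather than polling stored state:

```python
import time
from typing import Iterator

def sensor_stream() -> Iterator[dict]:
    # Stands in for an unbounded source of temperature readings.
    for reading in (21.5, 22.0, 23.7):
        yield {"type": "temperature", "value": reading, "ts": time.time()}

# The consumer processes the flow event by event, reacting as facts arrive.
for event in sensor_stream():
    if event["value"] > 23.0:
        print(f"reacting: cooling on at {event['value']}°C")
```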