Understanding Event-Driven Architecture
Before diving into specific patterns, let’s establish a clear understanding of what constitutes an event-driven architecture.
What is an Event?
An event is a record of something that has happened—a fact. Events are immutable, meaning once an event has occurred, it cannot be changed or deleted. Events typically include:
- A unique identifier
- Event type or name
- Timestamp
- Payload (the data describing what happened)
- Metadata (additional contextual information)
Examples of events include:
- UserRegistered
- OrderPlaced
- PaymentProcessed
- InventoryUpdated
- ShipmentDelivered
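Taken together, an event is just a small, immutable record combining the fields listed above. A minimal sketch in Java (the record and field names here are illustrative, not a standard schema):

```java
import java.time.Instant;
import java.util.Map;
import java.util.UUID;

// A minimal immutable event carrying the typical fields:
// identifier, type, timestamp, payload, and metadata.
record UserRegistered(
        String eventId,                 // unique identifier
        String eventType,               // event type or name
        Instant timestamp,              // when it happened
        Map<String, String> payload,    // the data describing what happened
        Map<String, String> metadata    // additional contextual information
) {
    static UserRegistered of(Map<String, String> payload, Map<String, String> metadata) {
        return new UserRegistered(
                UUID.randomUUID().toString(),
                "UserRegistered",
                Instant.now(),
                Map.copyOf(payload),    // defensive copies keep the event immutable
                Map.copyOf(metadata));
    }
}
```

Using a record (or any class with only final fields) enforces immutability at the language level: once created, the fact cannot be altered.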
Core Components of Event-Driven Systems
Event-driven architectures typically consist of these key components:
- Event Producers: Systems or services that generate events when something notable happens
- Event Channels: The infrastructure that transports events from producers to consumers
- Event Consumers: Systems or services that react to events
- Event Store: Optional component that persists events for replay, audit, or analysis
┌───────────┐     ┌───────────┐     ┌───────────┐
│   Event   │     │   Event   │     │   Event   │
│ Producer  │────▶│  Channel  │────▶│ Consumer  │
└───────────┘     └───────────┘     └───────────┘
                        │
                        ▼
                  ┌───────────┐
                  │   Event   │
                  │   Store   │
                  └───────────┘
Key Event-Driven Architecture Patterns
Let’s explore the most important patterns in event-driven architecture and how they can be applied in distributed systems.
1. Publish-Subscribe Pattern
The publish-subscribe (pub-sub) pattern is the foundation of most event-driven systems. In this pattern:
- Publishers emit events without knowledge of who will consume them
- Subscribers express interest in specific types of events
- An event broker or message bus handles delivery
Implementation Example
Using Apache Kafka:
// Producer code
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

Producer<String, String> producer = new KafkaProducer<>(props);
ObjectMapper objectMapper = new ObjectMapper(); // Jackson, for JSON serialization

// Create an order event
OrderCreatedEvent event = new OrderCreatedEvent(
    UUID.randomUUID().toString(),
    "customer-123",
    Arrays.asList(new OrderItem("product-456", 2, 25.99))
);

// Serialize to JSON
String eventJson = objectMapper.writeValueAsString(event);

// Publish the event, keyed by order ID so all events for the same order
// land on the same partition and preserve their relative ordering
ProducerRecord<String, String> record =
    new ProducerRecord<>("order-events", event.getOrderId(), eventJson);
producer.send(record);
producer.close();
// Consumer code
Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");
props.put("group.id", "inventory-service");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Arrays.asList("order-events"));

while (true) {
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
    for (ConsumerRecord<String, String> record : records) {
        // Deserialize from JSON
        OrderCreatedEvent event = objectMapper.readValue(record.value(), OrderCreatedEvent.class);

        // Process the event
        inventoryService.reserveItems(event.getOrderId(), event.getItems());
    }
}
When to Use
The pub-sub pattern is ideal when:
- Multiple consumers need to react to the same event
- Publishers and subscribers need to evolve independently
- You need loose coupling between components
Challenges
- Guaranteeing at-least-once delivery
- Handling duplicate events
- Managing schema evolution
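The first two challenges go hand in hand: brokers that guarantee at-least-once delivery will sometimes redeliver an event, so consumers must be idempotent. One common approach is to track processed event IDs and skip duplicates. A minimal in-memory sketch (in production the ID set would live in a durable store, such as a database table keyed by event ID):

```java
import java.util.HashSet;
import java.util.Set;

// Idempotent event handling: apply each event's side effect at most once,
// even when the broker redelivers it.
class IdempotentHandler {
    private final Set<String> processedIds = new HashSet<>();

    // Returns true if the event was processed, false if it was a duplicate.
    boolean handle(String eventId, Runnable businessLogic) {
        if (!processedIds.add(eventId)) {
            return false; // already seen: skip the side effect
        }
        businessLogic.run();
        return true;
    }
}
```

The same dedup check applies regardless of broker: the event's unique identifier is exactly what makes this possible, which is one reason every event should carry one.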
2. Event Sourcing Pattern
Event sourcing persists the state of a business entity as a sequence of state-changing events. Instead of storing just the current state, you store the full history of actions that led to that state.
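To make the idea concrete, here is a small sketch of event sourcing for a bank account (the class and event names are illustrative): the balance is never stored directly, it is derived by replaying the stored event history in order.

```java
import java.util.List;

// Event sourcing sketch: current state is a fold over the event history.
class Account {
    sealed interface Event permits Deposited, Withdrawn {}
    record Deposited(long amountCents) implements Event {}
    record Withdrawn(long amountCents) implements Event {}

    private long balanceCents = 0;

    private void apply(Event e) {
        if (e instanceof Deposited d) balanceCents += d.amountCents();
        else if (e instanceof Withdrawn w) balanceCents -= w.amountCents();
    }

    // Rebuild current state by replaying the full event sequence.
    static Account replay(List<Event> history) {
        Account a = new Account();
        history.forEach(a::apply);
        return a;
    }

    long balanceCents() { return balanceCents; }
}
```

Because the history is the source of truth, the same replay mechanism gives you audit trails, point-in-time reconstruction, and the ability to fix a bug in `apply` and recompute every account's state from scratch.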