Event-Driven Pub-Sub Design
A look at the benefits a pub-sub design can bring to applications that process large amounts of data from many integration points.
In a complex, enterprise-level application, fundamental entities can be updated from many integration points or endpoints, and many business rules are built on these entities' data. As the system grows, it becomes very difficult to track every integration point that updates the entities' data, which in turn makes it very hard to find all the places where a new business rule must be plugged in.
Let's look at an example to make this clear. Assume you have an inventory system with basic entities like Customer, Item, and Item Category. Whenever a CRUD operation touches one of these entities, you may want to execute new business rules, such as sending the data to a new partner, an MIS system, or another module in the system. Because these entities are updated from multiple use cases, it is difficult to trace all of them, capture the data, and execute the dependent business rules.
One way of achieving this is to integrate the new business rules directly into the entity classes' CRUD methods. This couples the rules tightly to the business entity class and violates the Single Responsibility Principle: every new business rule requires changing the entity's base class methods, which invites repeated regression testing as well.
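To make the coupling concrete, here is an illustrative sketch of that anti-pattern (the entity fields, rule methods, and the `executedRules` audit list are all hypothetical, added only to make the hard-coded wiring visible):

```java
import java.util.ArrayList;
import java.util.List;

// Anti-pattern: business rules are hard-coded inside the entity's own
// update method, so the Customer class must change (and be regression
// tested) every time a rule is added or removed.
class Customer {
    private String name;
    // Records which rules fired, purely to make the coupling visible.
    final List<String> executedRules = new ArrayList<>();

    public void update(String newName) {
        this.name = newName;
        sendToPartner();  // rule 1: baked into the entity
        pushToMis();      // rule 2: baked into the entity
        // ...every new rule forces another edit right here
    }

    private void sendToPartner() { executedRules.add("partner"); }
    private void pushToMis()     { executedRules.add("mis"); }
}
```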
Instead of integrating with the base class methods to capture data, we should follow the pub-sub design pattern. When a base entity is updated from any integration point, it publishes (broadcasts) the updated data to a single repository class. Anyone interested in that data plugs into the repository class.
The base entities should publish the updated data on every change. If the volume of updates is very large, a message queuing system is recommended; otherwise, the approach below is sufficient.
The diagram below depicts how entity data can be fed into listeners.
To ensure that all listeners share a common set of events, we should define an interface such as ICustomerSubPub in the diagram above. Anyone who wants to listen to data-update events of the Customer entity implements the ICustomerSubPub interface.
When customer data changes, the new data is sent to the CustomerDataProcessor class. The CustomerDataProcessor holds a list of listeners and forwards the updated customer data to each of them.
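A minimal sketch of this mechanism, using the interface and class names from the diagram; the method signature, the update payload, and the `PartnerFeedListener` example are assumptions for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Common event contract: every listener implements this interface.
interface ICustomerSubPub {
    void onCustomerUpdated(String customerId, String newData);
}

// Central repository class: entities publish here, listeners subscribe here.
class CustomerDataProcessor {
    private final List<ICustomerSubPub> listeners = new ArrayList<>();

    public void register(ICustomerSubPub listener) {
        listeners.add(listener);
    }

    // Called by the Customer entity whenever its data changes;
    // broadcasts the update to every registered listener.
    public void publish(String customerId, String newData) {
        for (ICustomerSubPub listener : listeners) {
            listener.onCustomerUpdated(customerId, newData);
        }
    }
}

// Example listener: a hypothetical partner feed that records updates.
class PartnerFeedListener implements ICustomerSubPub {
    final List<String> received = new ArrayList<>();

    @Override
    public void onCustomerUpdated(String customerId, String newData) {
        received.add(customerId + ":" + newData);
    }
}
```

Adding a new business rule then means writing one new listener class and registering it; the Customer entity itself never changes.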
The listeners receive the data and use it according to their own requirements. This approach has several benefits:
- There is no need to implement a complex queuing system.
- The Customer entity follows the Open/Closed Principle and the Single Responsibility Principle. To implement a new business rule, you add a new class that implements the ICustomerSubPub interface and register it with the CustomerDataProcessor's listener list. No changes are required to the actual entity classes, such as Customer or Item.
- Any changes to the listeners do not impact the base Customer entity.
The approach described here is only suitable for smaller transaction volumes. For large-scale data transactions, use a message queuing pattern instead.
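For the high-volume case, a production system would typically hand updates to a broker such as RabbitMQ or Kafka. As an in-process sketch of the same decoupling idea (all names here are hypothetical), `java.util.concurrent` provides a queue that lets the publisher drop an event and move on while consumers drain at their own pace:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// In-process stand-in for a message queue: the entity enqueues updates
// without waiting for slow consumers; listeners poll on their own schedule.
class CustomerUpdateQueue {
    private final BlockingQueue<String> queue = new LinkedBlockingQueue<>();

    // Producer side: called by the entity on every data change.
    public void publish(String update) {
        queue.offer(update);
    }

    // Consumer side: returns the next pending update, or null if empty.
    public String poll() {
        return queue.poll();
    }
}
```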