Event-Driven Architecture Series Part III — Building Blocks and Flows
After some time away, a good way to resume this little series is to write a few lines on the elements that make up an event-driven solution and its underlying architecture, especially to clarify the basics that must be taken into account.
In this article we will list the building blocks and elements needed to create an organized workflow in Kafka.
Big Building Blocks
Looking at the large blocks we need in order to develop this type of technological solution, we can consider data sources to be applications and systems that send information outside their main domain of action.
These data sources are diverse and heterogeneous in nature, characterized by the need to process information online and to distribute it through this medium, sharing data accurately and in a standard way in real time.
The information sent consists of events that happen in the application, with associated information that, at a basic level, is divided into:
- a unique identifier for the event
- values related to the event itself
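As a rough sketch of this two-part event shape (the class and field names here are ours, purely illustrative, not from any Kafka API), an event could be modeled like this:

```python
import uuid
from dataclasses import dataclass, field

# Hypothetical event shape: a unique identifier plus the values
# (payload) related to the event itself.
@dataclass
class Event:
    payload: dict
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))

# Example: an order event carrying its related values.
order_created = Event(payload={"order": 42, "status": "CREATED"})
```

Generating the identifier at creation time guarantees that every event instance can be traced individually, even when two events carry identical values.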
For the related values, it becomes important to validate the schemas both when producing and when consuming events, together with the event governance that extensive use inherently requires.
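A minimal sketch of validating at both ends of the flow could look like the following. In a real Kafka deployment this is usually delegated to a schema registry (for example Avro or JSON Schema with Confluent Schema Registry); the function and schema names below are ours, for illustration only.

```python
# Expected fields and their types for one hypothetical event kind.
ORDER_SCHEMA = {"order": int, "status": str}

def validate(event_values: dict, schema: dict) -> bool:
    """Check the event's related values against the schema — run this
    both before producing an event and after consuming one."""
    return (set(event_values) == set(schema)
            and all(isinstance(event_values[k], t) for k, t in schema.items()))

ok = validate({"order": 42, "status": "CREATED"}, ORDER_SCHEMA)   # valid shape
bad = validate({"order": "42"}, ORDER_SCHEMA)                     # wrong type, missing field
```

Applying the same check on both sides is what makes the sharing "accurate and standard": a producer cannot emit a malformed event, and a consumer can reject one that slipped through.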
Data destinations can also be heterogeneous in nature, but they may require processing of the information with a transformational character (e.g. changes to the data format, or the conjunction of several events) or with enrichment (e.g. through an external data source), which must be performed by the event engine.
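The two processing styles mentioned, transformation and enrichment, can be sketched as plain functions that the event engine would apply in sequence. Every name here (the field names, the stand-in customer store) is an assumption made for the example:

```python
# Stand-in for an external data source used for enrichment.
CUSTOMER_DB = {7: {"name": "Ada", "tier": "gold"}}

def transform(event: dict) -> dict:
    """Transformation: reshape the data format (rename fields,
    convert cents to euros)."""
    return {"customer_id": event["custId"], "amount_eur": event["amount"] / 100}

def enrich(event: dict) -> dict:
    """Enrichment: merge in attributes from the external data source."""
    extra = CUSTOMER_DB.get(event["customer_id"], {})
    return {**event, **extra}

# The engine applies transformation first, then enrichment.
processed = enrich(transform({"custId": 7, "amount": 1999}))
```

Keeping each step a pure function of the incoming event is what lets the engine compose them freely per destination.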
Elements for the solution
Therefore, to serve these elements we can propose an architecture using Kafka, with:
- Producer and consumer.
- Event engine based on these elements: topics, from 1 to n, for the producer and the consumer for the…
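To make the roles above concrete, here is a toy in-memory model of the producer → topics (1 to n) → consumer flow. This is deliberately NOT the Kafka client API — real code would use a client library such as confluent-kafka or kafka-python against a broker — it only illustrates how the three elements relate:

```python
from collections import defaultdict

class EventEngine:
    """Toy event engine: each topic is an ordered, append-only log."""

    def __init__(self):
        self.topics = defaultdict(list)  # topic name -> list of events

    def produce(self, topic: str, event: dict) -> None:
        """Producer side: append an event to one of the 1..n topics."""
        self.topics[topic].append(event)

    def consume(self, topic: str):
        """Consumer side: read a topic's events in production order."""
        yield from self.topics[topic]

engine = EventEngine()
engine.produce("orders", {"order": 1})
engine.produce("orders", {"order": 2})
engine.produce("payments", {"payment": 99})
received = list(engine.consume("orders"))
```

Even in this miniature, the key property of the architecture is visible: producers and consumers never talk to each other directly, only through named topics, which is what decouples the data sources from the data destinations.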