Integrating Red Hat Single Sign-On With AMQ Streams for Auditing Events
Do you need to audit the events of your SSO instance? Check out this easy way to do it with AMQ Streams.
Here I am again with another take from the field.
Red Hat Single Sign-On (RH-SSO) is the enterprise-ready version of Keycloak, and one of the questions asked most often, especially by big customers, is: "How do we audit all the events?"
The out-of-the-box way to do this is to save the events to the database, but that poses a silent and somewhat problematic threat: performance. RH-SSO generates a ton of events, and when complex authorization flows with lots of steps are involved, persisting all of them can become a performance burden.
Enter AMQ Streams, Red Hat's enterprise-ready version of Apache Kafka. We can route the events to it with a simple Service Provider Interface (SPI) implementation, which we're going to build below.
All the code written below is available on GitHub.
So first, we download the RH-SSO ZIP distribution, unzip it, and start the server.
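A minimal sketch of those steps, assuming the ZIP (the file name varies by version; 7.4 is used here as an example) was already downloaded from the Red Hat Customer Portal:

```shell
# Assumes rh-sso-7.4.zip was downloaded from the Red Hat Customer Portal
unzip rh-sso-7.4.zip
cd rh-sso-7.4

# Start the server in standalone mode (listens on port 8080 by default)
./bin/standalone.sh
```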
Once it is up, we visit http://localhost:8080/auth to verify that it started correctly.
Now, with RH-SSO started and working, we're going to build our custom SPI.
First, we create a simple Java project with the Keycloak SPI and Kafka client dependencies.
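The original dependency listing isn't reproduced here; a plausible `pom.xml` fragment would look like the one below. The version numbers are assumptions and should be aligned with your RH-SSO and AMQ Streams releases.

```xml
<dependencies>
  <!-- Keycloak/RH-SSO SPI classes; provided by the server at runtime -->
  <dependency>
    <groupId>org.keycloak</groupId>
    <artifactId>keycloak-core</artifactId>
    <version>9.0.3</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.keycloak</groupId>
    <artifactId>keycloak-server-spi</artifactId>
    <version>9.0.3</version>
    <scope>provided</scope>
  </dependency>
  <!-- Kafka client for talking to AMQ Streams -->
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.4.0</version>
  </dependency>
</dependencies>
```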
To build our custom SPI, we start by creating a class that implements the EventListenerProvider interface. This is a very straightforward interface: it has two methods, one to handle login (user) events and one to handle admin events.
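The original listing isn't shown here; a minimal skeleton of that class, with a hypothetical package and class name, could look like this:

```java
package com.example.audit; // hypothetical package name

import org.keycloak.events.Event;
import org.keycloak.events.EventListenerProvider;
import org.keycloak.events.admin.AdminEvent;

// Skeleton of our listener: one method for user (login) events,
// one for admin events. The Kafka wiring is added in the next steps.
public class KafkaEventListenerProvider implements EventListenerProvider {

    @Override
    public void onEvent(Event event) {
        // Called for user events: logins, logouts, errors, ...
    }

    @Override
    public void onEvent(AdminEvent adminEvent, boolean includeRepresentation) {
        // Called for admin console events
    }

    @Override
    public void close() {
        // Release resources (e.g. the Kafka producer) here
    }
}
```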
Now we configure our AMQ Streams client: the bootstrap server and the topics to which our records will be sent.
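A sketch of that configuration, as fields inside our listener class; the broker address and topic names here are assumptions:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

// Fields of the listener class; address and topic names are illustrative
private static final String BOOTSTRAP_SERVERS = "localhost:9092";
private static final String EVENTS_TOPIC = "keycloak-events";
private static final String ADMIN_EVENTS_TOPIC = "keycloak-admin-events";

private final Producer<String, String> producer = createProducer();

private Producer<String, String> createProducer() {
    Properties props = new Properties();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, BOOTSTRAP_SERVERS);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
    return new KafkaProducer<>(props);
}
```

One design note: Keycloak creates a listener instance per session, so in practice you would share a single producer (for example, a static field or one created by the factory) rather than building one per instance.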
Next we create a simple send method and call it from both event handlers. The stringifier methods just build a string representation of the events.
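A sketch of those methods, again inside the listener class; `producer` and the topic constants are the Kafka fields configured above, and the fields included in the string representations are an illustrative choice:

```java
import org.apache.kafka.clients.producer.ProducerRecord;
import org.keycloak.events.Event;
import org.keycloak.events.admin.AdminEvent;

// Fire-and-forget send; the Kafka producer batches records asynchronously
private void send(String topic, String value) {
    producer.send(new ProducerRecord<>(topic, value));
}

@Override
public void onEvent(Event event) {
    send(EVENTS_TOPIC, stringify(event));
}

@Override
public void onEvent(AdminEvent adminEvent, boolean includeRepresentation) {
    send(ADMIN_EVENTS_TOPIC, stringify(adminEvent));
}

// Minimal string representations; a JSON serializer would also work
private String stringify(Event event) {
    return "type=" + event.getType()
        + ", realm=" + event.getRealmId()
        + ", user=" + event.getUserId()
        + ", ip=" + event.getIpAddress();
}

private String stringify(AdminEvent event) {
    return "operation=" + event.getOperationType()
        + ", realm=" + event.getRealmId()
        + ", resourcePath=" + event.getResourcePath();
}
```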
Creating the Provider Factory
When deploying a custom SPI, we need to create a provider factory that will instantiate our EventListenerProvider; it also lets us handle some configuration on creation and destruction.
To achieve this, we implement the EventListenerProviderFactory interface.
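A minimal sketch of the factory, with hypothetical class names and listener ID:

```java
package com.example.audit; // hypothetical package name

import org.keycloak.Config;
import org.keycloak.events.EventListenerProvider;
import org.keycloak.events.EventListenerProviderFactory;
import org.keycloak.models.KeycloakSession;
import org.keycloak.models.KeycloakSessionFactory;

public class KafkaEventListenerProviderFactory implements EventListenerProviderFactory {

    @Override
    public EventListenerProvider create(KeycloakSession session) {
        return new KafkaEventListenerProvider();
    }

    @Override
    public void init(Config.Scope config) {
        // Read SPI configuration here if needed (e.g. bootstrap servers)
    }

    @Override
    public void postInit(KeycloakSessionFactory factory) {
        // No-op for this sketch
    }

    @Override
    public void close() {
        // Clean up shared resources on server shutdown
    }

    @Override
    public String getId() {
        // The name shown in the admin console's Event Listeners select box
        return "kafka-event-listener";
    }
}
```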
And the last step, to make Keycloak see our provider and show it in the web interface, is to create a file in src/main/resources/META-INF/services/ named org.keycloak.events.EventListenerProviderFactory, containing the fully qualified name of our ProviderFactory class.
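That services file contains a single line; with the hypothetical names used in this sketch, it would read:

```
com.example.audit.KafkaEventListenerProviderFactory
```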
Now we can finally deploy our extension. We'll take advantage of RH-SSO's automatic deployment by simply copying our SPI JAR to the deployments folder.
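Sketched as shell commands; the JAR name and the RH-SSO directory are assumptions matching the earlier steps:

```shell
# Build the SPI JAR
mvn clean package

# Copy it into RH-SSO's hot-deployment folder
cp target/kafka-event-listener.jar rh-sso-7.4/standalone/deployments/
```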
In a separate terminal window, extract the AMQ Streams ZIP and start ZooKeeper.
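For example (archive and directory names vary by AMQ Streams version; these are assumptions):

```shell
unzip amq-streams-1.4.0-bin.zip
cd kafka_2.12-2.4.0

# Start ZooKeeper in the foreground
./bin/zookeeper-server-start.sh ./config/zookeeper.properties
```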
Now start the Kafka broker.
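From another terminal, in the same directory:

```shell
# Start the Kafka broker with the default configuration
./bin/kafka-server-start.sh ./config/server.properties
```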
Now we just need to tell RH-SSO to send events to our SPI.
In the Events section of the admin console, if everything is working, we should see our custom listener in the Event Listeners select box.
We then enable both the Login Events and Admin Events settings.
We save this configuration and head over to AMQ Streams to see the magic at work. Try some logins and logouts with various users; to check that the events are being generated, we can consume the messages with the command below.
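A sketch of that consumer command, using the illustrative topic name from earlier:

```shell
# Read the login events from the beginning of the topic
./bin/kafka-console-consumer.sh \
  --bootstrap-server localhost:9092 \
  --topic keycloak-events \
  --from-beginning
```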
And that's it: our events are in a reliable tool that can be used for many things, be it real-time intrusion detection or plain, simple auditing. Feel free to explore.