Use Apache Kafka SASL OAUTHBEARER With Python

This post will teach you how to use the Confluent Python client with the Simple Authentication and Security Layer (SASL) OAUTHBEARER mechanism to produce messages to and consume messages from topics in Apache Kafka, and how to use the service registry to manage the JSON schema.

SASL/OAUTHBEARER is more secure than SASL/PLAIN, where a username and password are configured in the client application. If user credentials are leaked, the blast radius is larger because the user might have other access. With SASL/OAUTHBEARER, service accounts are preferred, which reduces the blast radius of a leak; it is recommended to have one service account per client application. Service account credentials can also be reset quickly compared to user credentials.
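As a minimal sketch of what this looks like in the Confluent Python client, the configuration below uses the OIDC client-credentials flow so that librdkafka fetches and refreshes the token itself. The broker address, token endpoint URL, service account ID, and secret are all placeholder assumptions, not values from this post.

```python
# Sketch: confluent-kafka producer config for SASL/OAUTHBEARER (OIDC).
# Every address, ID, and secret below is a placeholder -- substitute your own.

def oauthbearer_producer_config(bootstrap, token_url, client_id, client_secret):
    """Build a librdkafka config dict for the OIDC client-credentials flow."""
    return {
        "bootstrap.servers": bootstrap,
        "security.protocol": "SASL_SSL",
        "sasl.mechanisms": "OAUTHBEARER",
        # With method=oidc, librdkafka obtains and refreshes the token itself
        "sasl.oauthbearer.method": "oidc",
        "sasl.oauthbearer.token.endpoint.url": token_url,
        "sasl.oauthbearer.client.id": client_id,
        "sasl.oauthbearer.client.secret": client_secret,
    }

conf = oauthbearer_producer_config(
    "broker.example.com:9092",         # placeholder broker
    "https://sso.example.com/token",   # placeholder token endpoint
    "my-service-account",              # placeholder service account ID
    "my-secret",                       # placeholder secret
)

# With confluent-kafka installed and a reachable cluster, you would then do:
# from confluent_kafka import Producer
# producer = Producer(conf)
# producer.produce("my-topic", value=b"hello")
# producer.flush()
```

Keeping the credentials in one helper like this makes it easy to load them from the environment or a secret store rather than hard-coding them.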

Publish Keycloak Events to Kafka With a Custom SPI

In this post, you will build a custom extension, known as a Service Provider Interface (SPI), for Keycloak. This SPI listens to Keycloak events and publishes them to an Apache Kafka cluster, with one topic per event type. The events are consumed by a Quarkus client application, which stores them and exposes API endpoints that can be used for analysis, such as login counts or when client x was created. For the demo, I am limiting the event types to two: Client and Client login, but all event types can be analyzed.
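To make the topic-per-event-type layout concrete, here is a small sketch of how an event type string could be mapped to a Kafka topic name. The `keycloak-event-` prefix and the event-type names are assumptions for illustration, not conventions taken from the post.

```python
# Sketch: derive a Kafka topic name from a Keycloak event type, so that
# each event type lands in its own topic. The "keycloak-event-" prefix
# is a hypothetical naming convention.

def topic_for_event_type(event_type: str) -> str:
    """Map an event type such as "CLIENT_LOGIN" to a topic name."""
    return "keycloak-event-" + event_type.lower().replace("_", "-")

# e.g. for a client login event (assumed event-type name):
print(topic_for_event_type("CLIENT_LOGIN"))  # keycloak-event-client-login
```

A deterministic mapping like this lets the Quarkus consumer subscribe to topics by pattern without any shared registry of topic names.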

Event types: