How to Use PEM Certificates With Apache Kafka

It’s been a long time coming, but it’s finally here: starting with Apache Kafka 2.7, it's possible to use TLS certificates in PEM format with brokers and Java clients. So why does this matter?

PEM is a scheme for encoding X.509 certificates and private keys as Base64 ASCII strings. This makes your certificates easier to handle: you can simply provide keys and certificates to the app as string parameters (e.g. through environment variables). This is especially useful when your applications run in containers, where mounting files into containers makes the deployment pipeline a bit more complex. In this post, I'll show you two ways to use PEM certificates in Kafka.
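As a sketch of what this can look like on the Java client side, here is a helper that builds an SSL configuration from PEM strings. The configuration keys are the PEM-related SSL settings added in Kafka 2.7; the helper method, its parameters, and the environment variable names are illustrative, not part of the Kafka API:

```java
import java.util.Properties;

public class PemSslConfig {

    // Builds client properties that pass PEM material directly as strings,
    // e.g. values read from environment variables. The method name and
    // parameter names are illustrative.
    public static Properties pemClientConfig(String privateKeyPem,
                                             String certChainPem,
                                             String caCertsPem) {
        Properties props = new Properties();
        props.put("security.protocol", "SSL");
        // Tell Kafka to expect PEM strings instead of JKS/PKCS12 keystore files
        props.put("ssl.keystore.type", "PEM");
        props.put("ssl.truststore.type", "PEM");
        // Private key and certificate chain of the client, as PEM strings
        props.put("ssl.keystore.key", privateKeyPem);
        props.put("ssl.keystore.certificate.chain", certChainPem);
        // Trusted CA certificates, also as a PEM string
        props.put("ssl.truststore.certificates", caCertsPem);
        return props;
    }
}
```

In a container, you would call something like `pemClientConfig(System.getenv("KAFKA_SSL_KEY"), ...)`, so the certificate material travels through the environment rather than mounted files.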

How to Use Protobuf With Apache Kafka and Schema Registry

Since Confluent Platform version 5.5, Avro is no longer the only schema in town: Protobuf and JSON Schema are now supported as first-class citizens in the Confluent universe. But before I go on explaining how to use Protobuf with Kafka, let’s answer one often-asked question:
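To give a flavor of what "first-class support" means in practice, here is a minimal sketch of producer properties using Confluent's Protobuf serializer. The broker address and Schema Registry URL are placeholders; `KafkaProtobufSerializer` is the serializer class Confluent ships for Protobuf values:

```java
import java.util.Properties;

public class ProtobufProducerConfig {

    // Sketch of producer properties for Protobuf-serialized values;
    // addresses below are placeholders for your own environment.
    public static Properties producerConfig() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Protobuf serializer registers the schema on first use
        props.put("value.serializer",
                "io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer");
        props.put("schema.registry.url", "http://schema-registry:8081");
        return props;
    }
}
```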

Why Do We Need Schemas?

When applications communicate through a pub-sub system, they exchange messages, and those messages need to be understood and agreed upon by all the participants in the communication. Additionally, you want to detect and prevent changes to the message format that would make messages unreadable for some of the participants.
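A schema is exactly that agreement, written down. As a hypothetical example, a Protobuf schema shared by all producers and consumers of a topic could look like this:

```protobuf
syntax = "proto3";

package example;

// Hypothetical contract for messages on an "orders" topic.
// Field numbers are part of the contract: changing or reusing
// them is the kind of breaking change a schema registry can catch.
message OrderCreated {
  string order_id = 1;
  string customer_id = 2;
  double total = 3;
}
```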