Apache Kafka in Cybersecurity for Threat Intelligence

Apache Kafka has become the de facto standard for processing data in motion across enterprises and industries. Cybersecurity is a key requirement across all of these use cases. Kafka is not just a backbone and source of truth for data; it also monitors, correlates, and proactively acts on events from real-time and batch data sources to detect anomalies and respond to incidents. This blog series explores use cases and architectures for Kafka in the cybersecurity space, including situational awareness, threat intelligence, forensics, air-gapped and zero trust environments, and SIEM/SOAR modernization. This post is part three: Cyber Threat Intelligence.
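As a rough illustration of the "monitor, correlate, act" pattern described above, the sketch below consumes events from a Kafka topic, applies a simple threshold rule (failed logins per source IP within a sliding window), and publishes alerts back to another topic. The topic names, the JSON event shape, and the rule itself are assumptions for illustration, not details from any article in the series.

```python
# Minimal sketch of Kafka-based anomaly detection: count failed logins
# per source IP over a sliding window and emit an alert when a threshold
# is crossed. Topic names and the event schema are hypothetical.
import json
import time
from collections import defaultdict, deque

from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "anomaly-detector",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["security-events"])       # hypothetical input topic

WINDOW_SECONDS = 60
THRESHOLD = 10
failures = defaultdict(deque)                 # source IP -> failure timestamps

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())           # assumed shape: {"type": ..., "source_ip": ...}
    if event.get("type") != "login_failure":
        continue
    now = time.time()
    window = failures[event["source_ip"]]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                      # drop failures outside the window
    if len(window) >= THRESHOLD:
        alert = {"source_ip": event["source_ip"], "failures": len(window)}
        producer.produce("security-alerts", json.dumps(alert))  # hypothetical output topic
        producer.flush()
```

In production this logic would typically live in a stream processor such as Kafka Streams or ksqlDB rather than a hand-rolled consumer loop, but the shape of the pipeline is the same: consume, correlate, act.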

Blog Series: Apache Kafka for Cybersecurity

This blog series explores why security features such as RBAC, encryption, and audit logs are only the foundation of a secure event streaming infrastructure. Learn about use cases, architectures, and reference deployments for Kafka in the cybersecurity space.
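As a minimal illustration of that foundation, the sketch below configures a Kafka consumer with TLS encryption and SASL authentication, the transport-level controls that RBAC and audit logging build on. The broker address, credentials, and topic are placeholders, not settings from any article in this series.

```python
# Minimal sketch: a Kafka client secured with TLS and SASL.
# All connection details below are placeholders.
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker.example.com:9093",  # hypothetical broker
    "security.protocol": "SASL_SSL",                 # encrypt traffic in transit
    "sasl.mechanisms": "SCRAM-SHA-512",              # authenticate the client
    "sasl.username": "security-analytics",           # placeholder principal
    "sasl.password": "changeit",                     # load from a secret store in practice
    "ssl.ca.location": "/etc/kafka/ca.pem",          # CA that signed the broker cert
    "group.id": "threat-intel",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["security-events"])              # hypothetical topic
```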

Navigating Through Logs for Information Disclosure Requests

In a world of compliance and disclosure requests, the ability to investigate raw log files whilst shutting out the noise not only saves time in your process but also reduces the risk of mistakes. The ability to analyse large volumes of log files, whether in the cloud or hidden away in on-prem archives, makes a great difference to how your tech team operates.
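As a sketch of this kind of noise filtering, the snippet below scans a directory of plain and gzipped log files and keeps only the lines relevant to a disclosure request. The archive path, the subject identifier, and the noise markers are assumptions for illustration.

```python
# Minimal sketch: scan plain and gzipped log files for lines matching a
# disclosure-request subject while filtering out noisy, irrelevant lines.
# Paths, the subject string, and the noise patterns are placeholders.
import gzip
import pathlib

SUBJECT = "user12345"                  # hypothetical subject of the request
NOISE = ("healthcheck", "DEBUG")       # hypothetical noise markers to skip

def open_log(path: pathlib.Path):
    """Open a log file transparently, whether gzipped or plain text."""
    if path.suffix == ".gz":
        return gzip.open(path, "rt", errors="replace")
    return path.open("rt", errors="replace")

matches = []
for path in pathlib.Path("/var/log/archive").rglob("*.log*"):  # placeholder archive
    with open_log(path) as handle:
        for line in handle:
            if SUBJECT in line and not any(marker in line for marker in NOISE):
                matches.append((path.name, line.rstrip()))

for filename, line in matches:
    print(f"{filename}: {line}")
```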

Take higher education as an example. Every year, new students join a university, and for IT teams this means new logs. It also means new devices on the network; in Europe, this includes Eduroam, a third-party roaming network whose logs may not be as easily accessible. On average, a student brings a mobile phone and a laptop, but in an ever-growing IoT world, students can be expected to bring tablets and other smart devices too. This increases each student's footprint on any SIEM solution.

Export Kubernetes Logs to Azure Log Analytics With Fluent Bit

Every container you run in Kubernetes generates log data. No one has time to regularly check individual container logs for issues, so in production environments these logs are typically exported to an aggregator for automated analysis.

If you're using Azure, Log Analytics may be your log aggregator of choice, so you need a way to export your container logs into Log Analytics. If you are using AKS, you can deploy the Azure Monitor solution, which does this for you. However, if you are running your own cluster, or even using another cloud provider and still want to use Log Analytics, it's not quite so simple. This is where Fluent Bit can help.
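Fluent Bit's Azure output plugin handles the delivery for you, including batching and retries. As an illustration of what it is doing underneath, the sketch below posts a log record to the Log Analytics HTTP Data Collector API in Python; the workspace ID, shared key, and log record are placeholders.

```python
# Illustration of the Azure Log Analytics HTTP Data Collector API that
# Fluent Bit's Azure output drives for you. Workspace ID, shared key,
# and the sample record are placeholders.
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder workspace ID
SHARED_KEY = "c2VjcmV0LWtleQ=="                        # placeholder; use the workspace primary key
LOG_TYPE = "ContainerLogs"                             # appears as ContainerLogs_CL in the workspace

def post_logs(records: list[dict]) -> int:
    """Sign and send a batch of records to the Data Collector API."""
    body = json.dumps(records)
    date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    string_to_sign = f"POST\n{len(body)}\napplication/json\nx-ms-date:{date}\n/api/logs"
    signature = base64.b64encode(
        hmac.new(base64.b64decode(SHARED_KEY),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()
    response = requests.post(
        f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
            "Log-Type": LOG_TYPE,
            "x-ms-date": date,
        },
    )
    return response.status_code

print(post_logs([{"pod": "web-0", "message": "container started"}]))
```

In a real cluster you would not write this yourself: Fluent Bit runs as a DaemonSet, tails the container log files on each node, and ships them to this same endpoint via its Azure output configuration.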