Apache Kafka in the Gaming Industry: Use Cases + Architectures

This blog post explores how event streaming with Apache Kafka provides a scalable, reliable, and efficient infrastructure to make gamers happy and gaming companies successful. Various use cases and architectures in the gaming industry are discussed, including online and mobile games, betting, gambling, and video streaming.

Apache Kafka Is NOT Real-Time!

Is Apache Kafka really real-time? This is a question I get asked every week. Real-time is a great marketing term to describe how businesses can add value by processing data as fast as possible. Most software and product vendors use it these days, including messaging frameworks (e.g., IBM MQ, RabbitMQ), event streaming platforms (e.g., Apache Kafka, Confluent), data warehouse/analytics vendors (e.g., Spark, Snowflake, Elasticsearch), and security/SIEM products (e.g., Splunk). This blog post explores what "real-time" really means and how Apache Kafka and other messaging frameworks accomplish the mission of providing real-time data processing.

Definition: What Is Real-Time?

The term "real-time" is not easily defined. However, it is essential to define it before starting any discussion of the topic.
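One informal way to make the definition concrete is to bucket a system's end-to-end latency into tiers. The sketch below does exactly that; the tier names follow common usage, but the millisecond thresholds are illustrative assumptions, not a standard. Note that "hard real-time" properly refers to systems with guaranteed deadlines (e.g., embedded safety controllers), not merely low average latency:

```python
def latency_tier(latency_ms: float) -> str:
    """Classify end-to-end processing latency into an informal tier.

    Thresholds are illustrative assumptions for discussion purposes only.
    True hard real-time additionally requires *guaranteed* deadlines,
    which general-purpose event streaming platforms do not provide.
    """
    if latency_ms < 1:
        return "hard real-time"   # deterministic deadlines: embedded/safety systems
    if latency_ms < 100:
        return "near real-time"   # typical event streaming territory
    if latency_ms < 10_000:
        return "soft real-time"   # live dashboards, alerting
    return "batch"                # hourly or nightly processing
```

Under this (assumed) breakdown, event streaming platforms such as Kafka land in the "near real-time" tier: fast enough for most business use cases, but not hard real-time in the engineering sense.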

SQL Provision and Azure SQL Database: Creating Local Development and Test Databases

Your organization, like many others, is probably moving towards cloud-hosted database platforms such as Azure SQL Database. However, even if some of your production databases run on Azure SQL Database, you still need to provide development and test copies of each database, whether they are hosted on a local physical or virtual machine, on a virtual machine in the Azure cloud, or in Azure SQL Database itself. The problem with hosting lots of copies of a database in Azure SQL Database is that it quickly becomes cost-prohibitive, it places more restrictions on developers than they often like, and, sometimes, network connections and bandwidth cause problems. This means you'll sometimes need to move a copy of the database from Azure SQL Database to a local system for development and test work.

This article will describe how you can create a local copy of the database from a BACPAC of your Azure SQL Database production database, and then use SQL Provision to deploy as many development and test copies (clones) of that database as you need. If your database contains any non-public data, you can also use the Data Masker component of SQL Provision to ensure this data is obfuscated before deploying the clones.
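The BACPAC restore step can be driven by the sqlpackage CLI. As a minimal sketch, the snippet below assembles an `sqlpackage /Action:Import` command for restoring a BACPAC to a local SQL Server instance; the server, database, and file names are placeholder assumptions, and the actual import line is left commented out since it requires a running SQL Server:

```python
import subprocess

def build_import_command(bacpac_path: str, server: str, database: str) -> list[str]:
    """Assemble an `sqlpackage /Action:Import` command for a local restore.

    All argument values here are hypothetical placeholders; substitute
    your own BACPAC file, server name, and target database name.
    """
    return [
        "sqlpackage",
        "/Action:Import",
        f"/SourceFile:{bacpac_path}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
    ]

cmd = build_import_command("Production.bacpac", "localhost", "Production_Dev")
# subprocess.run(cmd, check=True)  # uncomment to run the import for real
```

Once this local copy exists, SQL Provision can image it and hand out lightweight clones to each developer, rather than every developer restoring the BACPAC individually.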