Apache Kafka for the Connected World: Vehicles, Factories, Cities

The digital transformation enables a connected world. People, vehicles, factories, cities, digital services, and other "things" communicate with each other in real time to provide a safe environment, efficient processes, and a fantastic user experience. This scenario only works with real-time data processing at scale. This blog post shares a presentation that explains why Apache Kafka plays a key role in these industries and use cases, and in connecting the different stakeholders.

Software is Changing and Connecting the World

Event streaming with Apache Kafka plays a major role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with a variety of legacy and modern data sources and sinks.
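As a concrete illustration, an event-streaming pipeline for connected vehicles might serialize telemetry events as JSON and publish them to a Kafka topic. The sketch below shows the event-construction and serialization side; the topic name, event fields, and broker address are illustrative assumptions, and the producer code assumes the kafka-python client library is available.

```python
import json


def telemetry_event(vehicle_id, lat, lon, speed_kmh):
    """Build a vehicle-telemetry event as a JSON-serializable dict.

    All field names here are illustrative, not a fixed schema.
    """
    return {
        "vehicle_id": vehicle_id,
        "lat": lat,
        "lon": lon,
        "speed_kmh": speed_kmh,
    }


def serialize(event):
    """Encode an event as UTF-8 JSON bytes, ready to be a Kafka message value."""
    return json.dumps(event).encode("utf-8")


if __name__ == "__main__":
    # Assumes the kafka-python package is installed and a broker is
    # reachable at localhost:9092; "vehicle-telemetry" is a made-up topic.
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    event = telemetry_event("bus-42", 45.5231, -122.6765, 31.5)
    producer.send("vehicle-telemetry", serialize(event))
    producer.flush()
```

Keeping serialization separate from the producer call makes the event format easy to test without a running broker, and the same bytes can later be consumed by stream processors or sink connectors downstream.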

The Growing Importance of an Open-Data Commons for Mobility

Public transit data was one of the first data sets in what’s known today as “smart cities” or the “Internet of Things” (IoT). The simple reason was that publishing a transit schedule in machine-readable format, or providing real-time tracking over a cellular data connection, is pretty cheap, relative to the cost of a subway train or bus.

Transit quickly became one of the best examples of open data. Civic leaders at transit agencies, like Portland’s TriMet, found that if they packaged data in a standard format, developers could use it to make better user experiences than transit agencies could offer. The GTFS (General Transit Feed Specification) standard for transit schedules was originally created by TriMet and Google to solve the problem of sharing data between agencies and developers. Now, many agencies worldwide use the same data standard.
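Part of what made GTFS so easy to adopt is its format: a feed is just a ZIP of plain CSV files such as stops.txt, routes.txt, and trips.txt. The sketch below parses a tiny stops.txt excerpt with the standard-library csv module; the stop IDs, names, and coordinates are made up for illustration, but the column names (stop_id, stop_name, stop_lat, stop_lon) come from the GTFS specification.

```python
import csv
import io

# A tiny stops.txt excerpt in GTFS form (header row + comma-separated records).
# The values are fabricated for this example.
STOPS_TXT = """\
stop_id,stop_name,stop_lat,stop_lon
1001,Main St & 1st Ave,45.5231,-122.6765
1002,Main St & 5th Ave,45.5240,-122.6812
"""


def parse_stops(text):
    """Parse GTFS stops.txt content into a list of dicts with numeric coordinates."""
    stops = []
    for row in csv.DictReader(io.StringIO(text)):
        row["stop_lat"] = float(row["stop_lat"])
        row["stop_lon"] = float(row["stop_lon"])
        stops.append(row)
    return stops
```

Because every agency publishes the same column names, a developer can write this parser once and point it at feeds from Portland, Berlin, or Tokyo without changes.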