Apache Kafka Essentials

Dive into Apache Kafka: Readers will review its history and fundamental components — Pub/Sub, Kafka Connect, and Kafka Streams. Key concepts in each area are supplemented with detailed code examples that demonstrate producing and consuming data, using connectors for straightforward data streaming and transformation, performing common operations on KStreams, and more.

Event Stream Processing Essentials

With an increasing number of connected, distributed devices, there has been a gradual shift in how data is processed and analyzed. This shift is also driven by the growth of emerging technologies — such as the Internet of Things (IoT), microservices, and event-driven applications — that push development toward real-time analytics. This Refcard dives into how event stream processing represents this evolution by enabling continuous data analysis in the modern technology landscape.

The Benefits of MFT Modernization: Moving Beyond Siloed File Transfers

Organizations increasingly rely on data to serve customers, drive decision-making, and power business processes. This data is typically stored in files across disparate applications, databases, and local machines, and it must be moved across the organization quickly to respond to real-time customer demands.

Businesses have long used multiple file transfer protocols and mechanisms to move data for different purposes. They might employ FTP for file transfer between computers on a network, protocols such as AS2 for electronic data interchange (EDI), and APIs to exchange data between applications. Typically, they have had no centralized or integrated processes for these different types of data exchange.