Apache Kafka Patterns and Anti-Patterns

Apache Kafka offers the operational simplicity of data engineers' dreams. A message broker that lets clients publish and read streams of data, Kafka has an ecosystem of open-source components that, when combined, help store, process, and integrate data streams with other parts of your system in a secure, reliable, and scalable manner. This Refcard dives into select patterns and anti-patterns spanning the Kafka Client APIs, Kafka Connect, and Kafka Streams, covering topics such as reliable messaging, scalability, error handling, and more.
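
The Refcard's own examples are not reproduced here, but as a minimal, hedged sketch of the publish side of the Client APIs (the broker address and topic name below are hypothetical):

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MinimalProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // hypothetical broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all"); // wait for all in-sync replicas before acknowledging

        // try-with-resources flushes and closes the producer on exit
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("events", "key-1", "hello, kafka"));
        }
    }
}

Setting acks=all trades some latency for stronger delivery guarantees, one of the reliable-messaging trade-offs such patterns weigh.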

Debugging Collections, Streams, and Watch Renderers

In the last two ducklings, I finished the extensive discussion on breakpoints and switched my focus to the watch area. In it, we have several amazing and lesser-known tools that let us build insight into our running application. Being able to tell at a glance if something works correctly is crucial for many applications.

This is especially important for collections and arrays, which can hold thousands or even millions of elements. Debugging them without some basic tools is very difficult.

Stream Landing Kafka Data to Object Storage Using Terraform

You can easily archive data to IBM Cloud Object Storage for long-term storage or to gain insight by leveraging interactive queries or big data analytics. You can achieve this through the Event Streams UI, where topics can be selected and linked to Cloud Object Storage buckets, with data automatically and securely streamed using the fully managed IBM Cloud SQL Query service. All data is stored in Parquet format, making it easy to manage and process. Check out "Streaming to Cloud Object Storage by using SQL Query" for more info.

In this post, you will set up the Cloud Object Storage stream landing using Terraform.

Event Stream Processing Essentials

With an increasing number of connected, distributed devices, there has been a gradual shift in how data is processed and analyzed. The trend is also driven by the growth of emerging technologies, such as the Internet of Things (IoT), microservices, and event-driven applications, which influence the development of real-time analytics. This Refcard dives into how event stream processing represents this evolution by allowing continuous data analysis in the modern technology landscape.

Java Streams: An Implementation Approach

In this tutorial, we will learn what Streams are in Java and develop an implementation approach, comparing the Stream API to SQL statements.
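
As an illustrative sketch of that analogy (the Product type and its fields below are hypothetical, not from the tutorial):

import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class SqlLikeStreams {
    public static void main(String[] args) {
        List<Product> products = Arrays.asList(
                new Product("Laptop", 1200.0),
                new Product("Mouse", 25.0),
                new Product("Monitor", 300.0));

        // SQL: SELECT name FROM products WHERE price > 100 ORDER BY name
        List<String> names = products.stream()
                .filter(p -> p.price > 100)     // WHERE
                .map(p -> p.name)               // SELECT
                .sorted()                       // ORDER BY
                .collect(Collectors.toList());

        System.out.println(names); // [Laptop, Monitor]
    }

    static class Product {
        final String name;
        final double price;
        Product(String name, double price) { this.name = name; this.price = price; }
    }
}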

Audience

All Java developers who want to learn the Streams API, one of the key new features of Java 8.

Spring Boot: Reading Resources

Learn how to read files from the resources folder in a Spring Boot application.

Reading files from a Spring Boot application is possible through the Java 8 NIO API. This article demonstrates how to read files from the resources folder in a Spring Boot application, whether it is running as a standalone application or inside a Docker container.
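
A minimal sketch of one common approach, using Spring's ClassPathResource; the resource path below is hypothetical. Reading via an InputStream, rather than resolving a java.io.File, also works when the application runs from a fat JAR or inside a Docker container, where resources are not separate files on disk:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.stream.Collectors;

import org.springframework.core.io.ClassPathResource;

public class ResourceReader {

    // Reads a classpath resource (e.g., src/main/resources/data/config.txt) as text.
    public String readResource(String path) throws IOException {
        ClassPathResource resource = new ClassPathResource(path);
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(resource.getInputStream(), StandardCharsets.UTF_8))) {
            return reader.lines().collect(Collectors.joining("\n"));
        }
    }
}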

You may also like: Working With Resources in Spring

26 Items for Dissecting Java Local Variable Type Inference (Var Type)

This article is also part of my book Java Coding Problems.

The Java Local Variable Type Inference (LVTI), or, in short, the var type (the identifier var is not a keyword but a reserved type name), was added in Java 10 via JEP 286: Local-Variable Type Inference. As a 100 percent compile-time feature, it doesn't affect bytecode, runtime, or performance. The compiler inspects the right-hand side of the declaration and, if there is an initializer, simply uses that type to replace var. Additionally, it is useful for reducing verbosity, redundancy, and boilerplate code, and it is meant to speed up the ceremony involved in writing code. For example, it is very handy to write var evenAndOdd = ... instead of Map<Boolean, List<Integer>> evenAndOdd = .... Depending on the use case, it has a trade-off in code readability that is covered in the first item below.
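
To make the evenAndOdd example concrete, here is a hedged sketch; the partitioning pipeline is an assumption, chosen only because it produces the Map<Boolean, List<Integer>> type mentioned above:

import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class VarDemo {
    public static void main(String[] args) {
        // Without var: the full generic type must be spelled out.
        Map<Boolean, List<Integer>> evenAndOddExplicit = IntStream.rangeClosed(1, 10)
                .boxed()
                .collect(Collectors.partitioningBy(i -> i % 2 == 0));

        // With var: the compiler infers Map<Boolean, List<Integer>>
        // from the initializer on the right-hand side. Requires Java 10+.
        var evenAndOdd = IntStream.rangeClosed(1, 10)
                .boxed()
                .collect(Collectors.partitioningBy(i -> i % 2 == 0));

        System.out.println(evenAndOdd.get(true));  // [2, 4, 6, 8, 10]
        System.out.println(evenAndOdd.get(false)); // [1, 3, 5, 7, 9]
    }
}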

Become a Master of Java Streams, Part 2: Intermediate Operations

Want to become a Java Streams Master?

Just like a magic wand, an Intermediate operation transforms a Stream into another Stream. These operations can be combined in endless ways to perform anything from simple to highly complex tasks in a readable and efficient manner.

This article is the second out of five and is complemented by a GitHub repository containing instructions and exercises for each unit.
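
As a small, illustrative sketch (not taken from the article or its repository), here is a chain of the intermediate operations filter, map, and sorted; note that nothing executes until the terminal operation runs:

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class IntermediateOps {
    public static void main(String[] args) {
        // Each intermediate operation returns a new Stream; the pipeline is
        // lazy and only runs when the terminal operation (collect) is invoked.
        List<String> result = Stream.of("Monkey", "Lion", "Giraffe", "Lemur")
                .filter(s -> s.startsWith("L")) // keep elements matching a predicate
                .map(String::toUpperCase)       // transform each element
                .sorted()                       // sort the remaining elements
                .collect(Collectors.toList());  // terminal operation

        System.out.println(result); // [LEMUR, LION]
    }
}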

Become a Master of Java Streams — Part 1: Creating Streams

Check out the article below to become a master of Java Streams!

Declarative code (e.g. functional composition with Streams) provides superior code metrics in many cases. Code your way through this hands-on-lab article series and mature into a better Java programmer by becoming a Master of Java Streams.

The whole idea of Streams is to represent a pipeline through which data flows, with the pipeline's functions operating on the data. This way, functional-style operations on streams of elements can be expressed. This article is the first out of five, where you will learn firsthand how to become a Master of Streams. We start with basic stream examples and progress to more complex tasks until you know how to connect standard Java Streams to databases in the Cloud.
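
A few hedged examples of the kind of basic stream creation the series starts with (the values are illustrative, not the article's own):

import java.util.Arrays;
import java.util.List;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class CreatingStreams {
    public static void main(String[] args) {
        // From explicit values
        Stream<String> fromValues = Stream.of("Monkey", "Lion", "Giraffe");

        // From an existing collection
        List<String> animals = Arrays.asList("Monkey", "Lion");
        Stream<String> fromCollection = animals.stream();

        // From a numeric range
        IntStream fromRange = IntStream.rangeClosed(1, 5);

        System.out.println(fromValues.count());      // 3
        fromCollection.forEach(System.out::println); // Monkey, Lion
        System.out.println(fromRange.sum());         // 15
    }
}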

The Best of Node and Express [Articles and Tutorials]

All aboard!

Built on top of Google Chrome's V8 engine, Node.js and its companion framework, Express.js, have come to dominate much of backend development, especially when JavaScript is your language of choice on the server side. In this edition of "Best of DZone," we're going to take a look at the two to better understand key pieces of functionality and how they work in tandem to create applications.

Before we begin, we'd like to thank those who were a part of this article. DZone has been and continues to be a community powered by contributors like you who are eager and passionate to share what they know with the rest of the world.

Should I Parallelize Java 8 Streams?


In Java 8, the Streams API makes it easy to iterate over collections, and it's easy to parallelize a stream by calling the parallelStream() method. But should we be using parallelStream() wherever we can? What are the considerations?

You may also like: Think Twice Before Using Java 8 Parallel Streams

Look at the following ParallelStreamTester class, which generates collections of different sizes for the purpose of testing parallel stream performance against a sequential stream.
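
The original class is not reproduced here; below is a hedged sketch of what such a tester might look like. The sizes, the summing workload, and the System.nanoTime() timing are assumptions, and a serious benchmark should use a harness such as JMH to control for JIT warm-up:

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.LongStream;

public class ParallelStreamTester {

    // Builds a list of the given size so differently sized inputs can be compared.
    static List<Long> createList(int size) {
        return LongStream.rangeClosed(1, size)
                .boxed()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        for (int size : new int[]{1_000, 100_000, 10_000_000}) {
            List<Long> numbers = createList(size);

            long start = System.nanoTime();
            long seqSum = numbers.stream().mapToLong(Long::longValue).sum();
            long seqNanos = System.nanoTime() - start;

            start = System.nanoTime();
            long parSum = numbers.parallelStream().mapToLong(Long::longValue).sum();
            long parNanos = System.nanoTime() - start;

            if (seqSum != parSum) {
                throw new IllegalStateException("sequential and parallel sums differ");
            }
            System.out.printf("size=%,d  sequential=%,d ns  parallel=%,d ns%n",
                    size, seqNanos, parNanos);
        }
    }
}

For a cheap operation like summing, parallelism typically only pays off at larger sizes, since splitting the workload and merging the results carry their own overhead.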

All Things Java 8 [Tutorials]

No matter what version of the JDK we are on, Java 8 is not going anywhere.

Java 8 introduced a new era of Java. Everything from lambda expressions and functional programming to Streams and collections — DZone was there to document it all.

So whether you're migrating over to Java 9 or Java 11, or maybe even Java 13, Java 8 concepts and features are still very much alive in the JDK. And understanding these core concepts can help tremendously when it's time to move beyond Java 8.

The Java Platform Module System

Although Java 9 is not the latest JDK release (to be more specific, it was released back in 2017), it was the biggest update in the history of the JDK. Despite that, most Java programmers hardly mention its most significant feature — the Java Platform Module System.

This is primarily because most business applications still use Java 8. For now, Java 8 has demonstrated its stability, and newer releases need time to spread across the IT world.
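
For readers who haven't seen the feature, here is a minimal sketch of a module descriptor; the module and package names are hypothetical:

// module-info.java, placed at the root of the module's source tree.
module com.example.inventory {
    requires java.sql;                 // declare a dependency on a platform module
    exports com.example.inventory.api; // expose only this package to other modules
}

Everything not exported stays strongly encapsulated, which is the module system's central promise.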

Java Streams Overview, Part II

In my previous article, I wrote about the fundamentals of streams in Java 8. Now, let's augment our skills with some additional information about streams, such as how we can chain them and how we can use them to access files.

Chaining Streams

When working with streams, you will often chain them together.
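
As an illustrative sketch, assuming the java.util.stream API from Part I, here is a chained pipeline that also accesses a file; the file name is hypothetical:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ChainedFileStream {
    public static void main(String[] args) throws IOException {
        // Files.lines returns a Stream<String> backed by the file, so the
        // try-with-resources block closes the underlying handle for us.
        try (Stream<String> lines = Files.lines(Paths.get("data.txt"))) {
            List<String> result = lines
                    .map(String::trim)              // chained operation 1
                    .filter(l -> !l.isEmpty())      // chained operation 2
                    .collect(Collectors.toList());  // terminal operation
            result.forEach(System.out::println);
        }
    }
}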

Java 8 Streaming (groupBy Example)

In this article, I would like to share a few scenarios showing how Java 8 Streams and related functions can be useful for reading a file and applying aggregate functions, similar to what we generally do in our day-to-day SQL queries on the database side.

So, I am taking an example of an employee CSV file, which is shown below:
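
The sample CSV follows in the original article; as a hedged sketch of the technique itself, assume a hypothetical layout of name,department,salary. The groupingBy collector then plays the role of SQL's GROUP BY:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class EmployeeGroupBy {
    public static void main(String[] args) throws IOException {
        // File name and column layout are hypothetical.
        try (Stream<String> lines = Files.lines(Paths.get("employees.csv"))) {
            // Equivalent of: SELECT department, AVG(salary) FROM employees GROUP BY department
            Map<String, Double> avgSalaryByDept = lines
                    .skip(1)                        // skip the header row
                    .map(line -> line.split(","))
                    .collect(Collectors.groupingBy(
                            cols -> cols[1],        // department column
                            Collectors.averagingDouble(cols -> Double.parseDouble(cols[2]))));

            avgSalaryByDept.forEach((dept, avg) ->
                    System.out.println(dept + " -> " + avg));
        }
    }
}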