Error Handling for Apache Beam and BigQuery (Java SDK)

Design the Pipeline

Let’s assume a simple scenario: events are streaming into Kafka, and we want our pipeline to consume them, apply some transformations, and write the results to BigQuery tables so the data is available for analytics.

The BigQuery table can be created before the job starts, or Beam itself can create it when the job runs.
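The pipeline described above can be sketched roughly as follows. This is a minimal illustration, not the article's actual code: the broker address, topic name, table spec, and the single-column `payload` schema are all assumptions made for the example. Letting Beam create the table corresponds to `CreateDisposition.CREATE_IF_NEEDED`, which requires supplying a schema.

```java
import java.util.Collections;

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaToBigQuery {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Assumed schema: a single STRING column holding the raw event payload.
    TableSchema schema = new TableSchema().setFields(Collections.singletonList(
        new TableFieldSchema().setName("payload").setType("STRING")));

    pipeline
        // Consume events from Kafka (broker and topic names are placeholders).
        .apply("ReadFromKafka", KafkaIO.<String, String>read()
            .withBootstrapServers("kafka:9092")
            .withTopic("events")
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())
        // Transform each Kafka record into a BigQuery TableRow.
        .apply("ToTableRow", MapElements
            .into(TypeDescriptor.of(TableRow.class))
            .via((KV<String, String> kv) -> new TableRow().set("payload", kv.getValue())))
        // Write to BigQuery; CREATE_IF_NEEDED lets Beam create the table itself.
        .apply("WriteToBigQuery", BigQueryIO.writeTableRows()
            .to("my-project:my_dataset.events")
            .withSchema(schema)
            .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
            .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

    pipeline.run();
  }
}
```

If the table is instead created ahead of time (for example, via Terraform or the `bq` CLI), `CreateDisposition.CREATE_NEVER` can be used and the schema argument becomes unnecessary.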