Stream Landing Kafka Data to Object Storage using Terraform

You can easily archive Event Streams (Kafka) data to IBM Cloud Object Storage for long-term storage or to gain insight through interactive queries or big data analytics. You can achieve this through the Event Streams UI, where topics can be selected and linked to Cloud Object Storage buckets, with data automatically and securely streamed using the fully managed IBM Cloud SQL Query service. All data is stored in Parquet format, making it easy to manage and process. Check out "Streaming to Cloud Object Storage by using SQL Query" for more information.

In this post, you will set up the Cloud Object Storage stream landing using Terraform.
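As a rough sketch of what the Terraform side can look like, the following provisions the three services involved: an Object Storage instance, a bucket, and an SQL Query instance. Resource types follow the IBM Cloud Terraform provider; all names, regions, and plans below are illustrative, and the stream-landing link itself is configured on top of these resources as described in the post.

```hcl
# Sketch only -- verify attribute names against your version of the
# IBM Cloud Terraform provider before use.

resource "ibm_resource_instance" "cos" {
  name     = "stream-landing-cos"
  service  = "cloud-object-storage"
  plan     = "standard"
  location = "global"
}

resource "ibm_cos_bucket" "archive" {
  bucket_name          = "kafka-archive-bucket"
  resource_instance_id = ibm_resource_instance.cos.id
  region_location      = "us-south"
  storage_class        = "standard"
}

resource "ibm_resource_instance" "sql_query" {
  name     = "stream-landing-sql"
  service  = "sql-query"
  plan     = "standard"
  location = "us-south"
}
```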

IBM App Connect Enterprise

Introduction

WebSphere Adapter for SAP Software provides multiple ways to interact with applications and data on SAP servers. The adapter uses the SAP Java™ Connector (SAP JCo) API to communicate with SAP applications, supporting inbound and outbound interactions. The adapter has a "Select assured once-only delivery" property; when it is enabled, the adapter uses a data source to persist the event data received from the SAP server. Event recovery is provided to track and recover events in case a problem occurs when the adapter attempts to deliver an event to the endpoint. Any user who wants to use this feature must set up a remote queue manager for message persistence and recovery to work.

In this article, we describe the configuration and steps required to run an SAP Inbound Adapter-based message flow in IBM App Connect Enterprise running in IBM Cloud Pak for Integration (CP4I). This particular scenario focuses on the SAP Inbound Adapter set with the Assured Once delivery option and the Integration Server configured to use the remote default queue manager option.

Deploy an ASP.NET Core Application on IBM Cloud Code Engine

While there are many use cases to explore, in this blog we are going to explore how you can deploy an ASP.NET Core application from scratch on IBM Cloud Code Engine. I would also suggest reading this article to understand when to use an application versus a job.

What You’ll Learn in This Tutorial

Upon completion of this tutorial, you will know how to:

IBM Cloud Satellite in the Chennai, India DC

Introduction

Recently, IBM launched IBM Cloud Satellite. It sure looks like a game-changer in the multi-cloud world. While Satellite is fairly new, this is my attempt at a very basic setup in the IBM Cloud Chennai DC.

With IBM Cloud Satellite, you use your own compute infrastructure that is in your on-premises data center, other cloud providers, or edge networks to create a Satellite location. Then, you use the capabilities of Satellite to run IBM Cloud services on your infrastructure, and consistently deploy, manage, and control your app workloads.

3 Lessons DevOps Can Learn From the 5 Biggest Outages of Q2 2020

‘Learn from the mistakes of others. You can't live long enough to make them all yourself’ – Eleanor Roosevelt.

Nobody is immune from outages, but it’s better to learn from others’ mistakes than from your own. The second quarter of 2020 was marked by several serious outages at prominent services, including IBM Cloud, GitHub, Slack, Zoom, and even T-Mobile (Source: StatusGator Report). I’m sure you noticed these outages like our team did. I decided to share the lessons we learned from this downtime, hoping we can all grow from it.

Extend VPC Instances with Cloud Functions, Activity Tracker with LogDNA, and Schematics

Reserving a floating IP for one or two VSIs sounds easy. But how about for tens of VSIs provisioned in your Virtual Private Cloud (VPC)? Ever thought of auto-assigning a floating IP on the fly whenever a new VSI is provisioned in your VPC?

In this post, you will use the IBM Cloud Activity Tracker with LogDNA service to track how users and applications interact with IBM Cloud Virtual Private Cloud (VPC). You will then create a view and an alert on Activity Tracker with LogDNA, filtering VSI-creation logs. The logs are then passed to an IBM Cloud Functions Python action as JSON. The action reserves a floating IP for the newly provisioned VSI (instance) using the instance ID in the passed JSON.
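To make the flow concrete, here is a minimal sketch of such a Python action. The payload shape is an assumption for illustration (check the actual alert body your LogDNA alert posts), and the floating IP call itself is left as a comment because it requires real VPC credentials:

```python
# Sketch of a Cloud Functions Python action handler.
# The payload shape below (a 'line' field holding the raw Activity
# Tracker event as JSON) is illustrative -- inspect your own alert body.
import json


def extract_instance_id(params):
    """Pull the VSI instance ID out of the alert payload.

    Assumes the alert body carries the raw log line under 'line',
    with the provisioned instance as the event's target resource.
    """
    event = json.loads(params["line"])
    return event["target"]["id"]


def main(params):
    instance_id = extract_instance_id(params)
    # Here the action would call the VPC API (e.g., via the ibm-vpc
    # Python SDK) to create a floating IP targeting the instance's
    # primary network interface -- omitted, as it needs credentials.
    return {"instance_id": instance_id}
```

The pure parsing step is separated from the API call so it can be tested without cloud access.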

How to Choose a Public Cloud Platform — the Right Questions to Ask

First of all, who am I? I am an engineer, a software architect (not an ivory tower one), a tool builder, a thinker, and a huge fan of common sense. And I wanted answers to a few questions that I think many IT and business leaders, technology adoption decision-makers, and engineers have.

Unfortunately, many of us don't know what the right questions are to start with. Many of us, unfortunately, are influenced in one way or another by million-dollar (if not billion-dollar) marketing stunts that try to sell "basically bullshit" low-calorie fluff in nice-looking packages.

Running Docker Containers on Cloud Foundry

If you are an experienced developer already familiar with Docker, here's a quick way to deploy your containers into the cloud without having to worry about setting up and managing a Kubernetes cluster. Also important: it comes for free using Cloud Foundry.

Let's start by creating a simple Node.js application locally using 'npm init'; give your app a name, e.g. 'tinyapp', and use 'server.js' as the entry point.
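The deployment itself then boils down to a handful of commands. This is a sketch: the image name and registry user are placeholders, and `cf push --docker-image` assumes Docker support is enabled in your Cloud Foundry environment:

```shell
# Scaffold the app locally
npm init -y          # creates package.json for 'tinyapp'
# ... add server.js, write a Dockerfile, then build and publish the image:
docker build -t <your-registry-user>/tinyapp .
docker push <your-registry-user>/tinyapp
# Deploy the container to Cloud Foundry
cf push tinyapp --docker-image <your-registry-user>/tinyapp
```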

Automated Deployment of a Microservice to Kubernetes on IBM Cloud

In this blog post, I want to explain how to automate the deployment of a microservice using a delivery pipeline on IBM Cloud.

Maybe you already know that Niklas, Harald, and I created the project called Cloud Native Starter. That project includes a hands-on workshop called “Build a Java Microservice and deploy the Microservice to Kubernetes on IBM Cloud.” In Lab 4, you have to deploy the Authors microservice to Kubernetes on IBM Cloud. Workshop time is often limited, which is why we want to reduce the manual effort for students to a minimum. Therefore, we used IBM Cloud Continuous Delivery, and I created a repeatable setup with minimal human interaction. The delivery pipeline contains sequences of stages, which retrieve inputs and run jobs such as builds and deployments.

My Passwordless App on IBM Cloud With FIDO2

Passwords are bad. Often, they are the cause of data breaches. They are too short, too easy to guess, shared across multiple sites, and stored in too many places. That has to change. FIDO2, with the underlying WebAuthn and CTAP2 specifications, seems to have the ability to move us to a passwordless world.

I tried to make an existing application on IBM Cloud support passwordless login. Here is what I did and how I succeeded.
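At its core, the browser side of such a login uses the standard WebAuthn API. Here is a minimal registration sketch (browser-only code; the challenge, user ID, and names below are dummies that a real server would supply):

```javascript
// Sketch: WebAuthn credential registration in the browser.
// In a real app, challenge and user.id come from the server.
const credential = await navigator.credentials.create({
  publicKey: {
    challenge: new Uint8Array(32),               // server-generated nonce
    rp: { name: "My App" },
    user: {
      id: new TextEncoder().encode("user-123"),  // opaque server user handle
      name: "user@example.com",
      displayName: "Example User",
    },
    pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
  },
});
// 'credential' is then sent to the server for verification and storage.
```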

Build and Push a Container Image from Source Code With S2I

Create an image right from your source code with this tool.

Recently, while drafting an OpenShift solution tutorial, I explored an interesting tool called S2I (Source-to-Image). In this post, you will learn how to create a container image directly from your source code and push the generated container image to a private IBM Cloud Container registry.


What is S2I (Source-to-Image)?

S2I is a tool for building reproducible, Docker-formatted container images. It produces ready-to-run images by injecting application source into a container image and assembling a new image. The new image incorporates the base image (the builder) and the built source, and is ready to use with the docker run command. S2I supports incremental builds, which reuse previously downloaded dependencies and previously built artifacts.
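In command form, the workflow looks roughly like this. The builder image and registry namespace are illustrative; substitute your own IBM Cloud Container Registry namespace:

```shell
# Build an image from the source in the current directory with an
# s2i builder image, then run and publish it.
s2i build . registry.access.redhat.com/ubi8/nodejs-16 myapp
docker run -p 8080:8080 myapp

# Tag and push to a private IBM Cloud Container Registry namespace
docker tag myapp us.icr.io/<namespace>/myapp
docker push us.icr.io/<namespace>/myapp
```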

Rotating Service Credentials for IBM Cloud Functions

If you have followed some of my work, you know that I use IBM Cloud Functions, i.e., a serverless approach, for many projects. The tutorials with a database-driven (Db2-backed) Slackbot and the GitHub traffic analytics are such examples. In this blog post, I want to detail some of the security-related aspects. This includes how to share service credentials (think of a database username and password) with a cloud function and how to rotate the credentials.

Create and Bind Credentials

In order for a user or an app to access a service like a database system or a chatbot, a username and password or API keys are needed. In general, they are called service credentials. For many cloud computing technologies, sharing those credentials between services and apps is called binding a service.
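As an illustrative sketch (service, instance, and key names below are made up, and the `fn service bind` subcommand comes from the IBM Cloud Functions CLI plugin), rotating credentials then amounts to creating a new service key and re-binding it to the action:

```shell
# Create a fresh service key for the database instance
ibmcloud resource service-key-create mydb-key-2 Manager --instance-name mydb

# Re-bind the action to the new key; delete the old key afterwards
ibmcloud fn service bind dashDB myaction --instance mydb --keyname mydb-key-2
ibmcloud resource service-key-delete mydb-key-1
```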

Knative Log Analysis With LogDNA on IBM Cloud

In this post, you will learn how to use the IBM Log Analysis with LogDNA service to configure cluster-level logging for an app named "Knative-node-app" published in IBM Cloud Kubernetes Service. Refer to this post to set up a Node.js app.

IBM Log Analysis with LogDNA offers administrators, DevOps teams, and developers advanced features to filter, search, and tail log data, define alerts, and design custom views to monitor application and system logs.

From the moment you provision a cluster with IBM Cloud Kubernetes Service, you want to know what is happening inside the cluster. You need to access logs to troubleshoot problems and pre-empt issues. At any time, you want to have access to different types of logs such as worker logs, pod logs, app logs, or network logs. In addition, you want to monitor different sources of log data in your Kubernetes cluster. Therefore, your ability to manage and access log records from any of these sources is critical. Your success in managing and monitoring logs depends on how you configure the logging capabilities for your Kubernetes platform.