Solution Thinking Design

Context

You are modernizing a Core System and have defined a Target Business Architecture (BA) that will replace it. You are following the defined modernization roadmap and are in charge of modernizing one of the Business Capabilities of the Target Business Architecture. Now you need to understand how the current Core System addresses that Business Capability, i.e., you have to map which legacy systems are providing the business capabilities defined for it in the BA.
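The mapping exercise described above can be sketched as a simple capability-to-system table. A minimal Python sketch follows; the capability names and legacy asset names are purely hypothetical examples, not taken from any real system:

```python
# Hypothetical sketch: which legacy components currently realize each
# business capability in the target BA. All names are illustrative.
capability_map = {
    "Customer Onboarding": ["CICS transaction ONBD", "Batch job KYC010"],
    "Payments": ["IMS application PAY", "MQ bridge PAYQ"],
    "Statements": ["Batch job STMT05"],
}

def systems_for(capability: str) -> list:
    """Return the legacy assets that must be analyzed (and later
    replaced or strangled) for one capability on the roadmap."""
    return capability_map.get(capability, [])

print(systems_for("Payments"))
# ['IMS application PAY', 'MQ bridge PAYQ']
```

In practice this table is the output of the discovery work, and each entry usually also carries the integrations between assets, which is what produces the tangled picture described below.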

Usually, the core system will be a set of monoliths, applications, services, and transactions that are integrated and tightly coupled, so if you map all of them and their integrations, you will end up with a picture like this one:

Mainframe Modernization Acceleration Through OpenShift

Mainframes have evolved significantly in recent years, from being seen as a legacy platform to incorporating some of the best cloud capabilities. Mainframes have truly been one of the most reliable platforms for business-critical workloads for years, if not decades. The mainframe has been identified as a growth area in many recent surveys:

  • 90% of survey respondents see the mainframe as a platform for growth, with compliance and security, cost optimization, mainframe modernization, data recovery, staffing and skills, and implementing AI/ML technologies as top focus areas (BMC Mainframe Survey 2020)
  • 74% see long-term viability in the mainframe as a strategic platform, while 66% say they would never fully replace their mainframes.

The fact that more than 50% of enterprise application transactions touch the mainframe is a key indicator of its significance and relevance as enterprises try to modernize their IT estates. The new generation of mainframes, the IBM Z and LinuxONE Enterprise systems, is also being built with capabilities to support modern cloud-native and AI workloads.

The Advent of Data Hyper-Protection

Critical system-of-record data must be compartmentalized and accessed by the right people and applications, at the right time.

Since the turn of the millennium, the art of cryptography has continuously evolved to meet the data security and privacy needs of doing business at Internet speed, by taking advantage of the ready processing horsepower of mainframe platforms for data encryption and decryption workloads.

The Kano Model: Developing for Value and Delight

Even though we have been developing software on the mainframe platform for decades, we still have room to learn and improve. We continually face the problem of meeting user needs with the resources we have on hand. This forces us to be careful about what we choose to do: we must ask whether we are focusing on the right things. While there are basic expectations that must be met, are we also providing things that excite and delight, the things that make users feel connected and generate passion? Users who feel a connection with your applications can obtain greater value from them.

Providing things that excite while still delivering on basic needs is difficult to manage, but there are tools to help. One such tool is the Kano model, created by Dr. Noriaki Kano, a Japanese educator, lecturer, writer, and consultant in the field of quality management. He sought to resolve these issues with a prioritization framework. The framework focuses on the three patterns of customer expectations versus the investments organizations make to delight their customers and do what it takes to positively impact customer satisfaction.
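The Kano method is usually applied with a paired questionnaire: for each feature, users answer how they would feel if it were present (functional) and if it were absent (dysfunctional), and the answer pair is looked up in a standard evaluation table. A minimal sketch of that lookup, assuming the commonly published form of the table, might look like:

```python
# Kano questionnaire sketch. Each feature gets a "functional" answer
# (how do you feel if it is present?) and a "dysfunctional" answer
# (how do you feel if it is absent?). Answer wording is one common form.
ANSWERS = ["like", "expect", "neutral", "live with", "dislike"]

# Rows = functional answer, columns = dysfunctional answer.
# A = Attractive (delighter), O = One-dimensional (performance),
# M = Must-be (basic expectation), I = Indifferent,
# R = Reverse, Q = Questionable (contradictory answers).
KANO_TABLE = [
    ["Q", "A", "A", "A", "O"],   # functional: like
    ["R", "I", "I", "I", "M"],   # functional: expect
    ["R", "I", "I", "I", "M"],   # functional: neutral
    ["R", "I", "I", "I", "M"],   # functional: live with
    ["R", "R", "R", "R", "Q"],   # functional: dislike
]

def kano_category(functional: str, dysfunctional: str) -> str:
    """Classify one feature from its pair of questionnaire answers."""
    return KANO_TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]

# "I like fast builds; I dislike slow ones" -> performance need
print(kano_category("like", "dislike"))    # O
# "I expect login to work; I dislike it when it doesn't" -> basic need
print(kano_category("expect", "dislike"))  # M
# "I like dark mode; I'm neutral without it" -> delighter
print(kano_category("like", "neutral"))    # A
```

The useful property for prioritization is that the table separates basic expectations (M), which only cause dissatisfaction when missing, from delighters (A), which only add satisfaction when present.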

How Many API Calls Should You Do?

Overview: APIs provide access to valuable mainframe data, but deciding whether to add to an existing API or create a new one can be a difficult task. Evaluating a variety of factors can help make it easier to determine which option makes sense.


As applications evolve, they will often need access to data and processing on the mainframe. If the necessary APIs are available, this is an easy task; however, the architecture isn't always there, creating a roadblock. Teams are faced with choices about how to deliver value quickly while setting up the architecture to address future challenges.
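The extend-versus-create choice can be illustrated with a small sketch. All names here (`get_account`, `get_account_history`, the field names) are hypothetical, not from any specific product; the point is the pattern, not the API:

```python
# Hypothetical sketch: two ways to expose new mainframe data through an
# API layer sitting in front of legacy records.

def get_account(account_id: str, legacy_record: dict) -> dict:
    """Option 1: extend the existing endpoint with an optional field.
    Current callers keep working because they ignore keys they
    don't know about."""
    payload = {
        "id": account_id,
        "balance": legacy_record["balance"],
    }
    # Backwards-compatible extension: emit the new field only when the
    # legacy record actually carries it.
    if "credit_limit" in legacy_record:
        payload["creditLimit"] = legacy_record["credit_limit"]
    return payload

def get_account_history(account_id: str, legacy_history: list) -> dict:
    """Option 2: create a new endpoint. Transaction history is a
    different resource with its own volume, pagination, and caching
    needs, so bolting it onto the account payload would bloat every
    existing call."""
    return {"id": account_id, "transactions": legacy_history}

print(get_account("A-100", {"balance": 250, "credit_limit": 1000}))
print(get_account_history("A-100", [{"amount": -25}, {"amount": 75}]))
```

A rough rule of thumb this sketch reflects: small, optional additions to an existing resource favor extension, while data with a different shape, lifecycle, or access pattern favors a new API.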

Apache Kafka and Machine Learning in Pharma and Life Sciences Industry

This blog post covers use cases and architectures for Apache Kafka and event streaming in pharma and life sciences. The technical example explores drug development and discovery with real-time data processing, machine learning, workflow orchestration, and image/video processing.

Use Cases in Pharmaceuticals and Life Sciences for Event Streaming and Apache Kafka

The following shows some of the use cases I have seen in the field in pharma and life sciences:

Baby Yoda and Stranger Things: The Case for Shorter Mainframe Software Release Cycles

Overview: Careful thought must be paid to how we want our users to consume our mainframe software updates. Releasing incrementally has definite advantages over big releases. Just look at how we consume popular TV shows.

BABY YODA! Now that I have your attention, let's consider how Baby Yoda became a big thing over the last few months, growing with each new episode of Disney's The Mandalorian. Baby Yoda stayed in our minds for the whole period during which the episodes were dropped weekly. What Disney did here was smart: they put out incremental weekly releases of the show. This meant that we couldn't binge; we had to watch each week and wait for more. We had time to think about and discuss each episode. Each episode could stand on its own and had a week to be dissected before the next was presented. This kept the show in front of the public for a much longer period.

Contrast that with a series that is dropped all on one day, like Netflix's Stranger Things. This produces a big buzz, but mostly for a few days, until everyone has seen it. The interest does not last as long as it would if the release were stretched out over several weeks. There isn't time to review and discuss each episode; you wait until everyone has caught up, and then you discuss the whole season.

Agile Transformation Leadership: Insight from Compuware CEO Chris O’Malley

Earlier in 2018, Compuware CEO Chris O’Malley spoke with Jeff Dalton, host of the AgileCxO “Agile Leadership Podcast,” about Compuware’s Agile transformation from a company “dominated by maintenance…Waterfall thinking” to a DevOps-enabled enterprise delivering innovation every 90 days.

“We embarked on an aggressive journey to remake ourselves first, adopt things like Agile and DevOps, and then become an innovative force in remaking the mainframe. And the fate of the company has changed as a result of it,” Chris said.