Simultaneous, Multiple Proportion Comparisons Using the Marascuilo Procedure

Often, there is a need to compare multiple proportions across samples, where the proportions may describe a product's performance or any other characteristic that can be classified into identifiable categories. Examples of binary classifications include pass/fail and yes/no.

Just as central tendencies, or the variances around them, cannot be assumed to be equal across samples, equality of proportions cannot be assumed either.
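
As a rough illustration, here is a minimal R sketch of the procedure's pairwise comparisons, assuming hypothetical pass counts for three production lines; the counts, sample sizes, and the 0.05 significance level are placeholders only.

# Marascuilo procedure: compare every pair of sample proportions against
# a chi-square-based critical range (hypothetical data)
passes <- c(80, 70, 91)      # hypothetical "pass" counts per group
n      <- c(100, 100, 110)   # hypothetical sample sizes per group
p      <- passes / n         # sample proportions
alpha  <- 0.05
k      <- length(p)
crit   <- sqrt(qchisq(1 - alpha, df = k - 1))   # chi-square critical value

for (i in 1:(k - 1)) {
  for (j in (i + 1):k) {
    diff_ij  <- abs(p[i] - p[j])
    range_ij <- crit * sqrt(p[i] * (1 - p[i]) / n[i] + p[j] * (1 - p[j]) / n[j])
    cat(sprintf("p%d vs p%d: |diff| = %.3f, critical range = %.3f, significant = %s\n",
                i, j, diff_ij, range_ij, diff_ij > range_ij))
  }
}

A pair of proportions is flagged as significantly different only when its absolute difference exceeds its own critical range, which is what keeps the error rate in check across all the simultaneous comparisons.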

A Rapid Overview of ISA-88 and How It Aligns With ISA-95 and IIoT Platforms

ISA-88 is a long-standing standard for managing batch processes, while ISA-95 focuses on defining the progressive complexity of information expected to be available at each level of the enterprise-control hierarchy.

Often, the interconnection between the two standards, and further with evolving IIoT platform definitions, causes some degree of confusion as manufacturing firms grapple with defining and shaping their IT strategies and their frameworks for adopting new technologies, platforms, and applications.

A Methodical Approach to Measuring ROI on Incident Response Systems

Often, MSMEs operating small-scale machine workshops struggle to map and prioritize their technology platform adoption needs, especially when they encounter broad concepts such as IT-OT integration, secure and seamless data extraction, and making that data available on cloud platforms for visualization and decision-making. They get the concept but lack the analytical toolsets to quantify what it all means for them.

The perceived benefits of data visualization (the “dashboarding” or “digital/virtual cockpit” views) and of the corresponding task-driven process automation, built on connectors and middleware that promise to connect assets, machines, tooling, and sensors, should be quantified against on-field realities to justify funding.

Breezing Through Support Vector Machines

When we add "machine" to anything, it instantly sounds cool... perhaps because the term hints at the presence of both "intelligence" and "automation."

So, let's take that word out, and we are back to good old classical vector algebra. It is like handing a person a bunch of sticks and asking which one to lay where in a 2-D plane so that one class of objects is separated from another, provided the class definitions are already known.
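
To make the stick-laying analogy concrete, here is a minimal R sketch of a linear SVM on two features of the built-in iris data, assuming the e1071 package is installed; the data set and feature choice are purely illustrative.

# Linear SVM separating two known classes in a 2-D feature plane
library(e1071)

two_class <- subset(iris, Species != "virginica")   # keep just two classes
two_class$Species <- droplevels(two_class$Species)

fit <- svm(Species ~ Sepal.Length + Sepal.Width,
           data = two_class, kernel = "linear")

# Inspect the fitted boundary and the support vectors that define it
plot(fit, two_class, Sepal.Width ~ Sepal.Length)
summary(fit)

The "sticks" the algorithm finally settles on are determined by the support vectors: the observations on or within the margin that fix exactly where the separating line gets laid.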

Doing Residual Analysis Post Regression in R

Residuals are essentially the gaps left behind when a given model, in this case a linear regression, does not fit the observations completely.

A close analogy for residual analysis is found in medical pathology. What remains after metabolism usually becomes an indicator of what was processed and absorbed versus what was not.
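
As a quick taste of what the analysis looks like in practice, here is a minimal R sketch using the built-in mtcars data; the model itself (fuel economy against weight) is just a convenient stand-in.

# Fit a simple linear regression and examine what it leaves behind
fit <- lm(mpg ~ wt, data = mtcars)

res <- resid(fit)      # the residuals: observed minus fitted values
summary(res)

# Standard diagnostic views: residuals vs. fitted values, then a normal Q-Q plot
plot(fit, which = 1)
plot(fit, which = 2)

If the first plot shows structure (a curve, a funnel) or the Q-Q plot strays far from the diagonal, the leftovers are telling you the linear model has not digested the data completely.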

A Rapid Introduction to Decision Model and Notation (DMN)

For those who are familiar with the OASIS WS-BPEL standard, Decision Model and Notation (DMN) should not be difficult to process. Here is a rapid, intuitive introduction to the notation.

The following infographic attempts to capture the essence of DMN:

Leveraging PAM (Partitioning Around Medoids) Implementation in R

For those who are aware of K-means clustering, Partitioning Around Medoids (PAM) should be easier to understand and utilize.

Before I discuss and show the rapidity with which R can accomplish such partitioning in a given data set, it will be good to understand what PAM is and how it works algorithmically. Hopefully, this will serve as an intuitive and no-code introduction to the algorithm for readers who do not have a Computer Science or Data Science background.
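
For readers who do want a quick peek at the code side as well, here is a minimal R sketch using the cluster package on the numeric columns of the built-in iris data; the choice of data set and of three clusters is illustrative only.

# Partitioning Around Medoids: three clusters on four numeric features
library(cluster)

pam_fit <- pam(iris[, 1:4], k = 3)

pam_fit$medoids                            # actual observations chosen as centers
table(pam_fit$clustering, iris$Species)    # clusters vs. known species labels
plot(pam_fit)                              # cluster plot and silhouette plot

Unlike K-means, the cluster centers here are real observations (medoids) rather than computed averages, which is what makes PAM more robust to outliers and usable with arbitrary dissimilarity measures.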

Tableau + R: Back Your Data Visualizations With Statistical Testing

To speak bluntly: Tableau, for all the promise of its visualization capabilities, is astonishingly limited in its ability to integrate seamlessly with statistical, hypothesis-driven testing. You may be constantly let down if you need not only to visualize your observations but also to compare them between groups on hard statistical grounds.

Hence, one must admit that there is still a strong value gap between visualization tools like Tableau and pure statistical software such as Minitab, SPSS, SAS, and, of course, the humble yet tremendously powerful open-source workhorse, R.
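
Tableau can hand data to R through an Rserve connection and its SCRIPT_* calculated fields, but the statistical reasoning still lives on the R side. As a minimal, purely hypothetical sketch of the kind of between-group test one would want behind a view, in plain R:

# Two hypothetical groups of measurements pulled from a visualized data set
group_a <- c(12.1, 13.4, 11.8, 12.9, 13.2)
group_b <- c(14.0, 13.7, 14.5, 13.9, 14.2)

t.test(group_a, group_b)        # parametric comparison of means
wilcox.test(group_a, group_b)   # non-parametric alternative

It is precisely this kind of test, attached to the same data the dashboard shows, that closes the gap between a pretty chart and a defensible conclusion.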

How to Use R for Conjoint Analysis

Conjoint analysis is a frequently used (and much needed) technique in market research.

To gauge interest in, consumption of, and continued use of any given product or service, a market researcher must study the kind of utility perceived by potential or current target consumers.
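
One common way to estimate that perceived utility is to regress stated preferences on product attributes. Here is a minimal R sketch of part-worth estimation for a single respondent, with entirely hypothetical attributes, levels, and ratings.

# Full-factorial set of hypothetical product profiles
profiles <- expand.grid(
  price    = factor(c("low", "high")),
  brand    = factor(c("A", "B")),
  warranty = factor(c("1yr", "3yr"))
)
profiles$rating <- c(9, 5, 7, 3, 10, 6, 8, 4)   # hypothetical 1-10 preference ratings

# Part-worth utilities fall out as the regression coefficients
fit <- lm(rating ~ price + brand + warranty, data = profiles)
coef(fit)

The coefficients express how much utility each attribute level adds or subtracts relative to the baseline levels, which is exactly the quantity a researcher needs in order to gauge interest and likely continued use.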