From eCommerce Integration to Location-Based Controls, Block Visibility Pro Expands Upon Its Free Version

It has been several months since I last dived into Nick Diego’s Block Visibility plugin, and it is now one year since its initial release. Having recently moved from his previous job into the WordPress product space, Diego has been building one of the best context-based plugins for showing or hiding content.

In January, Diego touted some of the ideas he had for a yet-to-be-released Block Visibility Pro. He was already fulfilling user needs, but there was so much left to be explored.

“As Block Visibility grows, there will be advanced and/or niche functionality that will be useful for certain users,” he said at the time. “Think integrations with other third-party plugins. There will always be a free version of the plugin but some of these additional features will ultimately be provided by a premium (paid) add-on called Block Visibility Pro.”

Diego quietly released the pro add-on in June, which does not take away from the free version. Everything in it is a pure value-add and helps specific sets of users.

Last week, he released Block Visibility Pro 1.1.0, and I managed to get a test copy to play around with. In short, I am more impressed than I was when I first covered the free version in January.

Pro Additions

Early versions of the free plugin had visibility controls for all visitors, user roles, and start-and-stop dates. Since then, Diego has beefed up the options to include screen size, logged-in status, and user accounts. It also integrates with Advanced Custom Fields and WP Fusion. That is more than many other content-visibility solutions offer before requiring an upgrade to a commercial or pro version.

The current pro version includes conditional controls for the following:

  • Location (Query and Post)
  • Time-based and day of week
  • WooCommerce
  • Easy Digital Downloads
  • Browser and Device
  • URL Path
  • Referral Source

The Location controls are what I have found myself tinkering with the most. They are handy at the moment but will offer more power when used in conjunction with WordPress’s upcoming site editor.

Location, query-based visibility controls.

The Location controls are essentially query-based visibility options. Users can choose to show or hide blocks based on post type, taxonomy, and more. Everything from individual post attributes to the archive type is available. Users can also create multiple rule sets, combining various location-based options.

For shop owners, the WooCommerce and Easy Digital Downloads integrations are extensive. Users can display blocks based on shopping cart content, customer metrics, and product metrics. This could come in handy for promotions, coupons, and similar features.

One of my favorite features, which is also included in the free version, is a popup option for selecting which visibility settings should appear in the sidebar.

Toggling visibility controls in the Visibility tab.

This feature reduces the footprint of the plugin’s Visibility tab in the block sidebar panel while giving users control over which options they would like to use.

It looks similar to a current proposal for the Gutenberg plugin that would allow users to toggle specific controls:

Proposal for toggling block typography controls.

The main difference between the two is the location of the “ellipsis” button that opens the popup. The Gutenberg proposal has it at the top of the tab, while Block Visibility adds it as a control within its Visibility tab. However, the concept is the same, and the plugin provides a real-world test of how the feature could work. Thus far, I am happy with the result. It allows me to hide options that I would rarely use, and I am eager for something similar to eventually work its way into core WordPress.

From Developer to Developer

If I am being honest, I am a bit envious of the work Diego has done. Many do not know this, but I also built a similar solution to Block Visibility in 2019. It was before I joined the staff here at WP Tavern. Before seeing that project mature, I handed it over as part of a larger IP sale.

I point this out because I understand the complexities of building a solution that works from a technical standpoint while also being user-friendly. It is not easy, but Block Visibility seems to hit the right balance.

And I do not say this often, but Diego’s work far exceeds anything I had built or even had in the pipeline. It is on another level, so a part of me is glad that he and I are not competing in this space. At the same time, I wish I could go back and implement some of these ideas on my former project.

Cloud factory – Architectural introduction

This article launches a new series exploring a cloud factory architecture, focusing on ways of mapping successful implementations to specific use cases.

It's an interesting challenge to create architectural content based on common customer adoption patterns. That is very different from most traditional marketing activity, which usually generates content for the sole purpose of positioning products as solutions.

4 Ways Predictive Analytics and Machine Learning are Redefining CRO

Although conversion rate optimization (CRO) is nothing new, adopting new technologies to enhance your CRO practices can seem a little unfamiliar and challenging. However, when it comes to utilizing predictive analytics and machine learning, the benefits to your conversion rates can be significant.

By analyzing user behavior and entrusting technology to make educated decisions about your website’s and funnel’s performance, it’s possible to give your conversion funnel a significant boost. This can be particularly important for businesses as we begin to transition toward the era of the ‘new normal,’ away from the COVID-19 pandemic and the long periods of social isolation that affected business models worldwide.
Survey of Challenges With CRO

Spotlight on CockroachDB

The construction, processing, and use of databases have evolved a lot over the last few decades. Traditional relational databases were enough for the data volumes of their time, but with the growing reliance on the Internet, the progression of cloud-native architecture, and the expanding ways businesses use and analyze data, single-machine relational databases are no longer cutting it. What happens if the node running a traditional single-machine relational database fails? The database goes down, along with any applications that depend on it.

Over time, NoSQL databases were introduced, capable of handling large amounts of data in real time. The risk of applications failing began to decrease, but the risk of data inconsistencies increased. There has been a growing need for a better data storage solution that can cope with today’s dynamic cloud-native architecture, and CockroachDB was designed specifically to meet that need.

Get Started With Kafka Connector for Azure Cosmos DB Using Docker

Having a local development environment is quite handy when trying out a new service or technology, and Docker has emerged as the de facto choice in such cases. It is especially useful in scenarios where you’re trying to integrate multiple services, and it gives you the ability to start fresh before each run.

This blog post is a getting started guide for the Kafka Connector for Azure Cosmos DB. All the components (including Azure Cosmos DB) will run on your local machine, thanks to:

Runners in Spring Boot

Runners are Java classes (Spring beans) in a Spring Boot application that implement one of the XXXRunner interfaces, CommandLineRunner or ApplicationRunner, either directly or indirectly. They are auto-executable components that the container calls for you.

Runner classes are used for one-time startup logic, and that logic is executed just as SpringApplication.run() is about to complete all of its startup activities.
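As a minimal sketch of the idea (the class name and printed message are only illustrative), a runner can look like this:

    import org.springframework.boot.CommandLineRunner;
    import org.springframework.stereotype.Component;

    // Picked up by component scanning; run(...) executes exactly once,
    // right after the application context has finished starting.
    @Component
    public class StartupRunner implements CommandLineRunner {

        @Override
        public void run(String... args) {
            // One-time startup logic goes here, e.g. cache warm-up or sanity checks.
            System.out.println("Application started with args: " + String.join(" ", args));
        }
    }

ApplicationRunner works the same way but receives an ApplicationArguments object instead of the raw string arguments.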

5 Promising Ways Big Data Deters Cybersecurity Threats

Undoubtedly, without big data analytics, companies are blind, deaf, and standing in the middle of a freeway. Data is the new science, and big data holds the answers. Data production rates are growing at a tremendous pace alongside the human population. Humans produce mind-boggling amounts of data (around 2.5 quintillion bytes) on a regular basis, and the pace is only accelerating with the evolution of the Internet of Things (IoT). Roughly 90% of the world’s data has been generated in the past two years alone, and some predictions hold that the world will store 200 zettabytes of data by 2025.

Cybercrime is progressing by leaps and bounds, in parallel with the data production rate; it would not be wrong to say that cyberattacks seem to be breeding like rabbits. The globe faces more than 10,000 malicious files and 100,000 malicious websites on a daily basis. Phishing attacks account for over 80% of reported security incidents, and as of January 2021, Google had registered more than 2 million phishing sites. Since the pandemic outbreak, remote workers have also been the target of alarming cyberattacks. With effortless access to the internet, people now hear about cyberattacks encountered all around the world.

Automating IoT With Camunda Platform

Some Background

When I first started at Camunda back in October 2020 (what was 2020 anyway?) the very first thing I was asked to do was come up with something I could do for a special Halloween blog post. It being COVID-times, I of course built a Camunda and IoT integration to evaluate costumes and deliver candy.

I am sort of well-known for doing weird, pointless IoT projects like this one and this one, and this one. You get the idea.

Optimizing AWS Architecture for Cost Management

Using AWS services for your projects is a great option, but getting unexpected huge AWS bills can be a nightmare. You must know the cost factors of AWS services before you start using them. This article will discuss how you can optimize AWS architecture for better cost management using AWS cost management tools, following best practices, and applying cost optimization methods.

Typical AWS Costs

A typical AWS bill is based on compute, storage, and data transfer services.
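One practical starting point is to pull your actual spend programmatically. The rough sketch below uses the AWS SDK for Java v2 Cost Explorer client to fetch one month of unblended cost; the date range is an assumption, credentials are expected in the environment, and Cost Explorer API requests themselves incur a small per-request charge:

    import software.amazon.awssdk.regions.Region;
    import software.amazon.awssdk.services.costexplorer.CostExplorerClient;
    import software.amazon.awssdk.services.costexplorer.model.DateInterval;
    import software.amazon.awssdk.services.costexplorer.model.GetCostAndUsageRequest;
    import software.amazon.awssdk.services.costexplorer.model.GetCostAndUsageResponse;
    import software.amazon.awssdk.services.costexplorer.model.Granularity;

    public class MonthlyCostReport {
        public static void main(String[] args) {
            // Cost Explorer is served from us-east-1; credentials come from the default chain.
            try (CostExplorerClient costExplorer =
                     CostExplorerClient.builder().region(Region.US_EAST_1).build()) {
                // Illustrative one-month window; adjust the dates to your billing period.
                GetCostAndUsageRequest request = GetCostAndUsageRequest.builder()
                        .timePeriod(DateInterval.builder().start("2021-06-01").end("2021-07-01").build())
                        .granularity(Granularity.MONTHLY)
                        .metrics("UnblendedCost")
                        .build();

                GetCostAndUsageResponse response = costExplorer.getCostAndUsage(request);
                response.resultsByTime().forEach(result -> System.out.println(
                        result.timePeriod().start() + ": "
                                + result.total().get("UnblendedCost").amount() + " "
                                + result.total().get("UnblendedCost").unit()));
            }
        }
    }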

Apache Kafka’s Code Under the Scanner

Apache Kafka is an open-source distributed event streaming platform built for data-driven apps that need real-time handling of data. Kafka was open-sourced by LinkedIn in 2011. Its use cases are endless, and it is used by thousands of companies, including Airbnb, Netflix, and LinkedIn, for operations that process real-time data. Kafka provides several APIs for processing data streams in real time with low latency and high throughput: applications can publish (write), subscribe to (read), store, and process streams of events according to their use case, communicating with the brokers over a binary protocol on top of TCP. Since Kafka is open source, licensed under Apache License 2.0, we can examine the code further and explore its inner workings and structure with the free static code analyzer tool Embold.

The results are surprisingly interesting. 
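As a quick aside on the publish/subscribe API mentioned above, producing an event takes only a few lines of Java. This is a minimal sketch that assumes a broker running at localhost:9092 and a topic named events:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventPublisher {
        public static void main(String[] args) {
            // Minimal producer configuration; the broker address and topic name are assumptions.
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Publish (write) one event; a KafkaConsumer would subscribe (read) to the same topic.
                producer.send(new ProducerRecord<>("events", "order-42", "created"));
            }
        }
    }

A consumer uses KafkaConsumer with matching deserializers and subscribes to the same topic to read the stream back.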

5 Security Measures For Open Source Based Apps

More and more app development teams are adopting an open-source-based model, with the majority of developers now turning away from custom code.

And with good reason. Open source allows faster development, more innovation, and lower costs.

How to Scrape Target Store Locations Data From Target.Com Using Python?

Web data scraping is a quicker and better-organized way of getting store location details from a website than spending time collecting the information manually. This tutorial covers scraping the store locations and contact data available on Target.com, one of the biggest discount retailers in the USA.
In this tutorial, our Target store locator will scrape the information for Target store locations based on a provided zip code.

We can scrape the following data fields:

Understanding Selenium: The Automation Testing Tool

Introduction

With increasing demand for test automation services, organizations are looking to invest in the best test automation tools for their business. Selenium is at the top of their list because of its numerous advantages. Research predicts that the global automation testing market will reach US $109.69 billion by 2025, which means software testing is evolving and growing fast. Selenium testing has been such a game-changer in the software testing world that test automation is now more efficient than ever before, and the high levels of accuracy achieved save the QA team a lot of time as well. Selenium has contributed a great deal to this exponential change in the testing space and to the continuous development and delivery process.

What Is Selenium?

Selenium needs no introduction. It is one of the most popular and best-liked web-based test automation tools in the software testing industry, and it has set the benchmark for a long time now. Here are a few of Selenium’s top features:
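Alongside those features, a minimal WebDriver test in Java gives a feel for how Selenium drives a browser. The target URL, the element locator, and the choice of ChromeDriver below are illustrative assumptions, and a chromedriver binary matching the installed Chrome is expected on the PATH:

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class FirstSeleniumTest {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();   // launches a real Chrome instance
            try {
                driver.get("https://example.com");   // open the page under test
                String heading = driver.findElement(By.tagName("h1")).getText();
                System.out.println("Page heading: " + heading);
            } finally {
                driver.quit();                       // always release the browser
            }
        }
    }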

The New Change View Menu

We’ve been plugging away at some UI changes that will help slowly morph our existing Pen Editor into the editor we’re imagining for the future. That will be a big change, someday, but in order to make it feel less abrupt, we’re doing smaller changes where we can so that the final change won’t feel so big.

So anyway, a little update to the Change View menu. Featuring fun animated rotations!
