WPWeekly Episode 359 – Diversity Speaker Training With Jill Binder

In this episode, John James Jacoby and I are joined by Jill Binder, Founder, Chief Consultant, and Trainer at Diverse Speakers In Tech. We discussed how and why the Diverse Speaker Training group was created, how the training encourages underrepresented people to speak at WordCamps, and how the recent 50% sponsorship funds from Automattic will be used.

We also learned that local communities that have participated in the training at the meetup level have seen a sharp increase in the number of diverse speaker applications submitted to WordCamps. Binder is hoping to be sponsored 100% so she can work on the project full-time. If you’re interested in sponsoring her work, please visit her contact page and get in touch.

Stories Discussed:

Announcing Pantheon Localdev Early Access

WooCommerce 3.6.5 security release

Jetpack 7.5

Discuss This Tweet by John O’Nolan

Transcript:

EPISODE 359 Transcript

WPWeekly Meta:

Next Episode: Wednesday, July 10, 3:00 P.M. Eastern

Subscribe to WordPress Weekly via iTunes

Subscribe to WordPress Weekly via RSS

Subscribe to WordPress Weekly via Stitcher Radio

Subscribe to WordPress Weekly via Google Play

Listen To Episode #359:

Menus with “Dynamic Hit Areas”

Flyout menus! The second you need to implement a menu that uses a hover event to display more menu items, you're in tricky territory. For one, they should work with clicks and taps, too. Without that, you've broken the menu for anyone without a mouse. That doesn't mean you can't also use :hover. But when a hover state reveals more content, an un-hovering state needs to hide it again. Therein lies the problem.

The problem is that if a submenu pops out somewhere on hover, getting your mouse over to it might involve moving the cursor along a fairly narrow corridor. Accidentally stray outside that corridor and the menu can close, which makes for an extremely frustrating UX moment.

We've covered this before in our "Dropdown Menus with More Forgiving Mouse Movement Paths" article.

You can get to the menu item you want, but there are some narrow passages along the way.
Many dropdowns are designed such that the submenu containing the desired menu item may close on you the moment the right area loses :hover, or when a mouseleave or mouseout event fires.

The most compelling examples that solve this issue are the ones that involve extra hidden "hit areas." Amazon doesn't really have menus like this anymore (that I can see), and perhaps this is one of the reasons why. But in the past, they've used this hit area technique. We could call them "dynamic hit areas" because they were drawn based on the position of the parent element and the submenus:

I haven't seen a lot of implementations of this lately, but just recently, Hakim El Hattab included a modern implementation in his talk at CSS Day 2019. It draws the hit areas dynamically with SVG. You don't actually see the hit areas, but they do look like this, forming paths that prevent hover-offs.
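To make the technique concrete, here is a minimal sketch of the idea (my own illustration, not Hakim's actual code; it assumes you maintain a full-viewport, position: fixed SVG overlay for the menu): when a submenu opens, draw an invisible triangle spanning the gap between the parent item and the submenu, and treat hovering that triangle the same as hovering the submenu.

```ts
// Minimal sketch: draw a transparent SVG triangle from the hovered menu
// item's right edge to the submenu's near corners, so the pointer can cross
// the gap without triggering a hover-off. Assumes `svg` is a full-viewport,
// position: fixed overlay (getBoundingClientRect() is viewport-relative).
function drawHitArea(
  item: HTMLElement,
  submenu: HTMLElement,
  svg: SVGSVGElement
): SVGPolygonElement {
  const a = item.getBoundingClientRect();
  const b = submenu.getBoundingClientRect();
  const poly = document.createElementNS("http://www.w3.org/2000/svg", "polygon");
  // Triangle: midpoint of the item's right edge -> submenu top-left -> bottom-left.
  poly.setAttribute(
    "points",
    `${a.right},${a.top + a.height / 2} ${b.left},${b.top} ${b.left},${b.bottom}`
  );
  poly.setAttribute("fill", "transparent"); // painted, so it still gets pointer events
  svg.appendChild(poly);
  return poly;
}
```

The transparent fill is still "painted" as far as SVG hit-testing is concerned, so the polygon receives pointer events while staying invisible; your menu's close logic just needs to treat the pointer being over the polygon the same as being over the submenu.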

I'll include a YouTube embed of the talk starting at that point here:

The way he draws the hit area is so fancy it makes me all kinds of happy:

The live demo of it is up on the Slides.com pattern library thingy.


Cost of Delay: Why You Should Care

Want to make money? Work smarter and faster.

The real question is this: why should you care how much a delayed release costs you? Maybe you think there's a “sweet spot” in the way you start your projects or release them: “It just takes that long here.” (That's the sign of a system impediment.)

Now, let’s try to calculate that cost of delay.
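To make that concrete with invented numbers: if a release would earn $25,000 per week once shipped and it slips by four weeks, the cost of delay is roughly 4 × $25,000 = $100,000 in revenue you can never recover, and that is before counting any market position lost to competitors who ship first.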

Google Launches Effort to Make Robots Exclusion Protocol an Internet Standard, Open Sources Robots.txt Parser

Website owners have been excluding web crawlers with the Robots Exclusion Protocol (REP) and robots.txt files for 25 years. More than 500 million websites use robots.txt files to talk to bots, according to Google’s data. Up until now, however, there has never been an official Internet standard or documented specification for writing rules correctly according to the protocol. Over the years, developers have shared their own interpretations of the protocol, which has produced many different, ambiguous methods for controlling crawlers.
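For context, a minimal robots.txt looks something like this (a generic illustration; only the long-established directives are shown, and support for anything beyond them varies by crawler):

```
# Rules for all crawlers
User-agent: *
Disallow: /private/
Allow: /private/press/

# Stricter rules for one specific crawler
User-agent: Googlebot
Disallow: /experiments/
```

The ambiguity the draft addresses is everything around files like this one: how long the rules may be cached, what happens when the file is unreachable, how lines should be matched, and so on.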

Google is working together with Martijn Koster, the original author of the protocol, webmasters, and other search engines to create a proposal to submit to the Internet Engineering Task Force (IETF) for standardizing the REP:

The proposed REP draft reflects over 20 years of real world experience of relying on robots.txt rules, used both by Googlebot and other major crawlers, as well as about half a billion websites that rely on REP. These fine grained controls give the publisher the power to decide what they’d like to be crawled on their site and potentially shown to interested users. It doesn’t change the rules created in 1994, but rather defines essentially all undefined scenarios for robots.txt parsing and matching, and extends it for the modern web.

The proposed specification includes several major items that webmasters and developers will want to review. It extends the use of robots.txt to any URI-based transfer protocol (FTP, CoAP, et al.), instead of limiting it to HTTP. It also defines a new maximum caching time of 24 hours and lets website owners update robots.txt whenever they choose, without having crawlers overload their sites with requests. If a previously accessible robots.txt file becomes inaccessible for whatever reason, crawlers will respect the known disallowed pages that were previously identified for “a reasonably long period of time.”

Google has also open sourced the C++ library it uses for parsing and matching rules in robots.txt files, along with a tool for testing the rules. Developers can use it to build parsers that follow the proposed REP requirements. The library has been updated to ensure that Googlebot only crawls what it’s allowed to and is now available on GitHub.

“This library has been around for 20 years and it contains pieces of code that were written in the 90’s,” Google’s Search Open Sourcing team said in the announcement. “Since then, the library evolved; we learned a lot about how webmasters write robots.txt files and corner cases that we had to cover for, and added what we learned over the years also to the internet draft when it made sense.”

Lizzi Harvey, who maintains Google’s Search developer docs, updated the robots.txt spec to match the REP draft. Check out the full list of changes if you want to compare your robots.txt file to the proposed spec. If the proposal for standardizing the REP is successfully adopted by the IETF, the days of googling and wading through undocumented robots.txt rules will soon be over.

A Look at WordPress Plugin Ecosystems

Among the many strengths of WordPress is the massive number of available plugins. There are tens of thousands, and those are just the free offerings. They handle all sorts of functionality, from security to image galleries to forms. Just about everything you could possibly want for your website is only a download away.

But it is the rare plugin that is so well-crafted and useful that it inspires a number of companion offerings to use alongside it. In many cases, they are among the most popular plugins out there. So popular and well-liked, in fact, that they have developed their very own ecosystems.

Today, we’ll take a look at the concept of WordPress plugin ecosystems. Along the way, we’ll show you some examples and discuss the advantages (and disadvantages) that come with adopting them into your website.

Prime Examples

Before we dig too deeply into the pros and cons, let’s see what a plugin ecosystem looks like. For our purposes, we’ll define it as such:

  • A “base” or “core” plugin that works on its own, but also has multiple add-on plugins available;
  • Add-ons that may be created by the original author, or by outside developers within the WordPress community;
  • Plugins that can be free, commercial, or any combination thereof.

In short, this means that the term “ecosystem” is rather flexible. It might be that a plugin’s author has created the base and all add-ons themselves. Or, other developers out there may have decided to build their own extensions. Either way, we have a group of related plugins that can scale up functionality based on need.

Here are a few prime examples we can use to better illustrate the concept:

WooCommerce

Perhaps the most well-known plugin ecosystem, WooCommerce turns your website into an online store. The core plugin adds shopping cart functionality and related features for things like shipping and accepting payments. However, it is capable of so much more.

Through the use of add-ons (WooCommerce refers to them as “extensions”), you can leverage the cart for all sorts of niche functionality. Among the more basic features is the ability to work with a wider variety of payment gateways and shipping providers. But you can also add some advanced capabilities, such as selling membership subscriptions or event tickets.

Gravity Forms

Here’s a great example of a plugin whose ecosystem has taken a core concept and expanded it immensely. Gravity Forms is a form-building plugin, which already includes a lot of advanced functionality. Yet add-ons allow it to perform tasks well beyond what you’d expect from your standard contact form.

Through a community that both includes and goes beyond the plugin’s original author, add-ons allow for a number of advanced tasks. You can accept payments, run polls or surveys, connect with third-party service providers, view and manipulate entry data, and a whole lot more. It may be one of the best examples of how an ecosystem provides nearly endless flexibility.


Something to Build On

One of the biggest advantages to buying into one of these plugin ecosystems is that you can add what you need, when you need it. Think of it as a building. The base plugin provides you with a solid foundation (and maybe a floor or two). Then, you can add as many floors as it takes to fulfill your needs.

Sometimes, that first core plugin is all you need. But even then, you still have the blueprints to build upon should you want to expand later.

Another potential benefit is that these plugins tend to have been built with expansion in mind. That means that you don’t necessarily have to rely on official or even community-based add-ons. If you have some programming knowledge, you might be able to add functionality by building it yourself.

Plus, by utilizing a related set of plugins, you can avoid one of the more frustrating parts of WordPress site development. So often, we attempt to bring many disparate pieces together to form some sort of cohesively functioning website.

This often means using plugins that were never meant to necessarily work together, which can lead to problems when attempting to make it all run seamlessly. In theory, this shouldn’t be an issue when you tap into an ecosystem.


Potential Drawbacks

Despite the many advantages to using a set of related plugins, there are some possible downsides to consider. Among the most common:

It Can Get Expensive

For plugins with commercial add-ons, you may find yourself being nickeled and dimed for each piece of functionality you’d like to add. WooCommerce is a classic example, where each official add-on requires a yearly investment. That’s not to say it isn’t worth the cost – it very well may be. Rather, it is a potential obstacle for the budget-conscious.

Not Everything You Want Is Available

This is something you’ll want to check before making any decisions as to how you’ll build your site. It may be that a base plugin and a selection of add-ons will get you 90% of the functionality you need. However, that missing 10% could be a big deal.

If a companion plugin doesn’t cover this, you might have to either look elsewhere or build it yourself. That could lead to some unexpected issues when it comes to both compatibility and cost. Otherwise, you may be left waiting and hoping that the missing functionality is added at a later date.

Unofficial Add-Ons May Not Keep Pace

Plugins are updated with new features and bug fixes all the time. Sometimes, those updates can be major – and that poses a risk when using unofficial add-ons built by community members. Updating the base plugin could mean having to abandon a particular add-on.

One way to avoid this potential issue is to stick with official add-ons only. If you do utilize those from unofficial sources, look for plugins that are frequently updated. They are more likely to adapt to any major upgrades.


A Compelling Option

In the right situation, a WordPress plugin with its own ecosystem can be your best option. This is especially true when you are building a website in which a single plugin fulfills the core part of your mission.

For instance, an eCommerce site will want to use a shopping cart that can be expanded to meet the specific requirements of the store. This provides the best opportunity for future growth and will help you avoid a costly switch later on.

Of course, there are some potential negatives to consider. But with some due diligence, you may just find a collection of plugins that will successfully power your WordPress website for years to come.

3D Scanning at Home

I used to think that 3D scanning was something that had to be done in a lab or using expensive equipment, but thanks to Steve from CG Geek, I learned that it can be done with some awesome free software and equipment that I already have.

In his tutorial, Steve demonstrates how you can capture a large object, and I highly recommend watching it because he explains the process very well. What I'll be discussing in this post is capturing a small object using a slightly different technique, which should be easier at this size. If you prefer to follow along with a video, I've got you covered:

Tom’s Tech Notes: Container Advice for Devs [Podcast]

Welcome to our latest episode of Tom's Tech Notes! Let's dive into how to get proficient with containers. Spoiler alert: you're going to be learning Kubernetes. But in addition to orchestration, there are a number of best practices to consider as well.

As a primer and a reminder from our initial post, these podcasts are compiled from conversations our analyst Tom Smith has had with industry experts from around the world as part of his work on our research guides.

Protecting Privacy While Keeping Detailed Date Information

A common attempt to protect privacy is to truncate dates to just the year. For example, the Safe Harbor provision of the HIPAA Privacy Rule says to remove “all elements of dates (except year) for dates that are directly related to an individual …”. This restriction exists because dates of service can be used to identify people, as explained here.

Unfortunately, truncating dates to just the year ruins the utility of some data. For example, suppose you have a database of millions of individuals and you’d like to know how effective an ad campaign was. If all you have are dates at year resolution, you can hardly answer such questions. You could tell whether sales of an item were up from one year to the next, but you couldn’t see, for example, what happened to sales in the weeks following the campaign.
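Here is a toy illustration of that loss of resolution (invented data, not from the original post), comparing year-truncated dates with full dates:

```ts
// Invented sales dates; the campaign ran on 2019-07-01.
const sales: Date[] = [
  new Date("2019-06-03"),
  new Date("2019-06-28"),
  new Date("2019-07-02"),
  new Date("2019-07-04"),
  new Date("2019-07-05"),
];
const campaign = new Date("2019-07-01");

// HIPAA-style truncation keeps only the year...
const byYear = new Map<number, number>();
for (const d of sales) {
  byYear.set(d.getFullYear(), (byYear.get(d.getFullYear()) ?? 0) + 1);
}
console.log(byYear); // Map(1) { 2019 => 5 } -- no before/after signal at all

// ...whereas full dates let you split on the campaign date.
const after = sales.filter((d) => d.getTime() >= campaign.getTime()).length;
console.log(`sales on or after the campaign date: ${after}`); // 3
```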

Facebook Authentication and Authorization in Server-Side Blazor App

Introduction

The latest preview for .NET Core 3 (preview-6) has introduced the functionality to add authentication and authorization in a server-side Blazor application. In this article, we will learn how to implement authentication and authorization using Facebook in a server-side Blazor application. You can refer to my previous article Understanding Server-side Blazor to get in-depth knowledge on server-side Blazor.

Prerequisites

  • Install the latest .NET Core 3.0 Preview SDK from here.
  • Install the latest preview of Visual Studio 2019 from here.
  • Install ASP.NET Core Blazor Language Services extension from here.

Source Code

Get the source code from GitHub.

Concurrency and Locking With JPA: Everything You Need to Know

Imagine you have a system used by multiple users where each user is trying to modify the same entity concurrently. How do you ensure that the underlying data's integrity is preserved when accessed concurrently?

Persistence providers offer locking strategies to manage concurrency. There are two types of locking: optimistic locking and pessimistic locking. Before we dive deep into these strategies, let's learn a little bit about ACID transactions.
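In JPA, the optimistic strategy is driven by a version attribute on the entity (the @Version annotation): every update carries the version the transaction read, and the write fails if another transaction has bumped it in the meantime. Here is a rough, framework-free sketch of that idea (my own illustration in TypeScript, not JPA's API):

```ts
// Framework-free sketch of optimistic locking: each row carries a version
// number, and an update only succeeds if the caller's version still matches,
// mimicking "UPDATE ... SET ..., version = version + 1
//            WHERE id = ? AND version = ?".
interface Account { id: number; balance: number; version: number }

class OptimisticLockError extends Error {}

class AccountTable {
  private rows = new Map<number, Account>();

  insert(row: Account): void { this.rows.set(row.id, { ...row }); }
  read(id: number): Account { return { ...this.rows.get(id)! }; }

  update(changed: Account): void {
    const current = this.rows.get(changed.id)!;
    if (current.version !== changed.version) {
      // Another transaction committed first; the caller must re-read and retry.
      throw new OptimisticLockError(`stale version ${changed.version}`);
    }
    this.rows.set(changed.id, { ...changed, version: changed.version + 1 });
  }
}

// Two concurrent "transactions" read the same row; the slower write fails.
const table = new AccountTable();
table.insert({ id: 1, balance: 100, version: 0 });
const t1 = table.read(1);
const t2 = table.read(1);
table.update({ ...t1, balance: 150 }); // succeeds, version becomes 1
table.update({ ...t2, balance: 90 });  // throws OptimisticLockError
```

Pessimistic locking takes the opposite approach: acquire a database lock up front so conflicting writers block instead of failing.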

Validate Street Addresses With Vue.js and the HERE Geocoder Autocomplete API

When it comes to geocoding, converting addresses to latitude and longitude coordinates so they can be displayed on a map is not the only use case. Often, being able to geocode an address also makes for great address validation, since it tells you whether an address actually exists. Take, for example, a tutorial written by Jayson DeLancey titled Street Address Form Validation with React.js and HERE Geocoder Autocomplete. In that tutorial, he demonstrated how to accept user input, offer suggestions, and ultimately check whether addresses are valid using React and the APIs found in the HERE Developer Portal.

We're going to change it up a bit. Instead of using React, we're going to try to validate addresses using Vue.js, another very popular framework for web development.
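As a rough sketch of the shape this takes (my own illustration; the endpoint URL, query parameters, and response shape below are placeholders, so consult the HERE Developer Portal for the real Geocoder Autocomplete contract), a Vue component can debounce the input and fetch suggestions as the user types:

```ts
import Vue from "vue";

// Placeholder endpoint -- substitute the documented HERE Geocoder
// Autocomplete URL and your own credentials from the HERE Developer Portal.
const SUGGEST_URL = "https://example-autocomplete.api.here.invalid/suggest";

export default Vue.extend({
  data() {
    return { query: "", suggestions: [] as string[], timer: 0 };
  },
  methods: {
    // Bound to the address <input> via v-model="query" and @input="onInput".
    onInput(): void {
      window.clearTimeout(this.timer); // debounce: don't hit the API per keystroke
      this.timer = window.setTimeout(this.fetchSuggestions, 300);
    },
    async fetchSuggestions(): Promise<void> {
      if (this.query.trim().length < 3) {
        this.suggestions = [];
        return;
      }
      const res = await fetch(
        `${SUGGEST_URL}?query=${encodeURIComponent(this.query)}&apiKey=YOUR_KEY`
      );
      const body = await res.json();
      // The `suggestions`/`label` shape is an assumption; adapt to the real response.
      this.suggestions = (body.suggestions ?? []).map((s: any) => s.label);
    },
  },
});
```

An address can then be treated as valid when the user's final input matches one of the returned suggestions, mirroring the approach in the React tutorial.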

Are You Practicing Agile Accountability Responsibly?

Does your team need some work on accountability?

I started out this blog post writing about shared accountability for agile program teams. Accountability is an interesting, large topic that gets skipped over quite often in our agile community. In fact, while writing, I realized that we don't have a great track record of defining it. So here's my take.


Caution! I am trying to pare down my mind map of points to discuss in this blog. Apparently, I am not doing a great job, and I'll own that. The way that I'll own it is by committing to write to a conclusion about:

The AI System That Can Detect Lung Cancer

A central tranche of AI work in healthcare today has been spotting signs of problems in medical imaging faster than humans currently can. The latest example comes via a new study from Google and Northwestern Medicine, which proposes to improve the detection of lung cancer using deep learning.

“Radiologists generally examine hundreds of two-dimensional images or ‘slices’ in a single CT scan, but this new machine learning system views the lungs in a huge, single three-dimensional image,” the researchers explain. “AI in 3D can be much more sensitive in its ability to detect early lung cancer than the human eye looking at 2-D images. This is technically ‘4D’ because it is not only looking at one CT scan, but two (the current and prior scan) over time.”

#228: 2019 Trends

Show Description

We're halfway through 2019! Marie and Cassidy are here to talk about what's trending on CodePen so far and take a glimpse at what the rest of the year might hold.

Time Jumps

Sponsor

18:41 .TECH domains

If you belong to the tech industry, it’s ideal for you to choose a .TECH domain name! Be it your tech startup, community, project, event or your personal brand, a .TECH domain will make a strong “tech” positioning for your tech website.

Get 90% OFF on .TECH Domains for a limited time!

Go to go.tech/codepen

