From a Single Repo, to Multi-Repos, to Monorepo, to Multi-Monorepo

I’ve been working on the same project for several years. Its initial version was a huge monolithic app containing thousands of files. It was poorly architected and non-reusable, but was hosted in a single repo making it easy to work with. Later, I “fixed” the mess in the project by splitting the codebase into autonomous packages, hosting each of them on its own repo, and managing them with Composer. The codebase became properly architected and reusable, but being split across multiple repos made it a lot more difficult to work with.

As the code was re-architected time and again, its hosting also had to adapt, going from the initial single repo, to multiple repos, to a monorepo, to what may be called a “multi-monorepo.”

Let me take you on the journey of how this took place, explaining why and when I felt I had to switch to a new approach. The journey consists of four stages (so far!) so let’s break it down like that.

Stage 1: Single repo

The project is leoloso/PoP and it’s been through several hosting schemes, following how its code was re-architected at different times.

It was born as this WordPress site, comprising a theme and several plugins. All of the code was hosted together in the same repo.

Some time later, I needed another site with similar features so I went the quick and easy way: I duplicated the theme and added its own custom plugins, all in the same repo. I got the new site running in no time.

I did the same for another site, and then another one, and another one. Eventually the repo was hosting some 10 sites, comprising thousands of files.

A single repository hosting all our code.

Issues with the single repo

While this setup made it easy to spin up new sites, it didn’t scale well at all. The big thing is that a single change involved searching for the same string across all 10 sites. That was completely unmanageable. Let’s just say that copy/paste/search/replace became a routine thing for me.

So it was time to start coding PHP the right way.

Stage 2: Multirepo

Fast forward a couple of years. I completely split the application into PHP packages, managed via Composer and dependency injection.

Composer uses Packagist as its main PHP package repository. To publish a package, Packagist requires a composer.json file placed at the root of the package’s repo. That means we cannot host multiple PHP packages, each with its own composer.json, on the same repo.

As a consequence, I had to switch from hosting all of the code in the single leoloso/PoP repo, to using multiple repos, with one repo per PHP package. To help manage them, I created the organization “PoP” in GitHub and hosted all repos there, including getpop/root, getpop/component-model, getpop/engine, and many others.

In the multirepo, each package is hosted on its own repo.

Issues with the multirepo

Handling a multirepo can be easy when you have a handful of PHP packages. But in my case, the codebase comprised over 200 PHP packages. Managing them was no fun.

The project was split into so many packages because I also decoupled the code from WordPress (so that it could be used with other CMSs too), which requires every package to be very granular, dealing with a single goal.

Now, 200 packages is not ordinary. But even if a project comprises only 10 packages, it can be difficult to manage across 10 repositories. That’s because every package must be versioned, and every version of a package depends on some version of another package. When creating pull requests, we need to configure the composer.json file on every package to use the corresponding development branch of its dependencies. It’s cumbersome and bureaucratic.

I ended up not using feature branches at all, simply pointing every package to the dev-master version of its dependencies (i.e. I was not versioning packages). I wouldn’t be surprised to learn that this is a common practice.
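In practice, that meant every package’s composer.json required the unstable branch of its siblings. A sketch of what such a require block looks like (the dependency list here is illustrative, not the package’s actual one):

```json
{
  "name": "getpop/engine",
  "require": {
    "getpop/root": "dev-master",
    "getpop/component-model": "dev-master"
  },
  "minimum-stability": "dev",
  "prefer-stable": true
}
```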

There are tools to help manage multiple repos, like meta. It creates a project composed of multiple repos, and running git commit -m "some message" on the project executes git commit -m "some message" on every repo, keeping them in sync with each other.

However, meta will not help manage the versioning of each dependency on their composer.json file. Even though it helps alleviate the pain, it is not a definitive solution.

So, it was time to bring all packages to the same repo.

Stage 3: Monorepo

The monorepo is a single repo that hosts the code for multiple projects. Since it hosts different packages together, we can version control them together too. This way, all packages can be published with the same version, and linked across dependencies. This makes pull requests very simple.

The monorepo hosts multiple packages.

As I mentioned earlier, we are not able to publish PHP packages to Packagist if they are hosted on the same repo. But we can overcome this constraint by decoupling development and distribution of the code: we use the monorepo to host and edit the source code, and multiple repos (one repo per package) to publish them to Packagist for distribution and consumption.

The monorepo hosts the source code, multiple repos distribute it.

Switching to the Monorepo

Switching to the monorepo approach involved the following steps:

First, I created the folder structure in leoloso/PoP to host the multiple projects. I decided to use a two-level hierarchy, first under layers/ to indicate the broader project, and then under packages/, plugins/, clients/ and whatnot to indicate the category.

Showing the GitHub repo for a project called PoP. The screen is in dark mode, so the background is near black and the text is off-white, except for blue links.
The monorepo layers indicate the broader project.

Then, I copied all source code from all repos (getpop/engine, getpop/component-model, etc.) to the corresponding location for that package in the monorepo (i.e. layers/Engine/packages/engine, layers/Engine/packages/component-model, etc).

I didn’t need to keep the Git history of the packages, so I just copied the files with Finder. Otherwise, we can use hraban/tomono or shopsys/monorepo-tools to port repos into the monorepo, while preserving their Git history and commit hashes.

Next, I updated the description of all downstream repos, to start with [READ ONLY], such as this one.

Showing the GitHub repo for the component-model project. The screen is in dark mode, so the background is near black and the text is off-white, except for blue links. There is a sidebar to the right of the screen that is next to the list of files in the repo. The sidebar has an About heading with a description that reads: “[READ ONLY] Component model for PoP, over which the component-based architecture is based.” This is highlighted in red.
The downstream repo’s “READ ONLY” is located in the repo description.

I executed this task in bulk via GitHub’s GraphQL API. I first obtained all of the descriptions from all of the repos, with this query:

{
  repositoryOwner(login: "getpop") {
    repositories(first: 100) {
      nodes {
        id
        name
        description
      }
    }
  }
}

…which returned a list like this:

{
  "data": {
    "repositoryOwner": {
      "repositories": {
        "nodes": [
          {
            "id": "MDEwOlJlcG9zaXRvcnkxODQ2OTYyODc=",
            "name": "hooks",
            "description": "Contracts to implement hooks (filters and actions) for PoP"
          },
          {
            "id": "MDEwOlJlcG9zaXRvcnkxODU1NTQ4MDE=",
            "name": "root",
            "description": "Declaration of dependencies shared by all PoP components"
          },
          {
            "id": "MDEwOlJlcG9zaXRvcnkxODYyMjczNTk=",
            "name": "engine",
            "description": "Engine for PoP"
          }
        ]
      }
    }
  }
}

From there, I copied all descriptions, added [READ ONLY] to them, and for every repo generated a new query executing the updateRepository GraphQL mutation:

mutation {
  updateRepository(
    input: {
      repositoryId: "MDEwOlJlcG9zaXRvcnkxODYyMjczNTk="
      description: "[READ ONLY] Engine for PoP"
    }
  ) {
    repository {
      description
    }
  }
}

Finally, I introduced tooling to help “split the monorepo.” Using a monorepo relies on synchronizing the code between the upstream monorepo and the downstream repos, triggered whenever a pull request is merged. This action is called “splitting the monorepo.” Splitting the monorepo can be achieved with a git subtree split command but, because I’m lazy, I’d rather use a tool.
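For reference, here is a minimal, self-contained sketch of what that command does, using a throwaway repo and an illustrative package path (not the actual PoP layout), and assuming Git’s contrib subtree command is installed:

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# A throwaway "monorepo" with a single package inside it
git init -q monorepo
cd monorepo
git config user.email demo@example.com
git config user.name Demo
mkdir -p layers/Engine/packages/engine
echo '{}' > layers/Engine/packages/engine/composer.json
git add . && git commit -qm "Add engine package"

# Produce a branch whose history contains only the package's files,
# ready to be pushed to the package's read-only downstream repo
git subtree split --prefix=layers/Engine/packages/engine -b engine-only
git ls-tree --name-only engine-only  # prints "composer.json"
```

The engine-only branch now has composer.json at its root, which is exactly what Packagist expects to find in the downstream repo.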

I chose Monorepo builder, which is written in PHP. I like this tool because I can customize it with my own functionality. Other popular tools are the Git Subtree Splitter (written in Go) and Git Subsplit (bash script).

What I like about the Monorepo

I feel at home with the monorepo. The speed of development has improved because dealing with 200 packages feels pretty much like dealing with just one. The boost is most evident when refactoring the codebase, i.e. when executing updates across many packages.

The monorepo also allows me to release multiple WordPress plugins at once. All I need to do is provide a configuration to GitHub Actions via PHP code (when using the Monorepo builder) instead of hard-coding it in YAML.

To generate a WordPress plugin for distribution, I had created a generate_plugins.yml workflow that triggers when creating a release. With the monorepo, I have adapted it to generate not just one, but multiple plugins, configured via PHP through a custom command in plugin-config-entries-json, and invoked like this in GitHub Actions:

- id: output_data
  run: |
    echo "::set-output name=plugin_config_entries::$(vendor/bin/monorepo-builder plugin-config-entries-json)"

This way, I can generate my GraphQL API plugin and other plugins hosted in the monorepo all at once. The configuration defined via PHP is this one.

class PluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API for WordPress
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-for-wp',
        'zip_file' => 'graphql-api.zip',
        'main_file' => 'graphql-api.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'graphql-api-for-wp-dist',
      ],
      // GraphQL API - Extension Demo
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/extension-demo',
        'zip_file' => 'graphql-api-extension-demo.zip',
        'main_file' => 'graphql-api-extension-demo.php',
        'dist_repo_organization' => 'GraphQLAPI',
        'dist_repo_name' => 'extension-demo-dist',
      ],
    ];
  }
}

When creating a release, the plugins are generated via GitHub Actions.

Dark mode screen in GitHub showing the actions for the project.
This figure shows plugins generated when a release is created.

If, in the future, I add the code for yet another plugin to the repo, it will also be generated without any trouble. Investing some time and energy producing this setup now will definitely save plenty of time and energy in the future.

Issues with the Monorepo

I believe the monorepo is particularly useful when all packages are coded in the same programming language, tightly coupled, and relying on the same tooling. If instead we have multiple projects based on different programming languages (such as JavaScript and PHP), composed of unrelated parts (such as the main website code and a subdomain that handles newsletter subscriptions), or tooling (such as PHPUnit and Jest), then I don’t believe the monorepo provides much of an advantage.

That said, there are downsides to the monorepo:

  • We must use the same license for all of the code hosted in the monorepo; otherwise, we’re unable to add a LICENSE.md file at the root of the monorepo and have GitHub pick it up automatically. Indeed, leoloso/PoP initially provided several libraries using MIT and the plugin using GPLv2. So, I decided to simplify it using the lowest common denominator between them, which is GPLv2.
  • There is a lot of code, a lot of documentation, and plenty of issues, all from different projects. As such, potential contributors that were attracted to a specific project can easily get confused.
  • When tagging the code, all packages are versioned together with that tag, whether their particular code was updated or not. This is an issue with the Monorepo builder and not necessarily with the monorepo approach (Symfony has solved this problem for its monorepo).
  • The issues board needs proper management. In particular, it requires labels to assign issues to the corresponding project, or risk it becoming chaotic.
Showing the list of reported issues for the project in GitHub in dark mode. The image shows just how crowded and messy the screen looks when there are a bunch of issues from different projects in the same list without a way to differentiate them.
The issues board can become chaotic without labels that are associated with projects.

All these issues are not roadblocks though. I can cope with them. However, there is an issue that the monorepo cannot help me with: hosting both public and private code together.

I’m planning to create a “PRO” version of my plugin, which I plan to host in a private repo. However, a repo’s code is either public or private, so I’m unable to host my private code in the public leoloso/PoP repo. At the same time, I want to keep using my setup for the private repo too, particularly the generate_plugins.yml workflow (which already scopes the plugin and downgrades its code from PHP 8.0 to 7.1) and the ability to configure it via PHP. And I want to keep it DRY, avoiding copy/pastes.

It was time to switch to the multi-monorepo.

Stage 4: Multi-monorepo

The multi-monorepo approach consists of different monorepos sharing their files with each other, linked via Git submodules. At its most basic, a multi-monorepo comprises two monorepos: an autonomous upstream monorepo, and a downstream monorepo that embeds the upstream repo as a Git submodule that’s able to access its files:

A giant red folder illustration is labeled as the downstream monorepo and it contains a smaller green folder showing the upstream monorepo.
The upstream monorepo is contained within the downstream monorepo.

This approach satisfies my requirements by:

  • having the public repo leoloso/PoP be the upstream monorepo, and
  • creating a private repo leoloso/GraphQLAPI-PRO that serves as the downstream monorepo.
The same illustration as before, but now the large folder is bright pink and labeled with the private project’s name, and the smaller folder is purplish-blue and labeled with the name of the public upstream monorepo.
A private monorepo can access the files from a public monorepo.

leoloso/GraphQLAPI-PRO embeds leoloso/PoP under subfolder submodules/PoP (notice how GitHub links to the specific commit of the embedded repo):

This figure shows how the public monorepo is embedded within the private monorepo in the GitHub project.

Now, leoloso/GraphQLAPI-PRO can access all the files from leoloso/PoP. For instance, script ci/downgrade/downgrade_code.sh from leoloso/PoP (which downgrades the code from PHP 8.0 to 7.1) can be accessed under submodules/PoP/ci/downgrade/downgrade_code.sh.
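The embedding itself is plain Git. Here is a minimal, self-contained sketch of the setup, using throwaway local repos in place of the real GitHub ones (newer Git versions require explicitly allowing the file protocol for local-path submodules):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the public upstream monorepo (leoloso/PoP)
git init -q upstream
git -C upstream -c user.email=demo@example.com -c user.name=Demo \
  commit -q --allow-empty -m "Upstream initial commit"

# Stand-in for the private downstream monorepo (leoloso/GraphQLAPI-PRO)
git init -q downstream
cd downstream

# Embed the upstream repo under submodules/PoP, pinned to a specific commit
git -c protocol.file.allow=always submodule add "$workdir/upstream" submodules/PoP
git submodule status
```

From here, the downstream repo can reference any upstream file through the submodules/PoP/ prefix, such as submodules/PoP/ci/downgrade/downgrade_code.sh.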

In addition, the downstream repo can load the PHP code from the upstream repo and even extend it. This way, the configuration to generate the public WordPress plugins can be overridden to produce the PRO plugin versions instead:

class PluginDataSource extends UpstreamPluginDataSource
{
  public function getPluginConfigEntries(): array
  {
    return [
      // GraphQL API PRO
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/graphql-api-pro',
        'zip_file' => 'graphql-api-pro.zip',
        'main_file' => 'graphql-api-pro.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-pro-dist',
      ],
      // GraphQL API Extensions
      // Google Translate
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/google-translate',
        'zip_file' => 'graphql-api-google-translate.zip',
        'main_file' => 'graphql-api-google-translate.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-google-translate-dist',
      ],
      // Events Manager
      [
        'path' => 'layers/GraphQLAPIForWP/plugins/events-manager',
        'zip_file' => 'graphql-api-events-manager.zip',
        'main_file' => 'graphql-api-events-manager.php',
        'dist_repo_organization' => 'GraphQLAPI-PRO',
        'dist_repo_name' => 'graphql-api-events-manager-dist',
      ],
    ];
  }
}

GitHub Actions will only load workflows from under .github/workflows, and the upstream workflows are under submodules/PoP/.github/workflows; hence we need to copy them. This is not ideal, though we can avoid editing the copied workflows and treat the upstream files as the single source of truth.

To copy the workflows over, a simple Composer script can do:

{
  "scripts": {
    "copy-workflows": [
      "php -r \"copy('submodules/PoP/.github/workflows/generate_plugins.yml', '.github/workflows/generate_plugins.yml');\"",
      "php -r \"copy('submodules/PoP/.github/workflows/split_monorepo.yaml', '.github/workflows/split_monorepo.yaml');\""
    ]
  }
}

Then, each time I edit the workflows in the upstream monorepo, I also copy them to the downstream monorepo by executing the following command:

composer copy-workflows

Once this setup is in place, the private repo generates its own plugins by reusing the workflow from the public repo:

This figure shows the PRO plugins generated in GitHub Actions.

I am extremely satisfied with this approach. I feel it has removed all of the burden from my shoulders concerning the way projects are managed. I read about a WordPress plugin author complaining that managing the releases of his 10+ plugins was taking a considerable amount of time. That doesn’t happen here—after I merge my pull request, both public and private plugins are generated automatically, like magic.

Issues with the multi-monorepo

First off, it leaks. Ideally, leoloso/PoP should be completely autonomous and unaware that it is used as an upstream monorepo in a grander scheme—but that’s not the case.

When doing git checkout, the downstream monorepo must pass the --recurse-submodules option so as to also check out the submodules. In the GitHub Actions workflows for the private repo, the checkout must be done like this:

- uses: actions/checkout@v2
  with:
    submodules: recursive

As a result, we have to pass submodules: recursive to the downstream workflow, but not to the upstream one, even though they both use the same source file.

To solve this while maintaining the public monorepo as the single source of truth, the workflows in leoloso/PoP receive the value for submodules via an environment variable, CHECKOUT_SUBMODULES, like this:

env:
  CHECKOUT_SUBMODULES: ""

jobs:
  provide_data:
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: ${{ env.CHECKOUT_SUBMODULES }}

The environment value is empty for the upstream monorepo, so doing submodules: "" works well. And then, when copying over the workflows from upstream to downstream, I replace the value of the environment variable to "recursive" so that it becomes:

env:
  CHECKOUT_SUBMODULES: "recursive"

(I have a PHP command to do the replacement, but we could also pipe sed in the copy-workflows composer script.)
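That sed-based alternative could look something like this self-contained sketch (the workflow content here is a stand-in for the real file):

```shell
set -e
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p submodules/PoP/.github/workflows .github/workflows

# Stand-in for the upstream workflow file
printf 'env:\n  CHECKOUT_SUBMODULES: ""\n' \
  > submodules/PoP/.github/workflows/generate_plugins.yml

# Copy the workflow downstream, flipping the submodules flag on the way
sed 's/CHECKOUT_SUBMODULES: ""/CHECKOUT_SUBMODULES: "recursive"/' \
  submodules/PoP/.github/workflows/generate_plugins.yml \
  > .github/workflows/generate_plugins.yml

cat .github/workflows/generate_plugins.yml
```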

This leakage reveals another issue with the setup: I must review all contributions to the public repo before they are merged, or they could break something downstream. The contributors would also be completely unaware of those leakages (and they couldn’t be blamed for them). This situation is specific to the public/private-monorepo setup, where I am the only person who is aware of the full picture. While I share access to the public repo, I am the only one accessing the private one.

As an example of how things could go wrong, a contributor to leoloso/PoP might remove CHECKOUT_SUBMODULES: "" since it is superfluous. What the contributor doesn’t know is that, while that line is not needed, removing it will break the private repo.

I guess I need to add a warning!

env:
  ### ☠️ Do not delete this line! Or bad things will happen! ☠️
  CHECKOUT_SUBMODULES: ""

Wrapping up

My repo has gone through quite a journey, being adapted to the new requirements of my code and application at different stages:

  • It started as a single repo, hosting a monolithic app.
  • It became a multirepo when splitting the app into packages.
  • It was switched to a monorepo to better manage all the packages.
  • It was upgraded to a multi-monorepo to share files with a private monorepo.

Context means everything, so there is no “best” approach here—only solutions that are more or less suitable to different scenarios.

Has my repo reached the end of its journey? Who knows? The multi-monorepo satisfies my current requirements, but it hosts all private plugins together. If I ever need to grant contractors access to a specific private plugin, while preventing them from accessing other code, then the monorepo may no longer be the ideal solution for me, and I’ll need to iterate again.

I hope you have enjoyed the journey. And, if you have any ideas or examples from your own experiences, I’d love to hear about them in the comments.


The post From a Single Repo, to Multi-Repos, to Monorepo, to Multi-Monorepo appeared first on CSS-Tricks.

How Be and the Muffin Builder will reinvent the way you build websites

Page builders have revolutionized the website building process. As is the case with many other WordPress tools, the number of available options can make it difficult to find the best one for your particular needs. 

With BeTheme’s exciting new page builder, your quest may be at an end.

See for yourself how the new Muffin Live Builder tool provides you and other web designers with a more intuitive and faster way to create superior websites for clients:

You have more than likely come across various live editing features in other page builders. But nothing like the way in which they have been placed at your fingertips in this live builder. And in a way that will transform your website building workflows and processes.

Let’s check out five of them.

Muffin page builder features that will change your approach to building websites

At first glance you might think there’s not all that much new about the Muffin Live Builder; but see how it differs from the others.

1. The Muffin Live Builder is a part of a fully-loaded website builder platform

Finding the right WordPress page builder plugin to work with can be frustrating. One of the problems you may encounter is a need to upgrade to get features you need but are missing. If you are already paying for a premium theme, why would you want to pay even more for an additional design and editing tool?

With BeTheme you don’t have that problem since everything you need, including a premium page builder, is already included in your license: 

  • 600+ pre-built professionally crafted, customizable websites
  • 100+ pre-built section templates you can mix and match
  • 60+ customizable elements
  • The Muffin Builder and Live Builder premium page builders
  • WooCommerce, Slider Revolution, Contact Form 7, and other premium plugins
  • Top-of-the-line 24/7 support

With BeTheme you have at your fingertips an all-in-one solution for creating WordPress websites that will bring smiles to the faces of your clients or visitors.  

2. BeTheme comes with Pre-built sections you can use to create any type of website to serve any purpose

BeTheme’s pre-built website sections encompass features and functionalities that effectively address the majority of design and content decisions you are likely to be faced with.

These features and functionalities include basic website structure, home page and internal page layouts, and image and text placement.

That’s not to say that a given pre-built website will satisfy every client’s needs.

For example: You’re building an online surf shop for a client. Be Surfing fills the bill, with one exception. Your client wants to have a carousel on the home page to promote recent blog posts. 

The Live Builder takes care of that problem by enabling you to add a new section (in this case the carousel section) by using an existing pre-built section or creating it yourself. 

Pre-designed sections aren’t new to page builders, but when they are included, as is the case with the Live Builder, you won’t waste time searching for a section that matches your site or having to strip out existing designs or content to make things fit.

The Live Builder features more than 100 of these pre-built sections for you to choose from. They are nicely organized into “Call to Action”, “Contact”, and other practical categories. 

Since these pre-built section templates more closely resemble wireframes than fully designed sections, they can easily be repurposed to fit in with any type or style of website you might happen to be working on.

3. The Live Builder’s adeptly-organized toolbars make customizing sites a simple task

A shortcoming among most visual editing page builders is that their toolbars don’t always give users clear insight into what their functions actually are. This can force the web designer to search for the settings they need to configure.

The Live Builder avoids this problem. Any time a new element is added, or one is selected for editing, a custom toolbar appears on the left of the screen:

Every available setting for that particular element is displayed on the toolbar. You don’t have to waste time trying to figure out where the settings you need to access are located. Everything is in one place and right in front of you. 

The Live Editor’s admin toolbar is equally well organized and, like the information in the settings toolbar, the most frequently used admin actions are located in one place on the left of the screen.

The Live Builder makes it so easy to get around. Whatever admin action you need to invoke, whether it’s Save, Undo, Return to WordPress, or something else, you won’t find yourself having to dig through layers of information before you can take the desired action.

4. Live Builder’s Autosave and backup tools come in handy for maintaining version control

Among the actions that you can access from the admin toolbar are four different save tools, which combine with the standard Publish and Update tools to establish and maintain version control.  

These save features consist of: 

  • Autosave, which makes a copy of the page you are working on every five minutes
  • Updates, which stores a copy of a page when you hit the “Update” button
  • Revision, which enables you to save special versions of the page
  • Backup, which stores a copy of a page prior to restoring an older one

This feature makes it easy to create an historical record of the building of a page, automatically manage backups, and take you back to a preferred version with a single click. Put another way, it can shorten your website building workflows dramatically.

5. Lightweight editor for fast editing

It can be frustrating indeed when editing becomes a ponderous process because of the time it can take for a page builder to load. When you’re designing and/or editing multiple pages, and taking into account multiple revisions, you get the feeling that you could compose a symphonic work during the wait time you have to put up with. 

The Live Editor is so super lightweight that you barely have time to catch your breath before your work is right in front of you and ready to be acted on.

The 12-second clip you see here illustrates how quickly the Muffin page builder loads in real time: 

According to BeTheme developers, the Live Builder is 60% faster than its previous version. Of even greater importance is that the Live Builder, when stacked up against other premium WordPress page builders, does not keep you in suspense when you try to open a page.

Faster is better, and just how fast a page loads can be crucial.

The Core Web Vitals tool comes in handy for checking a website’s ranking factors, one of which is loading speed. The BeTheme team used this tool to compare the loading speed of a Be Mechanic website built with Elementor compared to the same website built with the Muffin Live Builder.

This is what they found for the Elementor-built site: 

The page with the largest byte content took 6.116 seconds to load. 

The team compared that with the same page on the Live Builder site:

It took only 1.232 seconds to load.

Google knows that when a page takes more than three seconds to load, visitors start leaving in droves to look for a site that might have something better to offer. 

That is why having a lightweight page builder capable of creating a speedier and more pleasant experience for your visitors is so very important.

Will you be choosing the new Muffin Live Builder for your next website building projects?

Between its intuitive design, easily accessible features and settings, and fast loading, the Muffin Live visual page builder can be a WordPress web designer’s best friend.  

As this sampling of rave reviews is quick to point out:


Don’t forget that the Muffin Live Builder is only one part of a powerful WordPress toolkit that features:

  • 600+ customizable pre-built websites
  • 100+ pre-built sections
  • 60+ elements
  • bundled premium plugins

and a host of other features at your fingertips for building websites that other web designers can only dream of.

The post How Be and the Muffin Builder will reinvent the way you build websites appeared first on Codrops.

Best Digital Marketing at affordable price ~ BY SDLCCorp.

Best Digital Marketing by SDLCCorp. Digital Marketing, as the term suggests, is the marketing of products and services via digital sources. When goods are marketed via the internet to increase page views of websites, and goods or services are sold through computers, that activity falls under digital marketing.

Visit us at:- https://www.sdlccorp.com/digital-marketing/
Call us at:- +1 (618)6154906

Digital marketing services in the United Kingdom

Branding is basically increasing the value of the brand; that is, all activities directed towards portraying the image of the brand and strategizing ways to put forward a positive image of it. Nowadays, branding of companies is also conducted via digital means, as these have a wide reach of consumers and allow for efficient marketing.

#digitalmarketingcompany #digitalmarketingagency #digitalmarketingstrategy #digitalmarketingexpert #digitalmarketingconsultant

for more information visit: https://www.sdlccorp.com/digital-marketing/
or call +1 (618) 6154906

Google Ignoring our Meta Description

My website’s ranking was in the top ten. Suddenly, Google has been ignoring our meta description and is now pulling a meta description from my About Us page content. After that, my website’s ranking dropped dramatically. Can anyone help me with this problem? My website is bismatrimony.com.

How to Add a Post Creation Limit for WordPress Users

Are you looking for a way to limit the number of posts a user can publish or submit for review?

Limiting post creation is helpful if you’re running a multi-author website or have many members submitting content. It allows you to control the number of articles a user can submit, so you can easily manage content on your site.

In this article, we will show you how to add a post creation limit for users on your WordPress site.

How to Add a Post Creation Limit in WordPress

Who Needs to Limit Post Creation in WordPress?

There are several use cases where you may want to limit the number of posts created by authors within a specific period of time.

A common scenario is a multi-author blog, where you may want to limit the number of posts each author can submit per day, per week, or per month.

Limiting posts per author makes it easier for you to manage the content on your website and improve the editorial workflow. You’ll have more time to review multiple articles and assign topics to different authors.

Or, let’s say you have a WordPress membership site that gives its members the ability to promote their content through announcements.

In that case, you may want to limit the number of announcements each member can post per week or month.

Another great use case is a listing directory. For example, on a real estate website where agents can add properties, you can limit the number of properties each agent can add per day or month.

That being said, let’s take a look at how to limit post submissions and creations by users in WordPress.

How to Limit Posts by Users in WordPress

You can easily limit posts for different users using a WordPress plugin without having to touch a single line of code.

For this tutorial, we’ll be using the User Post Limit plugin. It’s a free plugin, and you can use it to set up post limits in just a few clicks.

First, you’ll need to install and activate the User Post Limit plugin on your website. For more details, you can follow our guide on how to install a WordPress plugin.

Once the plugin is active, simply head over to Settings » Posts Limit from your WordPress dashboard. From there, you’ll see options to set post limits based on different user roles.

In the ‘Text’ field, the plugin lets you edit the notification that users will see when they exceed the limit. There are more advanced options you can change, but the default settings will work for most users.

User Posts Limit settings

Next, you can select a user role for which you’d like to set up a post limit. For example, you can select Author, Editor, Administrator, or any other roles from the dropdown menu.

After selecting the user role, go ahead and choose which type of content you’d like to limit. You’ll see multiple options in the dropdown menu like posts, pages, media, revisions, and more.

Once that’s done, enter a limit for the number of posts a user can submit and select a cycle, such as days, weeks, months, or years. When you’re done, don’t forget to click the ‘Save Changes’ button.

Change post limit settings

If you want to set up post limits for different user roles on your WordPress website, then change the Rules number and repeat the steps.

For example, you can set a post limit for authors, and then create a higher post limit for editors.

To create a new rule, simply change the Rules number to 2 and select a user role, such as Editor. Next, select the content type and enter the limit and cycle.

When you click the Save Changes button, you’ll see your new rule added under the first rule. You can go ahead and create as many post limit rules as you want for each user role.

Two different rules for post limits

That’s all. When users try to create more posts than the set limit, they will receive a notification like this:

Post limit notification
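If you prefer code over a plugin, the same idea can be sketched with core WordPress APIs. This is not the User Post Limit plugin's implementation; the role name (author), the limit of 5 posts, the 30-day cycle, and the message below are all illustrative assumptions:

```php
<?php
/**
 * A minimal sketch of a post creation limit using core WordPress hooks.
 * Hypothetical values: role "author", 5 posts per 30-day cycle.
 */
add_filter( 'wp_insert_post_data', function ( $data, $postarr ) {
	// Only enforce the limit for new posts, not updates to existing ones.
	if ( 'post' !== $data['post_type'] || ! empty( $postarr['ID'] ) ) {
		return $data;
	}

	// Only limit users with the "author" role (assumption).
	$user = wp_get_current_user();
	if ( ! in_array( 'author', (array) $user->roles, true ) ) {
		return $data;
	}

	// Count this author's posts from the last 30 days (the "cycle").
	$recent = get_posts( array(
		'author'      => $user->ID,
		'post_type'   => 'post',
		'post_status' => array( 'publish', 'pending' ),
		'date_query'  => array( array( 'after' => '30 days ago' ) ),
		'numberposts' => -1,
		'fields'      => 'ids',
	) );

	// Block the submission once the hypothetical limit is reached.
	if ( count( $recent ) >= 5 ) {
		wp_die( 'You have reached your post limit for this month.' );
	}

	return $data;
}, 10, 2 );
```

A plugin like User Post Limit wraps this kind of check in a settings screen, so you can configure the role, content type, limit, and cycle without writing any code.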

We hope this article helped you to add a post creation limit for WordPress users on your site. You may also want to check our guide on how to improve your editorial workflow in multi-author blogs and the best email marketing services.

If you liked this article, then please subscribe to our YouTube Channel for WordPress video tutorials. You can also find us on Twitter and Facebook.

The post How to Add a Post Creation Limit for WordPress Users appeared first on WPBeginner.

What benefits does SEO offer to a business?

#1. SEO RESULTS IN A HIGHER CONVERSION RATE
Search engine optimization can significantly increase your site's conversion rate. Once your business remains popular long enough, your target market will recognize the quality of the work you do. This helps you establish and maintain authority in your field.
#2. IT ENHANCES BRAND AWARENESS
Besides converting users into customers, improving your rankings also increases brand awareness. As you climb your way up toward the top of the SERPs, you'll generate innumerable touchpoints. Even without clicking through to your website, potential customers will associate your brand with the solutions you offer simply because you are there.
#3. IT IS POSSIBLE TO DO SEO ON A BUDGET
Unlike other marketing strategies such as pay-per-click, SEO does not cost you any money other than your time if you manage it yourself. Search engines crawl a website 24/7, which promotes your crucial content and brings in new customers organically. Invest some time reviewing the content on top-ranking sites in your niche, and aim to write better content than theirs. Once you've created content, share it on your social networks.
