The Growing Need For Effective Password Management

This article is sponsored by Passwork

As businesses rely more on digital services and platforms, the number of passwords and access credentials employees need to remember has grown exponentially. This can lead to the use of weak or duplicated passwords, posing a significant security risk. A centralized and secure password management system is essential for mitigating these risks and ensuring that sensitive information remains protected.

Self-Hosted vs. Cloud-Based Password Management Solutions

When it comes to password management solutions, businesses have two primary options: self-hosted and cloud-based. While both have their merits, self-hosted solutions often provide a higher level of control and customization.

Advantages Of Self-Hosted Solutions

  • Greater control
    A self-hosted solution allows administrators to have complete control over the password management infrastructure, enabling them to customize it according to their company’s needs;
  • Enhanced security
    By hosting the password management system on the company’s own servers, businesses can ensure that their sensitive data remains within their control, reducing the risks associated with third-party providers;
  • Compliance
    Self-hosted solutions make it easier for companies to meet industry-specific compliance requirements and data protection regulations.

Limitations Of Cloud-Based Solutions

  • Dependency on third-party providers
    With cloud-based solutions, businesses rely on external providers for the security and availability of their data. This can lead to potential vulnerabilities and the risk of data breaches;
  • Limited customization
    Cloud-based solutions often have predefined features and settings, which may not align with a company’s unique requirements.

Collaborative Password Management In Companies

In a company setting, employees often need to share passwords and access credentials for various applications and services. A collaborative password management system enables the secure sharing of these credentials, improving productivity and security.

Secure Sharing

Collaborative password management systems, like Passwork, provide secure sharing options, allowing employees to share access credentials with colleagues without exposing sensitive data to unauthorized users. This enables frictionless sharing in a collaborative environment without pushing sensitive information through an insecure channel such as email; instead, sharing happens securely inside the password manager itself.

Permission Management

To maintain control over who can access and modify shared passwords, a collaborative password management system should offer granular permission management. Administrators can assign different levels of access to individual users or groups, ensuring that employees have access to the information they need without compromising security.

Another benefit of permission management is that it gives you an easy way to see who has access to certain information, as well as to assign and revoke permissions at the individual and group level.

Version Control

Have you ever created a new password for a service, then needed to reference the past password? There’s nothing worse than losing a password when you need it in a pinch, and in an environment where multiple users can update and modify shared passwords, version control becomes essential. Collaborative password management systems should provide a history of changes made to shared credentials, enabling administrators to track modifications and revert to previous versions if needed.

Access Rights Segregation

To ensure that sensitive data remains protected, companies should implement access rights segregation within their password management system. This involves dividing users into different groups based on their roles and responsibilities and assigning appropriate access permissions accordingly.

Role-Based Access Control (RBAC)

RBAC is a widely used method for implementing access rights segregation. With RBAC, administrators can create roles that represent different job functions within the company and assign appropriate permissions to each role. Users are then assigned to roles, ensuring that they only have access to the information they need to perform their tasks.
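As a sketch, RBAC can be modeled as a lookup from roles to permission sets; the role and permission names below are hypothetical, not any specific product's schema:

```javascript
// Hypothetical roles mapped to the permissions they grant.
const rolePermissions = {
  admin: new Set(['password.read', 'password.write', 'password.share', 'user.manage']),
  developer: new Set(['password.read', 'password.write']),
  auditor: new Set(['password.read']),
};

// A user holds the union of the permissions of all roles assigned to them.
function hasPermission(userRoles, permission) {
  return userRoles.some(
    (role) => (rolePermissions[role] || new Set()).has(permission)
  );
}
```

For example, a user with only the auditor role can read passwords but cannot modify them; adding the developer role grants write access.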

Attribute-Based Access Control (ABAC)

ABAC is a more flexible approach to access control, where permissions are granted based on a user’s attributes (e.g., job title, department, location, and so on) rather than predefined roles. This allows for greater customization and scalability, as administrators can create complex access rules that adapt to changing business requirements.
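In code, an ABAC decision can be sketched as a list of predicate rules evaluated against user and resource attributes; the attributes and rules below are made-up examples, not a real policy language:

```javascript
// Each rule is a predicate over user and resource attributes.
const accessRules = [
  // Finance staff may access credentials tagged "finance".
  (user, resource) => user.department === 'finance' && resource.tag === 'finance',
  // Staff in the EU office may access credentials tagged "eu-ops".
  (user, resource) => user.location === 'eu' && resource.tag === 'eu-ops',
];

// Access is granted if any rule matches.
function canAccess(user, resource) {
  return accessRules.some((rule) => rule(user, resource));
}
```

Because the rules are ordinary predicates, new attributes (job title, clearance level, time of day) can be incorporated without redefining roles.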

Auditing And Monitoring Activity

To maintain a secure password management system, administrators must be able to monitor and audit user activity. This sort of transparency allows you to know exactly who changed something at a particular point in time so you can take corrective action. This includes tracking changes to passwords, monitoring access attempts, and identifying potential security threats.

Activity Logging

A comprehensive password management system should log all user activity, including access attempts, password modifications, and sharing events. This information can be invaluable for detecting unauthorized access, troubleshooting issues, and conducting security audits.

For example, it’s nice to have a way to see who has used a particular password and when they used it, especially for troubleshooting permissions.
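A minimal sketch of what such a log entry might capture (the field names are illustrative, not a specific product's schema):

```javascript
// Append an audit event to an append-only activity log.
function logEvent(log, actor, action, target) {
  log.push({
    timestamp: new Date().toISOString(), // when it happened
    actor,   // who performed the action, e.g. 'alice'
    action,  // what happened, e.g. 'password.update' or 'login.failed'
    target,  // the affected credential or resource, e.g. 'vpn-gateway'
  });
  return log;
}
```

Structured entries like these can later be filtered by actor, action, or target when troubleshooting or auditing.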

Real-Time Notifications

In addition to logging activity, real-time alerts can help administrators quickly identify and respond to potential security threats. A password management system that provides real-time notifications for suspicious activity, such as multiple failed login attempts or unauthorized password changes, can be instrumental in preventing data breaches.

Reporting

Generating reports on user activity, password strength, and compliance can help administrators assess the overall health of their password management system and identify areas for improvement. Regularly reviewing these reports can also ensure that the company remains compliant with relevant industry regulations and best practices.

Best Practices For Implementing A Password Management System

To ensure the success of a password management system, it’s crucial to follow best practices for implementation and ongoing maintenance. You want to ensure that your passwords are managed in a way that is safe for everyone in your company while adhering to compliance guidelines for a secure environment.

First, Choose The Right Solution

Selecting the right password management system for your company is essential. Consider factors such as the size of your organization, the level of customization required, and your preferred hosting option (self-hosted vs. cloud-based) when evaluating solutions. Passwork, for example, offers a self-hosted solution with robust collaboration features, making it a suitable option for businesses looking for greater control and customization.

Next, Train Employees

Employee training is crucial for the successful adoption of a password management system. Ensure that all users understand how to use the system, the importance of password security, and company policies related to password management.

Regularly Review And Update Policies

As your business evolves, your password management policies should adapt accordingly. Regularly review and update your policies to ensure that they continue to meet your organization’s needs and maintain compliance with industry regulations.

Monitor And Audit System Activity

Stay vigilant by regularly monitoring and auditing your password management system. This will help you identify potential security threats and ensure that your system remains secure and up-to-date.

Password Policy Best Practices

To maintain a secure password management system, it’s essential to establish strong password policies and ensure that employees follow best practices.

Password Length And Complexity

A strong password policy should require a minimum password length and a combination of characters, including upper and lower case letters, numbers, and special characters. This helps to increase password entropy, making it more difficult for attackers to guess or crack passwords using brute force methods.
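As a rough illustration, entropy for a randomly generated password can be estimated as length × log2(pool size), assuming each character is drawn uniformly from the character pool:

```javascript
// Estimated entropy in bits for a random password of the given length
// drawn from a character pool of the given size.
function entropyBits(length, poolSize) {
  return length * Math.log2(poolSize);
}

// 8 lowercase letters (pool of 26) is roughly 38 bits;
// 12 characters from all 94 printable ASCII symbols is roughly 79 bits.
```

Note that this estimate only holds for randomly generated passwords; human-chosen passwords contain far less entropy than their length suggests.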

Password Expiration And Rotation

Regularly changing passwords can help to minimize the risk of unauthorized access, especially in cases where passwords have been compromised without the organization’s knowledge. Implementing a password expiration policy that requires users to change their passwords at regular intervals can enhance security.

Two-Factor Authentication (2FA)

In addition to strong password policies, implementing two-factor authentication can provide an additional layer of security. By requiring users to provide a second form of verification, such as a code sent to a mobile device, 2FA reduces the risk of unauthorized access even if a password is compromised.

Prevent Reused Passwords

Employees should be discouraged from using the same password across multiple accounts and services. Encourage the use of unique passwords for each account to minimize the risk of unauthorized access in case one password is compromised.

Integrations And Compatibility

An effective password management system should be compatible with various platforms, applications, and services that your company uses. This ensures seamless integration and streamlined access management.

Single Sign-On (SSO)

SSO enables users to access multiple applications and services with a single set of credentials.

By integrating your password management system with SSO, you can simplify the login process for employees, reducing the need for multiple passwords and improving security.

Browser Extensions and Mobile Apps

A password management system that offers browser extensions and mobile apps can help ensure that employees have access to their passwords and credentials wherever they are. This enhances productivity and encourages the adoption of the password management system.

Custom Integrations

Depending on your company’s requirements, you may need to integrate your password management system with other tools, such as ticketing systems, customer relationship management platforms, or identity and access management solutions. Ensure that the password management system you choose is flexible enough to allow custom integrations with the other services your business relies on.

Backup And Disaster Recovery

A robust password management system should include backup and disaster recovery features to ensure the availability and integrity of your organization’s passwords and credentials.

Regular Backups

Implement a backup policy that includes regular backups of your password management system’s data. This helps to protect against data loss due to hardware failures, accidental deletions, or other unforeseen issues.

Encrypted Backups

Backups should be encrypted to protect the sensitive data they contain. Ensure that your password management system supports encrypted backups and uses strong encryption algorithms to safeguard your data.

Disaster Recovery Plan

Develop a disaster recovery plan that outlines the steps to be taken in case of a system failure, data breach, or other security incidents. This plan should include procedures for restoring data from backups, as well as measures to prevent further damage or unauthorized access.

Evaluating And Selecting A Password Management Solution

When choosing a password management system, it’s important to thoroughly evaluate potential solutions and select the one that best meets your organization’s needs.

Security Features

Assess the security features offered by each solution, such as encryption algorithms, two-factor authentication support, and activity monitoring capabilities. Ensure that the solution adheres to industry standards and best practices for data security.

Scalability

Consider the scalability of the password management system, especially if your organization is growing or has plans for expansion. The solution should be able to handle an increasing number of users and passwords without compromising performance or security.

Ease of Use

User adoption is crucial for the success of a password management system. Evaluate the user interface and overall ease of use of each solution, as this can have a significant impact on employee adoption and satisfaction.

Cost

Consider the total cost of ownership for each password management system, including initial implementation costs, ongoing maintenance, and any additional fees for upgrades or add-on features. Be sure to weigh these costs against the potential benefits and savings offered by a more secure and efficient password management process.

Ongoing Maintenance And Support

Once your password management system is in place, it’s essential to keep it up-to-date and ensure that users receive the necessary support.

Software Updates

Regularly update your password management system to benefit from the latest security patches, feature enhancements, and bug fixes. This helps to maintain the stability and security of the system.

User Support

Provide user support for your password management system, including training materials, FAQs, and access to technical assistance when needed. This ensures that employees can effectively use the system and resolve any issues that may arise.

Periodic Security Assessments

Conduct periodic security assessments of your password management system to identify any potential vulnerabilities and ensure that it continues to meet your organization’s security requirements. This may include penetration testing, vulnerability scanning, and other security assessments.

Conclusion

Organizing password management in a company is a critical task for system administrators. By selecting the right solution, implementing access rights segregation, fostering collaboration, and actively monitoring and auditing the system, administrators can create a secure and efficient password management environment. Additionally, establishing strong password policies and choosing a self-hosted solution like Passwork can offer businesses greater control and customization, providing a solid foundation for effective password management.

By following the best practices outlined in this guide, system administrators can enhance their organization’s overall security posture while improving productivity and streamlining access management.

How To Build A Localized Website With Hugo And Strapi

Localizing your site can benefit your business or organization in several ways. By translating your content or site, you expand the markets you target. Adapting your product to the language and cultural preferences of potential customers who were not able to use your product before boosts your conversion rates.

Ultimately, this often leads to a growth in the revenue you generate. With a larger, more widespread customer base, your brand becomes increasingly recognizable and strengthened in newer markets.

A localized website has a higher SEO score, which means that users within a specific market can find it more easily through a search engine. A recognizable brand and an improved SEO score reduce the cost of marketing to users within the markets you target.

We’ve seen that localization has its benefits, but what exactly is it? Localization is the process of revising your website, app, or content that was initially intended for a primary market to suit the needs of a new market you plan on targeting. Localization often involves translating a product into the language used in the market you want to introduce it to. It can also mean adding new elements or removing parts of the product that might, for example, offend the new market. You may also modify a product by changing its look and feel based on writing systems, color preferences, etc.

Although localization may seem straightforward, it cannot happen if the underlying site or app cannot accommodate these changes. Since it isn’t practical to build the same site for every market you want to enter, it makes sense that your site should switch content, language, UI elements, etc., between markets. That’s where internationalization comes in.

Internationalization is the process of designing and building a site or app to accommodate localization across different markets. For example, an online magazine’s site published in Portugal, Japan, and Ireland needs to accommodate different languages, writing systems, payment processors, and so on.

Before embarking on localization, it is important to pick a backend that will help you manage your site content across different locales. Strapi is one choice that provides this functionality. It’s an open-source headless content management system (CMS) built with Node.js. With it, you can manage and structure content into types using its content types builder on its user-friendly admin panel. For every content type you create, it automatically generates a customizable API for it. You can upload all kinds of media and manage them using its media library.

With its Role-Based Access Control (RBAC) features, you can set custom roles and permissions for content creators, marketers, localizers, and translators. This is especially useful since different people on a team should only be responsible for the content in the locales they manage. In this tutorial, you will learn about its internationalization feature that allows you to manage content in different languages and locales.

Your frontend also needs to handle your content in different languages and present it to multiple locales adequately and efficiently. Hugo is an amazing option for this. It’s a static site generator built with Go. It takes your data and content and applies it to templates. It then converts them to static pages, which are faster to deliver to your site visitors.

Hugo builds sites fast, with the average site build completed in a second or less. It supports several content types, enables theme integration, meticulously organizes your content, and lets you build your site in multiple languages and write content in Markdown. It also supports Google Analytics, comments with Disqus, code highlighting, and RSS. Static sites are faster, have great SEO scores, offer better security, and are cheaper and less complicated to build.

Without further ado, let’s dive right in!

Pre-Requisites

Before you can proceed with this tutorial, you will need to have:

  1. Hugo installed.
    You can get it through pre-built binaries, which are available for macOS, Windows, Linux, and other operating systems. You can also install it from the command line; installation guides on the Hugo website explain how. This tutorial was written using v0.68.
  2. Node.js installed.
    Strapi requires at least Node.js 12 and recommends Node.js 14. Do not install a version higher than 14, as Strapi may not support it. The Node.js downloads page offers pre-built installers for various operating systems.

An Example Site

To illustrate how localization can work using Strapi and Hugo, you’ll build a documentation website for a product used in Canada, Mexico, and the United States. The top three languages spoken in those regions are English, French, and Spanish, so the documents on this site need to be displayed in each of them. The site will have three pages: a home page, an about page, and a terms page.

The Strapi CMS provides a platform to create content for those pages in those three languages. It will later serve the markdown versions of the content created through its API. The Hugo site will consume this content and display it depending on the language a user selects.

Step 1: Setting Up the Strapi App

In this step, you will install the Strapi app and set up an administrator account on its admin panel. The app will be called docs-server. To begin, on your terminal, change directories to the location you’d like the Strapi app to reside and run:

npx create-strapi-app@3.6.8 docs-server

When prompted:

  1. Select Quickstart as the installation type.
  2. Pick No when asked to use a template.
? Choose your installation type Quickstart (recommended)
? Would you like to use a template? (Templates are Strapi configurations designed for a specific use case) No

This command will create a Strapi quickstart project, install the dependencies it requires, and run the application. It will be available at http://localhost:1337. To register an administrator, head to http://localhost:1337/admin/auth/register-admin. You should see the page below.

Enter your first and last names, an email, and a password. Once you’ve finished signing up, you will be redirected to the admin panel. Here’s what it looks like.

On the admin panel, you can create content types, add content entries, and manage settings for the Strapi app. In this step, you generate the Strapi app and set up an administrator account. In the next one, you will create content types for each of the three pages.

Step 2: Create the Content Types

In this step, you will create content types for each of the three pages. A content type on Strapi, as the name suggests, is a type of content. Strapi supports two categories of content types: collection types and single types. A collection type is for content that takes a single structure and has multiple entries.

For example, a blog post collection type collects multiple blog posts. A single type is for content that is unique and only has one entry. An about content type that models content for an about page, for instance, is a single type because a site typically has only one about page.

To generate these types, you’re going to use the Strapi CLI. You have the option of using the existing Strapi admin panel to create the types if you wish. However, the Strapi CLI can be faster and involves fewer steps.

If the Strapi app is running, stop it; running the commands in this step while it is up will cause errors that crash the app. Once you’ve completed this step, you can start it again with the command below on your terminal within the docs-server directory:

npm run develop

Since you will have three separate pages, you will create three different single types. These will be the home, about, and terms types. Each will have a content and title attribute. These two attributes are just a starting point. You can modify the types later if you’d like to add more attributes or customize them further. To create them, run this command on your terminal within the docs-server directory:

for page in home about terms; do npm run strapi generate:api $page title:string content:richtext ;done

Running the above command will generate the home, about, and terms content types with title and content attributes. It also generates APIs for each of the page types. The APIs are generated within the api/ folder. Here’s what this folder looks like now.

api
├── about
│   ├── config
│   │   └── routes.json
│   ├── controllers
│   │   └── about.js
│   ├── models
│   │   ├── about.js
│   │   └── about.settings.json
│   └── services
│       └── about.js
├── home
│   ├── config
│   │   └── routes.json
│   ├── controllers
│   │   └── home.js
│   ├── models
│   │   ├── home.js
│   │   └── home.settings.json
│   └── services
│       └── home.js
└── terms
    ├── config
    │   └── routes.json
    ├── controllers
    │   └── terms.js
    ├── models
    │   ├── terms.js
    │   └── terms.settings.json
    └── services
        └── terms.js

Each of the content types has models, services, controllers, and configuration created for it. Several API routes are also added to create, modify, and retrieve content modeled on these types.

In the api/about/models/about.settings.json file, you will change the kind of the about content type from a collection type to a singleType. You will also add a description and enable localization for it and its attributes. Replace the code with the following:

{
  "kind": "singleType",
  "collectionName": "about",
  "info": {
    "name": "about",
    "description": "The about page content"
  },
  "options": {
    "increments": true,
    "timestamps": true,
    "draftAndPublish": true
  },
  "pluginOptions": {
    "i18n": {
      "localized": true
    }
  },
  "attributes": {
    "title": {
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      },
      "type": "string"
    },
    "content": {
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      },
      "type": "richtext"
    }
  }
}

In this file, you are adding details to the content type that you can’t specify when generating it through the CLI. The kind property changes from a collection type to a singleType. Localization is enabled using the pluginOptions property: by setting localized to true under the i18n internationalization property, localization is enabled for the type as well as for the attributes that specify the same property.

Next, you will modify its API routes to only have routes that update, delete, and retrieve content. When you create a content type using the CLI, it is a collection type by default. A collection type has six routes created for it: routes to find, find one, count, delete, update, and post. A single type doesn’t need the count, post, and find-one routes since there’s just one entry, so you will be removing these. Replace the contents of api/about/config/routes.json with this code:

{
  "routes": [
    {
      "method": "GET",
      "path": "/about",
      "handler": "about.find",
      "config": {
        "policies": []
      }
    },
    {
      "method": "PUT",
      "path": "/about",
      "handler": "about.update",
      "config": {
        "policies": []
      }
    },
    {
      "method": "DELETE",
      "path": "/about",
      "handler": "about.delete",
      "config": {
        "policies": []
      }
    }
  ]
}

Since the other content types share the same attributes, you will make similar changes to the model settings for each of the other types. The content types in this tutorial share the same attributes for demonstration purposes, but you can modify them to suit the needs of the pages you create. In the api/home/models/home.settings.json file, change the code to:

{
  "kind": "singleType",
  "collectionName": "home",
  "info": {
    "name": "Home",
    "description": "The home page content"
  },
  "options": {
    "increments": true,
    "timestamps": true,
    "draftAndPublish": true
  },
  "pluginOptions": {
    "i18n": {
      "localized": true
    }
  },
  "attributes": {
    "title": {
      "type": "string",
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      }
    },
    "content": {
      "type": "richtext",
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      }
    }
  }
}

Similar to the about API routes, you will remove the find-one, count, and post routes for the home content type since it’s a single type. Replace the contents of the api/home/config/routes.json file with this code:

{
  "routes": [
    {
      "method": "GET",
      "path": "/home",
      "handler": "home.find",
      "config": {
        "policies": []
      }
    },
    {
      "method": "PUT",
      "path": "/home",
      "handler": "home.update",
      "config": {
        "policies": []
      }
    },
    {
      "method": "DELETE",
      "path": "/home",
      "handler": "home.delete",
      "config": {
        "policies": []
      }
    }
  ]
}

Lastly, in the api/terms/models/terms.settings.json file, replace the existing code with:

{
  "kind": "singleType",
  "collectionName": "terms",
  "info": {
    "name": "Terms",
    "description": "The terms content"
  },
  "options": {
    "increments": true,
    "timestamps": true,
    "draftAndPublish": true
  },
  "pluginOptions": {
    "i18n": {
      "localized": true
    }
  },
  "attributes": {
    "title": {
      "type": "string",
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      }
    },
    "content": {
      "type": "richtext",
      "pluginOptions": {
        "i18n": {
          "localized": true
        }
      }
    }
  }
}

To remove the unnecessary find-one, count, and post API routes for the terms content type, change the contents of api/terms/config/routes.json to this:

{
  "routes": [
    {
      "method": "GET",
      "path": "/terms",
      "handler": "terms.find",
      "config": {
        "policies": []
      }
    },
    {
      "method": "PUT",
      "path": "/terms",
      "handler": "terms.update",
      "config": {
        "policies": []
      }
    },
    {
      "method": "DELETE",
      "path": "/terms",
      "handler": "terms.delete",
      "config": {
        "policies": []
      }
    }
  ]
}

Now you have content types set up for all three pages. In the next step, you will add locales for the markets your content is targeted to.

Step 3: Adding the Locales

In this step, you will add the different locales you’d like to support. As explained in the example section, you will add English (America) (en-US), French (Canada) (fr-CA), and Spanish (Mexico) (es-MX). Be sure to run Strapi with npm run develop, then go to the Internationalization settings, under Settings then Global Settings, and add these locales by clicking the blue Add a locale button.

In the popup, select a locale then click Add locale. You should add the three locales listed in the table below. They are all available in the Locales dropdown.

Locale   Local Display Name
en-US    English (America)
es-MX    Spanish (Mexico)
fr-CA    French (Canada)

When adding these locales, set one as the default locale under Advanced Settings in the Add a locale pop-up. This makes it easier when adding content the first time around. If you do not, the first entry will always default to the en locale. If you do not need the en locale, it’s best to delete it after setting an alternate default locale.

In this step, you added locales to your Strapi app. These will be used when you add content. In the next step, you will add placeholder content for each of the pages.

Step 4: Add Content to Strapi App

In this step, you will add content to the Strapi app for each of the three pages. You will do this using the content manager on the admin panel. Here are links to content entry forms on the admin panel for each of the types:

  1. About page
  2. Home page
  3. Terms page

Here’s what a content entry form looks like.

Add a title and some content. When adding content, always check the locale. Make sure the language of the content matches the locale language.

Once you’re done, click the bright green Save button then the Publish button in the top right of the entry form. When you want to add new content for a locale, select it from the Locales dropdown in the Internationalization section on the right of the form. Remember to save and publish the new content.

Here’s what you’ll add for each of the pages for the title field:

English (America) (en-US)   French (Canada) (fr-CA)   Spanish (Mexico) (es-MX)
About À propos Sobre
Home Accueil Hogar
Terms Conditions Condiciones

For the content, you can use this lorem ipsum text for all the pages. You can add a flag emoji for the country to identify the change in locale. This is placeholder content only for demonstration purposes.

English (America) (en-US) # 🇺🇸

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Donec nec neque ultrices, tincidunt tellus a, imperdiet nulla. Aliquam erat volutpat. Vestibulum finibus, lectus sit amet sagittis euismod, arcu eros tincidunt augue, non lobortis tortor turpis non elit.
French (Canada)(fr-CA) # 🇨🇦

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Donec nec neque ultrices, tincidunt tellus a, imperdiet nulla. Aliquam erat volutpat. Vestibulum finibus, lectus sit amet sagittis euismod, arcu eros tincidunt augue, non lobortis tortor turpis non elit.
Spanish (Mexico)(es-MX) # 🇲🇽

Lorem ipsum dolor sit amet, consectetur adipiscing elit. Donec nec neque ultrices, tincidunt tellus a, imperdiet nulla. Aliquam erat volutpat. Vestibulum finibus, lectus sit amet sagittis euismod, arcu eros tincidunt augue, non lobortis tortor turpis non elit.

In this step, you added placeholder content in multiple languages for different locales. In the next step, you will make the API routes for the content types public.

Step 5: Make the API Routes Public

In this step, you will make the routes that return page content public. These are the GET routes for /home, /about, and /terms. Currently, if you try to access them, you will get a 403 Forbidden error. This is because the permissions set do not allow them to be accessed publicly. You’ll change this so that they are publicly accessible.

To do this:

  1. head over to the Public Roles settings under the Users & Permissions plugin;
  2. in the Application settings, under Permissions, select the find checkboxes for Home, About, and Terms;
  3. click the bright green Save button in the top right of the page.

Here’s a screenshot of what checkboxes to select in the Application Permissions section in the Public Roles settings page:

Now the routes at http://localhost:1337/home, http://localhost:1337/about, and http://localhost:1337/terms are all accessible. They return the content you entered for the pages in the previous step. To specify a locale when fetching content, use the _locale query parameter and assign it the locale. For example, http://localhost:1337/home?_locale=fr-CA will return the home page for the Canadian French locale. If you do not specify a locale, content for the default locale will be returned.
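As a sketch, the locale-specific URLs can be composed and fetched from the command line. The content_url helper below is purely illustrative (it is not part of Strapi or Hugo); only the server URL and the _locale parameter come from the tutorial's setup.

```shell
# Hypothetical helper that appends the _locale query parameter used by
# Strapi's internationalization plugin.
STRAPI_URL="http://localhost:1337"

content_url() {
  # $1 = endpoint (e.g. /home), $2 = optional locale (e.g. fr-CA)
  if [ -n "$2" ]; then
    echo "${STRAPI_URL}${1}?_locale=${2}"
  else
    echo "${STRAPI_URL}${1}"   # falls back to the default locale
  fi
}

content_url /home fr-CA   # prints http://localhost:1337/home?_locale=fr-CA
# With the Strapi app running, fetch the Canadian French home page:
# curl -s "$(content_url /home fr-CA)"
```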

In this step, you made the routes that return content public. In the next step, you will generate a Hugo site that will consume the localized content.

Step 6: Generate a New Hugo Site

The Hugo site that will display the localized content will be called docs-app. To generate it, run the following command on your terminal in a separate directory outside the docs-server project:

hugo new site docs-app

Running this command will generate a new Hugo site. It will scaffold the site with different folders that contain site input, which Hugo uses to generate the whole site. However, no theme or content has been added yet; you will have to add them yourself. The content will come from the Strapi application. You can view the new site by running:

cd docs-app && hugo server

The app is served at http://localhost:1313/. However, the app is blank since there is no content yet.

In this step, you generated a new Hugo site. In the next step, you will add a documentation theme to it.

Step 7: Add a Theme to the Hugo Site

Hugo provides support for themes. You can add pre-configured theme components to your site. For the purpose of this tutorial, you will use the hugo-book theme, which is a theme for documentation sites. You can pick from a wide range of themes available on the Hugo theme showcase site. However, make sure that the theme supports internationalization.

To add the book theme, make sure you are in the docs-app folder, and if not, run:

cd docs-app

The app needs to have a git repository to add a theme as a git submodule. To initialize an empty one, run:

git init

To add the book theme, run:

git submodule add https://github.com/alex-shpak/hugo-book themes/book

This command adds the book theme repository as a submodule to the site. It clones the book theme into the themes folder. To view the docs-app site using the book theme, you can run the app with this command:

hugo server --theme book

Here’s a screenshot of what it looks like:

The site is still pretty bare as it does not contain any content yet. You’ll add content to it from Strapi in the later steps.

In this step, you added a theme to your Hugo site that supports internationalization. In the following step, you will modify the setting of the docs-app site to support internationalization.

Step 8: Modify the Hugo Site Settings

While the book theme supports internationalization, you have to modify the settings of the docs-app to enable it. You will also modify other attributes of the site, like its title and base URL. Additionally, you will include other settings to disable search on the book theme and limit the cache lifespan. In the config.toml file, remove the existing code and add the one below:

# The site base URL
baseURL = 'http://localhost:1313/'

# The default site and content language
languageCode = 'en-us'
defaultContentLanguage = 'en-us'

# The site title
title = 'Docs'

# Setting the site theme to hugo-book
theme = 'book'

[params]
# Disabling search here because it falls out of the scope of this tutorial
BookSearch = false
# The Strapi server URL
StrapiServerURL = 'http://localhost:1337'

[caches]
[caches.getjson]
# Sets the maximum age of cache to 10s before it is cleared.
maxAge = "10s"

[languages]
# The US English content settings
[languages.en-us]
languageName = "English (US)"
contentDir = "content"
# The Canadian French content settings
[languages.fr-ca]
languageName = "Français (Canada)"
contentDir = "content.fr-ca"
# The Mexican Spanish content settings
[languages.es-mx]
languageName = "Español (Mexico)"
contentDir = "content.es-mx"

The StrapiServerURL is the URL of the Strapi server. Since it’s running locally for now, you will use http://localhost:1337. You’re going to use the getJSON Hugo function to fetch data from the server, and it caches request results. During development, you may change the content on the Strapi app often, and because of the cache, your changes may not be reflected. So, using the maxAge config property, you set the cache lifespan to 10s; thus, the most recent Strapi content changes appear on the site. When you deploy the site, you will have to change this to an adequate timespan depending on how often the site is rebuilt and the content changes.

For the language settings, you will define three language categories. For each language, you will define a name and a directory for its content. Each of the content directories will be at the site root. The language names will be displayed in a dropdown where users can select what content they want. Here’s a table of the settings for each language.

Language Name Language Code Content Directory
English (US) en-us content/
Español (Mexico) es-mx content.es-mx/
Français (Canada) fr-ca content.fr-ca/

In this step, you added settings to the Hugo app to make it support internationalization. In the next step, you will modify the theme to accept localized content from an external server.

Step 9: Modify the Theme to Accept Strapi Content

In this step, you will modify the theme to accept data from a Strapi server. Although themes already come with pre-configured templates, you can override them by creating similar files in the layouts folder.

For the hugo-book theme, you will modify the template at themes/book/layouts/partials/docs/inject/content-after.html. This template displays whatever is added in it after the main page content. To do this, you will create this file in the layouts/ folder at the site’s root directory and then add content to it. In this file, you will define a template to fetch markdown content from the server, pass it through the markdown processor, and display it. The logic to fetch the content will be placed in a new partial template that you will call strapi-content. So, to create the content-after file, run this command on your terminal:

mkdir -p layouts/partials/docs/inject/ && touch layouts/partials/docs/inject/content-after.html

Next, you will create the partial template to fetch content from Strapi:

touch layouts/partials/docs/strapi-content.html

In the layouts/partials/docs/strapi-content.html file, add this code:

<!-- Partial to fetch content from Strapi. -->

{{ $endpoint := $.Param "endpoint" }}
{{ $data := dict "title" "" "content" "" }}

{{ if and $endpoint .Site.Params.StrapiServerURL }}

{{ $contentURL := printf "%s%s" .Site.Params.StrapiServerURL $endpoint }}
{{ $data = getJSON $contentURL }}

{{ end }}

{{ return $data }}

In this partial file, you fetch the endpoint page variable for a specific page and store it in $endpoint. This variable is added to the front matter of content files, as you will see in a later step. Next, you create a variable called $data that is returned at the end of the partial. It will hold the content returned from the Strapi server. You assign it a default structure with an empty title and content, in case no endpoint is specified or a request is unsuccessful. Afterward, you check whether both a content endpoint and a Strapi server URL are set, since you need both to make a request. If they are set, you build the URL for the content you need and use the getJSON function to make the request. Lastly, you return the data.

In layouts/partials/docs/inject/content-after.html, add the code below to the file:

{{ $strapiData := partial "docs/strapi-content" . }}
<article class="markdown">
  <h1>{{ $strapiData.title }}</h1>

  {{ $strapiData.content | markdownify }}
</article>

Here, you are fetching the data using the strapi-content partial template. Once you get the content, you add the title as a heading within the article tag. Lastly, you take the returned content, pass it through the markdown processor using the markdownify function, and display it within the article tag.

In this step, you modified the theme by overriding one of its templates and adding a new partial template to fetch content from Strapi. In the next step, you will add content pages for each of the languages.

Step 10: Add Content Pages to the Hugo Site

In this step, you will add content pages. Each language has a content folder, as shown in the previous steps. The content folder is for English (US) content, content.es-mx for Español (Mexico) content, and content.fr-ca for Français (Canada) content. Each content file has to have an endpoint front matter variable, which is the Strapi endpoint that provides its content in a specific language. You’ll add this variable in two archetype files, archetypes/default.md and archetypes/docs.md.

Archetype files are templates for content files. They can be used to specify the front matter and other content. The hugo new command uses archetypes to generate new content files. archetypes/default.md will be the template for all the _index.md content files, while archetypes/docs.md will be for all the content files in docs/ folders. archetypes/docs.md and docs/ are specific to the hugo-book theme. To create the archetypes/docs.md file, run this on your terminal:

touch archetypes/docs.md

Next, replace the content of both archetypes/default.md and archetypes/docs.md with:

---
title: "{{ replace .Name "-" " " | title }}"
endpoint: "/"
---

<br/>

The title will be displayed as the page title and in the table of contents. endpoint, as mentioned earlier, is the Strapi endpoint that provides the content. You add the <br/> tag so that the page is not considered blank during a build.

To create the content folders for the other languages, run this command on your terminal:

mkdir content.es-mx content.fr-ca

Next, add content files for each of the pages:

for cont in "_index.md" "docs/about.md" "docs/terms.md"; do hugo new $cont; done && for langDir in "content.es-mx" "content.fr-ca"; do cp -R content/* $langDir; done

This command creates an _index.md file, a docs/about.md file, and a docs/terms.md file in each of the content directories. Here’s what the content directories will look like after you run this command:

content
├── docs
│   ├── about.md
│   └── terms.md
└── _index.md
content.es-mx
├── docs
│   ├── about.md
│   └── terms.md
└── _index.md
content.fr-ca
├── docs
│   ├── about.md
│   └── terms.md
└── _index.md

Here’s the front matter and content you should add for each of the files:

Home (_index.md)

  • content
---
title: "Home"
endpoint: "/home?_locale=en-US"
---

<br/>
  • content.es-mx
---
title: "Hogar"
endpoint: "/home?_locale=es-MX"
---

<br/>
  • content.fr-ca
---
title: "Accueil"
endpoint: "/home?_locale=fr-CA"
---

<br/>

About (docs/about.md)

  • content
---
title: "About"
endpoint: "/about?_locale=en-US"
---

<br/>
  • content.es-mx
---
title: "Sobre"
endpoint: "/about?_locale=es-MX"
---

<br/>
  • content.fr-ca
---
title: "À propos"
endpoint: "/about?_locale=fr-CA"
---

<br/>

Terms (docs/terms.md)

  • content
---
title: "Terms"
endpoint: "/terms?_locale=en-US"
---

<br/>
  • content.es-mx
---
title: "Condiciones"
endpoint: "/terms?_locale=es-MX"
---

<br/>
  • content.fr-ca
---
title: "Conditions"
endpoint: "/terms?_locale=fr-CA"
---

<br/>

So, all you need to do now is run the Hugo server. Before you do this, make sure that the Strapi app is running with npm run develop in a different terminal within the docs-server folder, so Hugo can fetch content from it when building the site. You can run the Hugo server using this command:

hugo server

Note About Routine Automated Rebuilds

Since Hugo creates static sites, the content displayed will not be dynamic. Hugo gets the content from the Strapi server during build time and not on the fly when a page is requested. So, if you’d like content to regularly reflect what is on the Strapi server, make sure to automate rebuilds of your Hugo site regularly or as often as changes to the content are made. For example, if your site is hosted on Netlify, you can schedule regular rebuilds of your site.
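For example, with a hosting provider that exposes build hooks, a rebuild is just an HTTP POST, which makes it easy to schedule. This is a sketch only: the hook URL below is a placeholder, not a real endpoint.

```shell
# The build-hook URL below is a placeholder -- substitute the one your
# hosting provider generates for your site.
HOOK_URL="https://example.com/build_hooks/YOUR_HOOK_ID"

# Trigger a rebuild manually:
# curl -s -X POST "$HOOK_URL"

# Or automate it with a crontab entry that fires at the top of every hour,
# so the latest Strapi content is fetched at the next build:
# 0 * * * * curl -s -X POST "https://example.com/build_hooks/YOUR_HOOK_ID"
```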

Conclusion

Hugo is a static site generator that allows you to build fast and efficient static sites. It offers multilingual support using its internationalization feature. You can specify a range of languages, and Hugo will build a site to support each of them. Strapi is a headless CMS that allows its users to manage content with more flexibility. It provides an admin portal to enter and manage content and a customizable API that different frontends can consume the content through. It also offers an internationalization plugin to manage content in different locales.

In this tutorial, you created a Strapi application. Using this app, you added three single content types to represent data for three pages: a home, an about, and a terms page. You added content for each of the pages for three locales: English (US), Español (Mexico), and Français (Canada). You also generated APIs to access content for these pages and made their GET routes public.

After, you generated a Hugo app. In this app, you added a documentation theme, configuration to support internationalization, and content pages for different languages. Lastly, you modified the theme to consume content from Strapi. If you’d like to build out more of the app, try adding more content page types with complex structures or adding content in a new language.

If you’d like to learn more about Hugo, check out their documentation page. To find out more about what you can do with Strapi and the range of features it offers, head to its website.

Securing Kubernetes Secrets With Conjur

Why Secure Kubernetes Secrets?

Secrets management is one of the most important aspects of securing your Kubernetes cluster. Out of the box, Kubernetes stores secrets with Base64 encoding, which is not enough. You have to implement a number of security best practices on top of it, such as etcd encryption at rest and access control with RBAC, to prevent possible security breaches. Using a secrets management solution like CyberArk Conjur not only secures secrets for Kubernetes but also provides other benefits, as we will see in this post.
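The point about Base64 can be demonstrated in a few lines of shell; the password value below is made up for illustration.

```shell
# Base64 is an encoding, not encryption: anyone who can read the Secret
# object (e.g. from etcd or via the API) can recover the plaintext.
plaintext='s3cr3t-password'                     # made-up example value
stored=$(printf '%s' "$plaintext" | base64)     # what Kubernetes stores
echo "$stored"                                  # prints czNjcjN0LXBhc3N3b3Jk
printf '%s' "$stored" | base64 -d               # trivially reversed
```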

What Is Conjur?

CyberArk Conjur is a secrets manager. It helps you manage secrets in Kubernetes, as well as across applications, tools, and clouds. It offers Role Based Access Control (RBAC) with an audit trail to easily track each stored secret. It implements encryption at rest with AES-256-GCM and in transit using mTLS. Additionally, you can manage the access for each secret and can also rotate the secrets automatically.

Using RBAC with Service Accounts in Kubernetes

Kubernetes doesn’t maintain a database or profiles of users and passwords. Instead, it expects user management to be handled outside of the cluster. The role of RBAC is to authorize requests. We will be creating a pod read-only user (a service account) that can get, list, and watch any pod in selected namespaces.
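As a sketch of that setup (all names and the namespace below are illustrative, not from a real cluster), the service account, Role, and RoleBinding could look like this:

```yaml
# Illustrative read-only setup: a service account that may only
# get, list, and watch pods in the "demo" namespace.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: pod-reader-sa
  namespace: demo
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: pod-reader
  namespace: demo
rules:
  - apiGroups: [""]
    resources: ["pods"]
    verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: read-pods
  namespace: demo
subjects:
  - kind: ServiceAccount
    name: pod-reader-sa
    namespace: demo
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
```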

What is RBAC?

Role-based access control (RBAC) is a method of regulating access to computer or network resources based on the roles of individual users within an organization.

Attribute Based Access Control for Mulesoft APIs

How can we automate the processes of API registration and access control to make them easier to manage at scale? Attribute-based access control (ABAC) has unique advantages over role-based access control (RBAC) for API gateway management, especially when it’s enabled with OAuth2 JSON Web Tokens (JWT). Let’s explore how ABAC could be implemented effectively for the Mulesoft API gateway through a custom policy.

Background

By default, the Mule Anypoint Platform comes with its own identity provider (IdP). This IdP is intended to help customers jump-start their projects or create PoCs. It is not provided for production deployments, especially those with a large number of client applications. For that purpose, Mule supports integration with external IdPs, such as Okta, OpenAM, etc.

Secure Communication with Token-based RSocket

RSocket provides a message-driven communication mechanism built on a reactive streaming framework, and runs over most common transports (TCP, WebSocket, HTTP/1.1, and HTTP/2). Furthermore, its language-agnostic interaction models (REQUEST_RESPONSE, REQUEST_FNF, REQUEST_STREAM, and REQUEST_CHANNEL) cover most communication scenarios, from microservices, API gateways, and sidecar proxies to message queues.

Considering security for this communication, it's easy to use TLS-based and token-based solutions in RSocket-based production systems. RSocket can reuse TLS over TCP or WebSocket directly, but to demonstrate the RBAC feature clearly, this article only covers the token-based implementation.

Origin Authentication and RBAC in Istio with Custom Identity Provider

The concept of access control can be boiled down to two factors: authentication (AuthN) and authorization (AuthZ). While authentication determines the identity of a client based on the data presented to the identity provider (e.g., Google and Microsoft AD), authorization determines whether an authenticated principal may interact with the resource.

Istio supports token-based end-user authentication with JSON Web Tokens (JWT). In Istio's terms, the process of authenticating the end-user, which might be a person or a device, is known as origin authentication. Istio allows you to validate nearly all the fields of a JWT presented to it. Since JWT is an industry-standard token format, the origin authentication feature of Istio is compatible with OpenID Connect providers such as Auth0, Google Auth, and Keycloak.
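To make the "validate the fields of a JWT" point concrete, here is a sketch of the token mechanics only, not of Istio's implementation: the claims segment of a JWT is base64url-encoded JSON, which is what lets a gateway inspect fields such as iss and sub. The issuer and subject values below are made up.

```shell
# Illustrative only: a JWT is three base64url segments, header.payload.signature.
claims='{"iss":"https://idp.example.com","sub":"alice"}'
# base64url swaps '+/' for '-_' (real JWTs also strip the '=' padding):
segment=$(printf '%s' "$claims" | base64 | tr -d '\n' | tr '+/' '-_')
echo "$segment"
# What a gateway does before checking claims like iss, sub, and exp:
printf '%s' "$segment" | tr '_-' '/+' | base64 -d
```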

How to Remediate Kubernetes Security Vulnerability: CVE-2019-11247

If you haven't upgraded your Kubernetes CRDs recently, now might be the time.

A new Kubernetes security vulnerability was recently announced, along with patch releases for the issue for Kubernetes versions 1.13, 1.14, and 1.15. CVE-2019-11247 discloses a serious vulnerability in the K8s API that could allow users to read, modify or delete cluster-wide custom resources, even if they only have RBAC permissions for namespaced resources.

If your clusters aren’t using CRDs, you aren’t affected. But CRDs have become a critical component of many Kubernetes-native projects like Istio, so many users are impacted. This vulnerability also doesn’t affect you if your clusters run without Kubernetes RBAC, but that puts you at an even greater risk than this vulnerability does. We still strongly recommend enabling and using Kubernetes RBAC.

Kubernetes and Microservices Security

To understand the current and future state of Kubernetes (K8s) in the enterprise, we gathered insights from IT executives at 22 companies. We asked, "How does K8s help you secure containers?" Here’s what we learned.

Microservices, Security, and Kubernetes (K8s)

RBAC

  • K8s helps with authorization and authentication via workload identity. Role-based access control (RBAC) is provided out of the box. A service mesh ensures secure communication between microservices.
  • K8s solves more problems than it creates. RBAC enforces relationships between resources, like pod security policies, to control the level of access pods have to each other and how many resources pods have access to. It’s too complicated, but K8s provides tools to build a more stable infrastructure.
  • RBAC mechanisms. Good security profile and posture. K8s provides access and mechanisms to use other things to secure containers.
  • K8s provides a pluggable infrastructure so you can customize it to the security demands of your organization. The API was purpose-built for extensibility to ensure, for example, that you can scan workloads before they go into production clusters. You can apply RBAC rules for who can access your environments, and you can use webhooks for all kinds of sophisticated desired-state security and compliance policies that mitigate operational, security, and compliance risk.
  • The advantage of K8s is in its open-source ecosystem, which offers several security tools like CIS Benchmarks, Network Policy, Istio, Grafeas, Clair, etc. that can help you enforce security. K8s also has comprehensive support for RBAC on System and Custom resources. Container firewalls help enforce network security in K8s. Due to the increased autonomy of microservices deployed as pods in K8s, having a thorough vulnerability assessment on each service, change control enforcement on the security architecture, and strict security enforcement is critical to fending off security threats. Things like automated monitoring-auditing-alerting, OS hardening, and continuous system patching must be done.
  • The great part about the industry adopting K8s as the de facto standard for orchestration means that many talented people have collaboratively focused on building secure best practices into the core system, such as RBAC, namespaces, and admission controllers. We take the security of our own architecture and that of our customers very seriously, and the project and other open-source supporting projects releasing patches and new versions quickly in the event of common vulnerabilities and exposures (CVE) makes it possible for us to always ensure we are keeping systems fully up to date. We have built-in support for automated K8s upgrades. We are always rolling out new versions and patches and providing our users with the notifications and tooling to keep their systems up to date and secure.

Reduced Exposure

  • You have services deployed individually in separated containers. Even if someone gets into a container, they’re only getting into one microservice, rather than being able to get into an entire server infrastructure. Integration with Istio provides virtualization and security on the access side.
  • Since the beginning, containers have been perceived as a potential security threat because you are running these entities on the same machine with low isolation. There's a perceived risk of data leakage when moving from one container to another. I see the security benefits of containers and K8s outweighing the risks because containers tend to be much smaller than a VM: a VM running NGINX will run a full OS with many processes and servers, while containers have far less exposure and attack surface. You can keep containers clean and small for a minimal attack surface. K8s has a lot of security functionality embedded in the platform, and security is turned on by default. The microservices and container model is based on immutable infrastructure. You offer limited access to the container running the application. You're able to lock down containers in a better way and be in charge of what your application does. We are now using ML to understand what a service or set of containers is doing so we can lock down the service.
  • You need to look at security differently with containers to be more secure. Containers allow you to reduce the attack surface with fewer privileges. Limit access to the production environment. K8s has a number of security mechanisms built in, like authentication and authorization, to control access to resources. You need to learn to configure them and be careful about setting the right privileges.

K8s Security Policies

  • My team helps with full and automatic application discovery spread across multiple data centers and cloud, and creating clean infrastructure policies and runtime discovery. Dynamic policies help lock down containers and apps built on top of containers.
  • You’re only as secure as you design yourself to be in the first place. By automating concepts and constructs of where things go using rules and stabilizing the environment, it eliminates a lot of human errors that occur in a manual configuration process. K8s standardizes the way we deploy containers. 
  • Namespaces, pod security policies, and network-layer firewalling, where we just define a policy using Calico, and then kernel-level security that’s easy for us since we’re running on top of K8s. Kaniko runs Docker builds inside of Docker. Kaniko came out of Google.
  • 1) K8s helps us with features like namespaces to segregate networks from a team perspective. 2) Second is network policies: this pod cannot talk to that database, which helps prevent misuse of software. 3) There are benefits from K8s protecting individual containers. This mitigates the problem of processes escaping from containers and helps you stay isolated.
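The "this pod cannot talk to that database" idea from the list above can be sketched as a NetworkPolicy; the labels and names below are made up for illustration. Only pods labelled app: api may reach the database pods, and everything else is denied ingress.

```yaml
# Illustrative policy: restrict ingress to database pods so that only
# pods labelled app=api in the same namespace can connect.
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: db-ingress
  namespace: demo
spec:
  podSelector:
    matchLabels:
      app: database
  policyTypes:
    - Ingress
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: api
```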

Other

  • It’s not built-in per se; that’s why a number of security tools are coming up for things like scanning Docker images, since K8s does not scan them. A number of security companies are coming out with continuous scanning of Docker images before they are deployed, looking for security vulnerabilities during the SDLC. DevSecOps moves security checking and scanning earlier in the development process, and tools are popping up to do that.
  • If you enable the security capabilities provided, it’s an advantage. There are capabilities in K8s that control whether you have the ability to pull up a container. It has to be set up correctly. You need to learn to use the capabilities. You need to think about the security of every container.
  • Security is a very important topic. One area is open source and the level of involvement and the number of developers involved can help drive greater security in the environment. Cloud-native security with the code and the cluster. For customers to leverage K8s in the cloud it changes the investment you have to make because you are inheriting the security capabilities of the cloud provider and dramatically lowering costs. K8s has API-level automation built-in.
  • Our container images are created using the Linux package security update mechanism to ensure the images include the latest security patches. Further, our container image is published to the Red Hat Container Catalog which requires these security measures to be applied as part of the publishing process. In addition, domain and database administrative commands are authenticated using TLS secure certificate authentication and LDAP, as well, domain meta-data, application SQL commands, and user data communications are all protected using the AES-256-CTR encryption cipher.
  • K8s provides only minimal structures for security, and it is largely the responsibility of implementers to provide security. You can build quite a lot on top of the Secrets API, such as implementing TLS in your applications or using it to store password objects.
  • K8s-orchestrated containerized environments and microservices present a large attack surface. The highly-dynamic container-to-container communications internal to these environments offer an opportune space for attacks to grow and escalate if they aren’t detected and thwarted. At the same time, K8s itself is drawing attackers’ attention: just last year a critical vulnerability exposing the K8s API server presented the first major known weakness in K8s security, but certainly not the last. To secure K8s environments, enterprises must introduce effective container network security and host security, which must include the visibility to closely monitor container traffic. Enterprise environments must be protected along each of the many vectors through which attackers may attempt entry. To do so, developers should implement security strategies featuring layer-7 inspection (capable of identifying any possible application layer issues). At the same time, the rise of production container environments that handle personally identifiable information (PII) has made data loss prevention a key security concern. This is especially true considering industry and governmental regulatory compliance requirements dictating how sensitive data must be handled and protected.

Here’s who shared their insights: