NLP Chatbot Resiliency: A Chat With Botpress

In the race to design great conversational experiences, adaptable NLU models will play a key role in the creation of truly intelligent chatbots. In this article, learn how Botpress grew out of frustration with poorly designed bots, leading to the launch of an open-source managed NLU platform. Also, see why the future of chatbot design will shift from an intents-based approach toward knowledge-based models that offer greater adaptability and resiliency.

Developer Accessibility Key to NLP Chatbot Advancement

Chatbots have come a long way over the years, evolving from simple command-response models to the more nuanced NLP conversational models of today. 

Testing Chatbots for the Unexpected

Quite often we are asked to design a robust test strategy for a mission-critical enterprise chatbot. How can you test for every unexpected user behaviour that might occur in the future? How can anyone make confident statements about quality when we have no idea what users will ask the chatbot?

Short-Tail vs Long-Tail Topics

While we do not have a magic crystal ball to look into future usage scenarios, in our experience the best results come from a systematic approach within a continuous feedback setup. In almost every chatbot project, the use cases can be categorized into short-tail and long-tail topics.

Contextual Design With Google Actions

One of the most complex tasks when designing a conversation is creating natural interactions with our users. However, there is a process called contextual design that helps us create these natural conversations. With contextual design, you can shape the conversation depending on the current situation of your users.

For example, if the user is opening our Google Action for the first time, we greet them with a different welcome message than when they come back a second time. Another example: if the user is from a particular city, we can provide information related to that city by accessing geoinformation. Contextual design is one of the keys to Conversational AI.
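As a minimal sketch of how such a first-time versus returning-user greeting could be implemented, assuming a fulfillment built with the @assistant/conversation Node.js library and Firebase Cloud Functions (the handler name and messages below are purely illustrative):

const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');

const app = conversation();

// Greet first-time and returning users differently by keeping a flag
// in cross-conversation user storage (conv.user.params).
app.handle('greeting', (conv) => {
  if (conv.user.params.hasVisited) {
    conv.add('Welcome back! Ready to pick up where we left off?');
  } else {
    conv.add('Hi, nice to meet you! I can give you information about your city.');
    conv.user.params.hasVisited = true;
  }
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);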

Multimodal Design With Google Actions: Rich Responses Using Cards

Creating conversations is a genuinely hard task, and it is an entire design process that can take a lot of time. For voice assistants, the process is even more complex because you can interact with the user through both sound and a display. When you combine those two modes of interaction, you are creating a multimodal experience.

In this article, we will learn how to create engaging conversations using multimodality in our Google Action, through its Rich Responses using Cards.
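As a rough sketch of what such a card response can look like, again assuming the @assistant/conversation library (the handler name, texts, and image URL are placeholders):

const { conversation, Card, Image } = require('@assistant/conversation');
const functions = require('firebase-functions');

const app = conversation();

app.handle('show_card', (conv) => {
  // A spoken/text prompt should accompany the visual response.
  conv.add('Here is some more information on your screen.');
  conv.add(new Card({
    title: 'Multimodal design',
    subtitle: 'Rich Responses with Cards',
    text: 'Cards combine text and an image with the spoken prompt.',
    image: new Image({
      url: 'https://example.com/card-image.png',
      alt: 'Card image',
    }),
  }));
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);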

Multimodal Design With Google Actions: Visual Selection Responses Using Collections

Creating conversations is a genuinely hard task, and it is an entire design process that can take a lot of time. For voice assistants, the process is even more complex because you can interact with the user through both sound and a display. When you combine those two modes of interaction, you are creating a multimodal experience.

In this article, we will learn how to create engaging conversations using multimodality in our Google Action, through its Visual Selection Responses using Collections.
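A sketch of such a visual selection response might look like the following, assuming the @assistant/conversation library and a session type named drink_option defined in the Actions Builder project (all names, keys, and descriptions here are illustrative):

const { conversation, Collection } = require('@assistant/conversation');
const functions = require('firebase-functions');

const app = conversation();

app.handle('show_drinks', (conv) => {
  conv.add('Which drink would you like?');
  // Provide the entries behind each key as a runtime type override, so the
  // Assistant knows how to render each item and match the user's selection.
  conv.session.typeOverrides = [{
    name: 'drink_option',
    mode: 'TYPE_REPLACE',
    synonym: {
      entries: [
        {
          name: 'CAPPUCCINO',
          synonyms: ['cappuccino'],
          display: { title: 'Cappuccino', description: 'Espresso topped with milk foam' },
        },
        {
          name: 'LATTE',
          synonyms: ['latte'],
          display: { title: 'Latte', description: 'Espresso with steamed milk' },
        },
      ],
    },
  }];
  conv.add(new Collection({
    title: 'Our drinks',
    items: [{ key: 'CAPPUCCINO' }, { key: 'LATTE' }],
  }));
});

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);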

Local Debugging on a Google Action

Google Actions can be developed using Firebase Cloud Functions or a REST API endpoint. Firebase Cloud Functions is Google's implementation of serverless functions, available in Firebase. Google recommends using Firebase Cloud Functions for Google Action development.

This is a very lightweight and powerful approach to developing our Google Action. However, working locally with serverless functions like Firebase Cloud Functions is not straightforward.
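One common workaround, sketched below, is to mount the same conversation handler on a plain Express server, run it locally (or use the Firebase Local Emulator Suite via firebase emulators:start --only functions), and expose it through a tunnelling tool such as ngrok so the webhook configured in the Actions console can reach it. This assumes the @assistant/conversation app object can be used as an Express request handler:

const express = require('express');
const bodyParser = require('body-parser');
const { conversation } = require('@assistant/conversation');

const app = conversation();

app.handle('start_scene_initial_prompt', (conv) => {
  conv.add('Hello from my local machine!');
});

// Serve the same fulfillment from a local Express server on port 3000.
// Expose it with a tunnelling tool (for example ngrok) and point the
// webhook URL in the Actions console at the public tunnel address.
express().use(bodyParser.json(), app).listen(3000);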

Google Action With Node.js

Google Actions can be developed using Firebase Cloud Functions or a REST API endpoint. Firebase Cloud Functions is Google's implementation of serverless functions, available in Firebase. Google recommends using Firebase Cloud Functions for Google Action development.

In this post, we will implement a Google Action for Google Assistant using Node.js, yarn, and Firebase Cloud Functions. This Google Action is basically a Hello World example.
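A minimal fulfillment of this kind might look like the sketch below, assuming the @assistant/conversation library and a webhook handler registered in Actions Builder (the handler name is a placeholder that must match your project's configuration):

const { conversation } = require('@assistant/conversation');
const functions = require('firebase-functions');

const app = conversation();

// The handler name must match the webhook handler configured in Actions Builder.
app.handle('start_scene_initial_prompt', (conv) => {
  conv.add('Hello World from the fulfillment!');
});

// Deploy with: firebase deploy --only functions
exports.ActionsOnGoogleFulfillment = functions.https.onRequest(app);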

Training NLP Engines Without All of the Answers

Natural Language Processing (NLP), or Natural Language Understanding (NLU), is a subset of Artificial Intelligence (AI). The technology offers many benefits, and I am surprised by the pushback from technical people whenever deploying it comes up. I suppose there is a difference between learning about a technology in academia and the complexity of actually deploying it.

So, how do we get past all the pushback when chatbots are having conversations and intelligent automation promises to be better than old-school EAI and SOA?

4 Perspectives When Selecting a Conversational AI Platform

 Businesses are quickly acknowledging the importance of Conversational AI (CAI) to increase their customer engagement and revenues. The question is no longer whether to deploy CAI, but rather which platform to use and how to leverage its capabilities. 

In this series, get insight into important aspects of a conversational AI platform that buyers often overlook. For example, what does language support really mean? What is localization? How do different deployment models impact the TCO? And perhaps most importantly: how can the CAI platform help not only during the first development sprints, but across the entire bot lifecycle?

Modifying Your Virtual Assistant to Use Custom Entities – Here’s How You Do It in Teneo

Virtual assistants and chatbots are great tools for improving customer service in any company. However, before one can become a great customer service agent, there is some work to do to make it fit your business needs.

Businesses today have their own ways of naming things, and just as you would train any new co-worker on the business vocabulary, you need to train the virtual assistant. That is where customization starts.

The Impact of the Covid-19 Pandemic on Conversational AI

As a direct result of Covid-19, enterprises are advancing their plans to digitize and automate parts of their business not just to achieve better operational efficiencies, but to protect themselves from disruptions.

During the pandemic, many companies experienced a significant increase in pressure from customers while their number of available employees decreased. Many contact centers were unable to cope with demand or were closed because of lockdown restrictions, leading to long delays in customer service queries, which dramatically affected the customer experience.

Chatbot Scripting: Storing Input Parameters From Client Applications in Teneo

Besides the user's natural language inputs, client applications can also include input parameters in their requests to Teneo. The values of these input parameters can then be stored in, for example, global variables, so that they can be used by flows, integrations, and so on.

More details on how client applications can interact with Teneo can be found on the Teneo Engine client API page in Deploy your bot.
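From the client side, sending such a parameter can be as simple as adding an extra key to the input object. The sketch below assumes the @artificialsolutions/tie-api-client Node.js module and that extra keys in the input object are forwarded to the engine as request parameters; the engine URL and parameter name are placeholders:

const TIE = require('@artificialsolutions/tie-api-client');

const teneoEngineUrl = 'https://example.com/my-teneo-engine/';

// Send the user's text together with an extra 'userCity' request parameter,
// which the Teneo solution can pick up and store in a global variable.
TIE.sendInput(teneoEngineUrl, null, { text: 'What is the weather like?', userCity: 'Barcelona' })
  .then((response) => {
    console.log(response.output.text);
  })
  .catch((error) => {
    console.error(error);
  });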

Chatbot Integrations – Adding an Integration in Teneo

If you want your chatbot to know the answers to more than just the things you teach it about your business, you can integrate it with other services. Why reinvent the wheel? With integrations, your bot can draw on loads of information that may change from day to day, without you having to constantly update your solution manually.

Let’s look into how you add an integration in Teneo Studio.

Teach Your Conversational AI Application to Store Information in Teneo

To create a humanlike conversation, you need to teach your virtual assistant to remember things the user says, for example the user's name:

User: I want a small cappuccino.
Bot: Ok, what name shall I note for the order?
User: Amber.
Bot: Thanks for your order, Amber! A small cappuccino will be ready for pickup in 5 minutes.

Creating Your Own Language Objects in Teneo

Language objects are the building blocks of language conditions. Sometimes you may not find the language objects you need in the Teneo Lexical Resources (TLR), for the simple reason that they do not exist. The Teneo Lexical Resources have primarily been designed to cover general language expressions and common phrases, so when you want to use more domain-specific words in your dialog, you may not find existing language objects for them.

The good news is that you can easily create missing language objects yourself. Once created, you can use them in your current solution. In fact, you can re-use them in other solutions as well!

Creating a Non-English Chatbot Solution in Teneo

When you created your first solution you chose English as the bot’s language and the tutorials on this site assume your bot understands English. However, Teneo supports many more languages, which you can read about here: Languages.

The evaluation environment that is created for you when you sign up contains the resources needed to offer advanced support for Dutch, English, French, German, Norwegian, and Swedish. On this page, we will show you how to create a German solution, but the same principle applies to French as well.