Have you ever had a conversation with some kind of AI (it’s totally fine if you haven’t)? Most conversations with a bot (or flows, as we call them) only require the user to click buttons. The user can always type a freeform message into the bot, but, depending on how well the bot is written, it usually can’t respond properly if it doesn’t understand the question. So, here are a few tips on how to make your bot more human.


Connecting the dots for your bot

Let’s start with some simple tricks. Whenever the bot asks a yes/no question, it will understand a variety of yes/no answers. For example: “Sure, Yep, Yeah” or “No, Nope, Nah”.
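For illustration, that kind of matching can be as simple as a lookup against a small list of synonyms. The word lists and function name below are just a sketch, not our production code:

```typescript
// Minimal sketch of yes/no synonym matching (word lists are illustrative).
const YES_WORDS = new Set(["yes", "sure", "yep", "yeah", "ok", "okay"]);
const NO_WORDS = new Set(["no", "nope", "nah"]);

function matchYesNo(input: string): "yes" | "no" | null {
  const normalized = input.trim().toLowerCase().replace(/[.!?]/g, "");
  if (YES_WORDS.has(normalized)) return "yes";
  if (NO_WORDS.has(normalized)) return "no";
  return null; // not a yes/no answer, fall through to free-form handling
}
```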

To search for something (a job, for instance), the user can type a free-form sentence into the bot, for example “Any chef jobs in London?”, or they can follow the job search flow and enter the keywords and the location separately.
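Parsing a sentence like that properly is what the NLP part described below is for, but the basic idea can be sketched with a naive pattern match (purely illustrative):

```typescript
// Naive sketch: pull a keyword and a location out of a "... jobs in ..." sentence.
// Real inputs are messier, which is where the NLP described below comes in.
function parseJobQuery(input: string): { keyword: string; location: string } | null {
  const match = input.match(/(?:any\s+)?(.+?)\s+jobs?\s+in\s+(.+?)\??$/i);
  if (!match) return null;
  return { keyword: match[1].trim(), location: match[2].trim() };
}

// parseJobQuery("Any chef jobs in London?") -> { keyword: "chef", location: "London" }
```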

Another neat trick is to spell-check all keywords that the user enters. So, if they enter ‘Bartnder’, the bot will say ‘Bartender, got it!’
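A simple way to do this kind of correction is to compare the input against a list of known keywords by edit distance, roughly like this (a sketch, not the exact spell-checker we used):

```typescript
// Sketch of fuzzy keyword correction via edit distance (illustrative, not the exact
// spell-checker we used). Picks the closest known keyword within a small tolerance.
function editDistance(a: string, b: string): number {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                  // deletion
        dp[i][j - 1] + 1,                                  // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

function correctKeyword(input: string, knownKeywords: string[], maxDistance = 2): string | null {
  const lower = input.toLowerCase();
  let best: { word: string; dist: number } | null = null;
  for (const word of knownKeywords) {
    const dist = editDistance(lower, word.toLowerCase());
    if (dist <= maxDistance && (!best || dist < best.dist)) best = { word, dist };
  }
  return best ? best.word : null;
}

// correctKeyword("Bartnder", ["Bartender", "Chef", "Waiter"]) -> "Bartender"
```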

Also, the database we used in our latest project contains more than 50,000 locations around the world, each merged with its alternate names. If a user types in, for example, ‘Big Apple’ when asked for their desired job location, the bot will understand it and respond with ‘New York, right?’
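Conceptually, that lookup is just an alias table sitting in front of the location search, something like this (toy data, of course):

```typescript
// Sketch of alias-aware location lookup (toy data; the real database has 50,000+ locations).
const LOCATION_ALIASES: Record<string, string> = {
  "big apple": "New York",
  "nyc": "New York",
  "the big smoke": "London",
};

function resolveLocation(input: string): string {
  const key = input.trim().toLowerCase();
  return LOCATION_ALIASES[key] ?? input; // fall back to the raw input if no alias matches
}

// resolveLocation("Big Apple") -> "New York"  -> bot replies "New York, right?"
```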

Natural Language Processing

What happens if none of the triggers in the bot matches the user’s input? In the case of MC Chatbot, the bot sends whatever the user typed to our backend, which forwards this input to Dialogflow (Google’s online NLP service). In Dialogflow, we have defined many different intents, for example ‘job-search’, ‘browse-careers’, and ‘change-location’. For each of these intents, we have defined 15 to 20 example sentences (for some intents, this number is higher). If a freeform input is successfully mapped to an intent, the bot responds with the appropriate flow.
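If your backend runs on Node, that forwarding step might look roughly like this with Google’s official @google-cloud/dialogflow client (the project and session IDs below are placeholders):

```typescript
// Rough sketch of forwarding unmatched input from a Node backend to Dialogflow,
// using the official @google-cloud/dialogflow client (IDs are placeholders).
import { SessionsClient } from "@google-cloud/dialogflow";

const sessionClient = new SessionsClient();

async function detectIntent(projectId: string, sessionId: string, text: string) {
  const session = sessionClient.projectAgentSessionPath(projectId, sessionId);
  const [response] = await sessionClient.detectIntent({
    session,
    queryInput: { text: { text, languageCode: "en" } },
  });
  const result = response.queryResult;
  return {
    intent: result?.intent?.displayName ?? null,       // e.g. "job-search"
    confidence: result?.intentDetectionConfidence ?? 0,
    fulfillmentText: result?.fulfillmentText ?? "",
  };
}
```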

Finally, if Dialogflow still can’t categorize the user’s input, we run a sentiment analysis on it. It works like a charm.

Sentiment Analysis

For the sentiment analysis, we used the Sentiment JS library. This library returns a sentiment score for a sentence. We attach this score to the result and send it back to the bot. At this point, the bot only knows whether the sentiment was negative, positive, or neutral, and it responds with an answer appropriate for that sentiment. For example, for a negative sentiment, it redirects the user to a flow where they can leave feedback and say what they weren’t satisfied with.
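With the Sentiment package from npm, the scoring and bucketing can be as simple as this (the zero threshold and the feedback routing are illustrative, not our exact setup):

```typescript
// Sketch of scoring input with the "sentiment" npm package and bucketing the result.
import Sentiment from "sentiment";

const sentiment = new Sentiment();

function classifySentiment(text: string): "negative" | "positive" | "neutral" {
  const { score } = sentiment.analyze(text);
  if (score < 0) return "negative";
  if (score > 0) return "positive";
  return "neutral";
}

// classifySentiment("This is terrible, nothing works") -> "negative"
// -> the bot redirects the user to the feedback flow
```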

The bot doesn’t always respond in the same way to the same input. For some inputs, we have defined a list of answers that the bot randomly chooses from.
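A sketch of how that could look (the reply list is made up for the example):

```typescript
// Sketch of a randomized reply picker (the reply list is made up for the example).
const GREETING_REPLIES = [
  "Hey there! 👋",
  "Hi! How can I help?",
  "Hello! Looking for a new job?",
];

function randomReply(replies: string[]): string {
  return replies[Math.floor(Math.random() * replies.length)];
}

// randomReply(GREETING_REPLIES) -> one of the three greetings above
```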

To add a little oomph, we’re also using rich media (emojis, pictures, videos) in our bot as part of its responses.

Combining the little things

Making your bot feel more human doesn’t have to be a hassle. There are a lot of free libraries and services that can help your development team create a bot that behaves almost like a human.

 
