In this post series, I will show how to build a chatbot that provides self-service for football information. You will be able to ask the bot about results, fixtures, standings and more. The bot will retrieve that information from a backend API service and display it back to you in one of the many available Channels.
Digital Assistant vs Chatbot
First things first, let's clarify some of the concepts.
A Chatbot is a single-purpose, user-initiated program that solves simple business problems through service automation and self-service.
A Digital Assistant is user-oriented. It can assist users with various related or unrelated tasks, and conversations are context aware. A Digital Assistant can have many different skills (chatbots) and can route the conversation to and from those skills.
Learn more about it here and here!
Utterance, Intent and Entity
Utterance: anything the user says. The words or sentences the user types are the utterances.
This is very important, as utterances can vary significantly from user to user. How many ways can one ask for the game result?
- “What was the score?”
- “What’s the result?”
- “Who won?”
- “Did they win?”
Intent: an intent is the user's intention. If the user asks “Who won the game?”, the intent is to get the game result. In this case we could call this intent askResults or showResults.
An intent is not the keyword the user uses, but rather the high-level goal or intention of the sentence.
Entity: an entity is any field of data that allows us to extract important information about the intent. If the user input is “What is the result from the Real Madrid game yesterday?”, then yesterday can be a time entity, and Real Madrid an entity that answers the who question for the showResults intent.
The Oracle Digital Assistant Natural Language Processing (NLP) engine is responsible for interpreting the user's text input based on the defined Utterances, Intents and Entities!
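To make the relationship concrete, here is an illustrative breakdown of how the engine resolves that last example. This is just a sketch of the resolution, not actual ODA configuration syntax:

# Illustrative only - not ODA syntax
utterance: "What is the result from the Real Madrid game yesterday?"
resolvedIntent: showResults
entityMatches:
  TEAM: "Real Madrid"
  DATE: "yesterday"  # a date-type entity (more on TIME vs DATE further down)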
Mini Project Breakdown
There are three components to this implementation. I will create a single-purpose chatbot (one skill):
- Defining the Skill Intents, Utterances, Entities and a Dialog Flow.
- Backend Integration
- Channels
The backend integration will bring the real-time football data into the conversation. I subscribed to a Football API via RapidAPI, and we will fetch all the data from there. The initial implementation will rely on a mockup (more precisely, on hardcoded values).
Finally, I will create a communication channel through which we can interact with the Bot.
I will address these three components in three separate posts. This one focuses on Intents, Utterances, Entities and a Dialog Flow.
Intents & Utterances
I created one Skill called Football Bot, which holds all the Intents:
- Show Results
- Show Fixtures
- Show Table
- Show Best Scorer
- Welcome
For each Intent, I defined several Utterances.
We can test the Intents, and whenever there is ambiguity, we can add those Utterances as examples to the selected Intent (as seen above).
Finally, I also defined one Entity called team, which holds the team for which we want to retrieve data. This is an interesting part, as we need some control here: a free-format value would cause issues later on when calling the API. For this small exercise, I added two possible values.
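As a sketch of how a dialog flow can consume such an entity (the state name, prompt and variable below are illustrative, not part of the flow used in this post), a flow variable can be typed with the entity itself, so user input only resolves against the configured team values:

context:
  variables:
    team: "TEAM"  # variable typed with the custom TEAM entity
states:
  askTeam:
    component: "System.Text"
    properties:
      prompt: "Which team?"
      variable: "team"  # input must match one of the entity's values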
Dialog Flow
Using the Skill Builder, you define the framework of the user-skill exchange in OBotML, Digital Assistant’s own implementation of YAML. This is a simple markup language, one that lets you describe a dialog both in terms of what your skill says and what it does. – from here.
In a real-life scenario, before embarking on the task of building the Dialog Flow, there should be a design phase where business and IT agree on the specs. Conversation Design is a brand new tool (at least to me) that gives the business the ability to define the flow graphically, so that at a later stage the developer can implement its definition. Quite handy.
Since this is not a real-life scenario, I will just use the dialog flow designer with OBotML.
I will skip the explanations of the structure of the YAML file for this exercise.
User starts interaction with a Welcome Greeting
In this dialog flow, the user starts with an initial greeting. After greeting back, the bot asks the user for their team, presents a list of choices, and proceeds based on the selection. This path follows the Welcome Intent and does not really make use of the other Intents, as we are presented with an options screen.
metadata:
  platformVersion: "1.0"
main: true
name: "FootballBot"
context:
  variables:
    iResult: "nlpresult"  # holds the intent/entity resolution from the NLP engine
    name: "string"        # holds the team name the user enters
states:
  intent:
    component: "System.Intent"
    properties:
      variable: "iResult"
    transitions:
      actions:
        askResults: "Results"
        askFixtures: "Fixtures"
        askTable: "Table"
        askScorer: "Scorer"
        welcome: "Welcome"
  Welcome:
    component: "System.Output"
    properties:
      text: "Hello, welcome to the Football Bot!!"
      keepTurn: true
    transitions:
      next: "team"
  team:
    component: "System.Text"
    properties:
      prompt: "What is your Team?"
      variable: "name"
    transitions:
      next: "printTeam"
  printTeam:
    component: "System.Output"
    properties:
      text: "Fantastic Club!"
      keepTurn: true
  options:
    component: "System.List"
    properties:
      options: "Results,Fixtures,Table,Best Scorer"
      prompt: "What do you want to check for ${name.value}?"
    transitions:
      actions:
        Results: "Results"
        Fixtures: "Fixtures"
        Table: "Table"
        "Best Scorer": "Scorer"
        unresolvedIntent: "unresolved"
The “Hello” takes us to the Welcome Intent, which transitions to the Team State.
We are then presented with the available Options.
Notice that when I choose Fixtures, the response is immediately followed by the Options menu again. This happens because of the following line:
keepTurn: true
Below is the behavior when keepTurn is false: the bot responds and hands control, that is, the next turn, back to the user.
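As a sketch of that variant (the state body here is my own placeholder, since the post keeps the real responses hardcoded), the Fixtures state could release the turn like this:

  Fixtures:
    component: "System.Output"
    properties:
      text: "Fixture info goes here (hardcoded for now)"
      keepTurn: false  # release the turn: wait for the user before continuing
    transitions:
      next: "options"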
User can start the interaction by stating their intent
This approach assumes the user already knows what they can ask, and goes straight to the point!
Notice that here the Entities play a role. I defined two entities for this intent (TEAM and TIME). The bot recognizes the TEAM value, but I guess the word yesterday is too vague to set a time, hence its null value.
EDIT: I have used the wrong entity. TIME refers to the actual clock time. The entity DATE does refer to a date and would understand the word yesterday.
Note: I made a couple of changes to the original dialog flow yaml to adjust for this approach.
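For example (my own sketch, since the post's actual states stay hardcoded), a state can read the matched team directly from the nlpresult variable:

  Results:
    component: "System.Output"
    properties:
      # entityMatches exposes the entities the NLP engine resolved from
      # the utterance; TEAM is the custom entity defined earlier
      text: "Showing results for ${iResult.value.entityMatches['TEAM'][0]}"
    transitions:
      return: "done"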
These are different approaches to reach the same end result. In the end, it is all about the design decisions and the kind of experience the bot should provide to the end user.
What if the interaction goes down an unexpected path? Well, for that we should always have an unresolved state, which gracefully handles these situations:
  unresolved:
    component: "System.Output"
    properties:
      text: "You managed to break this - hope you are happy :)"
      keepTurn: true
    transitions:
      return: "done"
One last example, where the Bot identifies the showTable Intent and directly shows the (still hardcoded) answer!
Conclusion
In this first post we got a running bot with mocked-up responses. The dialog flow does not follow many best practices, but the idea here is just to explore the required steps and showcase the capabilities!
This is built using a Football API as the data source, but ultimately one could bring in any sports API, and even create a Digital Assistant that routes across all those different sports (Skills).
What’s next?
In the next post, I will use the Node.js SDK for custom components to implement the backend integration and turn this into a bot with real data 🙂