This post is a back-to-basics intro to Entities and their relevance in the context of a conversational interface.
Why are Entities so important?
Entities allow us to extract information from user input. When the user provides an input, the NLP engine maps that utterance to an intent, but the same input can potentially contain much richer information. Entities are the mechanism we use to extract it.
That information adds meaning and context to the Intent, as it describes it in more detail.
On top of that, we do not lose any information. One of the worst things that can happen in a conversational interface is for the user to provide information that the chatbot does not acknowledge.
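To make this concrete, here is a minimal sketch of what an NLP engine might return for a single utterance: the resolved intent plus the entity values extracted alongside it. The type and field names here are hypothetical illustrations, not ODA's actual API.

```typescript
// Hypothetical shape of an NLU result: one intent plus extracted entities.
// These names are illustrative only, not ODA's actual API.
interface EntityMatch {
  entity: string; // e.g. "CURRENCY", "PERSON", "DATE"
  value: string;  // the raw text span matched in the utterance
}

interface NluResult {
  intent: string;
  entityMatches: EntityMatch[];
}

// For "Send $500 to John tomorrow" we might expect something like:
const example: NluResult = {
  intent: "SendMoney",
  entityMatches: [
    { entity: "CURRENCY", value: "$500" },
    { entity: "PERSON", value: "John" },
    { entity: "DATE", value: "tomorrow" },
  ],
};
```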
User Input Examples
Above we have three utterances as an example. As humans, we can very easily identify the Intent and see all the relevant information those utterances provide.
Below, the colors highlight the potential Entities that we should be able to extract.
How does ODA extract these values?
ODA provides the following built-in Entities:
- Address
- Currency
- Date
- Duration
- Email
- Number
- Person
- Phone number
- Set
- Time
- URL
- Yes_No
- Location
The above entities allow us to extract information that fits their type. For example, an Entity of type Currency will identify currency values in the utterance. Similarly, the Email entity will identify email addresses in the input.
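As a rough illustration of what a built-in entity does conceptually, here is a toy matcher that pulls currency values and email addresses out of an utterance with regular expressions. This is my own sketch to show the idea, not ODA's implementation.

```typescript
// Toy stand-ins for built-in entities: each one knows how to find
// values of its type in free text. Not ODA's implementation.
const builtInMatchers: Record<string, RegExp> = {
  CURRENCY: /[$€£]\s?\d+(?:[.,]\d+)?/g,
  EMAIL: /[\w.+-]+@[\w-]+\.[\w.]+/g,
};

function extract(utterance: string, entity: string): string[] {
  const pattern = builtInMatchers[entity];
  return pattern ? utterance.match(pattern) ?? [] : [];
}

console.log(extract("Transfer $500 to jane@example.com", "CURRENCY")); // ["$500"]
console.log(extract("Transfer $500 to jane@example.com", "EMAIL"));    // ["jane@example.com"]
```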
On top of this, there are also Custom Entities:
- Value list
- Derived
- Entity list
- Regular expression
- Composite bag
- Dynamic entity
- ML Entities
With custom entities we can define Value Lists, Regular Expressions and many other types, including the newly released ML Entity (which I will cover in the next post).
More info can be found in the ODA documentation.
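To give a feel for two of these types, here is a small sketch of what a Value List and a Regular Expression entity conceptually capture. The pizza-size and order-number examples are hypothetical, and this is not ODA's configuration format.

```typescript
// A Value List entity maps canonical values to the synonyms a user may type;
// a Regular Expression entity matches values by pattern instead.
// Both are illustrative sketches, not ODA's configuration format.
const pizzaSize: Record<string, string[]> = {
  Small: ["small", "smallest", "tiny"],
  Large: ["large", "big", "family size"],
};

const orderNumberPattern = /ORD-\d{6}/; // e.g. matches "ORD-123456"

// Resolve an utterance against the value list by synonym lookup.
function resolveSize(utterance: string): string | undefined {
  const text = utterance.toLowerCase();
  for (const [canonical, synonyms] of Object.entries(pizzaSize)) {
    if (synonyms.some((s) => text.includes(s))) return canonical;
  }
  return undefined;
}

console.log(resolveSize("I'd like a big one")); // "Large"
console.log("where is ORD-123456?".match(orderNumberPattern)?.[0]); // "ORD-123456"
```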
Does it work by default?
The answer is no. By default, the NLP Engine will not extract any entities from the utterance, simply because it does not know what to look for. We need to associate Entities with Intents.
Fortunately, ODA is a low-code platform, which makes this very easy to do.
How to add an Entity to an Intent?
In the Intent tab, we have the option to add Entities. This is visible in the top right corner of the screen.
You can add any of the built-in or custom-built ones. Once this is done, do not forget to train the model.
What happens then?
Once we have associated the desired Entities with the Intent, the NLP engine is able to extract the information from the utterance into the respective entity slots.
This is called Slot Filling or Entity Slotting, and it is one of the most important steps in building a successful conversational interface.
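Here is a hedged sketch of the idea, reusing the hypothetical shapes from earlier: each intent declares the entity slots it needs, extracted matches fill those slots, and anything still empty is what the bot should prompt for next. Again, this is illustrative only; in ODA this happens for you once the entities are associated with the intent and the model is trained.

```typescript
// Hypothetical association of intents to the entity slots they need,
// mirroring what the Intent tab configures in ODA.
const intentSlots: Record<string, string[]> = {
  SendMoney: ["CURRENCY", "PERSON", "DATE"],
};

interface EntityMatch { entity: string; value: string; }

// Fill each slot from the extracted matches; missing slots stay undefined
// and tell us what to prompt the user for next.
function fillSlots(intent: string, matches: EntityMatch[]) {
  const slots: Record<string, string | undefined> = {};
  for (const slot of intentSlots[intent] ?? []) {
    slots[slot] = matches.find((m) => m.entity === slot)?.value;
  }
  return slots;
}

// "Send $500 to John" gives no date, so the DATE slot remains unfilled:
const slots = fillSlots("SendMoney", [
  { entity: "CURRENCY", value: "$500" },
  { entity: "PERSON", value: "John" },
]);
console.log(slots); // { CURRENCY: "$500", PERSON: "John", DATE: undefined }
```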