The Oracle Digital Assistant (ODA) provides out-of-the-box native support for several languages. In this post I will cover the necessary steps to implement a fully multilingual chatbot.
“You can design both single-language and multi-language skills. For understanding user input, you can either use Oracle Digital Assistant’s native support for various languages or a translation service. For output, you typically define the strings for each target language in resource bundles.”
The documentation is a good place to start to understand all the details of the multilingual support. There are two approaches: (1) use a translation service, or (2) use the native language support.
Here I will focus on the native language support and cover all the necessary steps to have a fully working multilingual chatbot.
Create Skill
As always, the first step is to create a skill 🙂
We always need a default primary language; in this case, English.
From the drop-down we can see the available natively supported languages.
Create Intents
For this demo I will create a Greeting intent. Notice that we can only enter utterances for the chosen language, English.
Add Additional Languages
Under Configuration > General, add more languages.
After the above step, we can go back to the Intents page, where we now see the option to enter utterances in the additional language.
Resource Bundles
The native language support only covers the ODA understanding user input. Bot output is handled via resource bundles. Think of these as lookup tables where we provide the output strings in each of the available languages.
I create a resource bundle with a Greeting01 key for the default (English) language.
Then, for the same key, I add a new entry for Portuguese.
At the end it should look like the picture below. The principle is the same whether we support two or all of the native languages.
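To make the lookup-table idea concrete, here is a conceptual sketch of the bundle contents (entries are actually created through the ODA resource bundle UI; the example values are illustrative, not taken from the skill):

# Conceptual view of the resource bundle (managed in the ODA UI, not as YAML):
Greeting01:
  en: "Hello! How can I help you?"   # default (English) entry
  pt: "Olá! Como posso ajudar?"      # Portuguese entry for the same key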
If you try to train the model at this point, you will see an error stating that at least two intents are required to train the model. Hence, I add another intent (Exit).
Dialog Flow
The dialog flow is where we glue all these pieces together. The most interesting part is that the ODA automatically identifies the language; the only thing we need to do is pass the reference to the resource bundle.
In the code below, we identify the intent (greeting) and transition to the printGreeting state, which uses an output component*. The text property holds the reference to the resource bundle: text: “${rb.Greeting01}”.
We don’t need to worry about language detection; the ODA takes care of that and picks the appropriate language.
*I just had my ODA updated to 21.06 and it seems the default output component is now System.CommonResponse, but one could also use System.Output.
metadata:
  platformVersion: "1.1"
main: true
name: "ML_Demo"
context:
  variables:
    iResult: "nlpresult"   # holds the NLP resolution result
states:
  # Resolve the user's intent and branch to the matching state.
  intent:
    component: "System.Intent"
    properties:
      variable: "iResult"
    transitions:
      actions:
        greeting: "printGreeting"
        exit: "exit"
  # Output the greeting; ODA picks the resource bundle entry that
  # matches the language detected for the session.
  printGreeting:
    component: "System.CommonResponse"
    properties:
      keepTurn: true
      metadata:
        responseItems:
        - type: "text"
          text: "${rb.Greeting01}"
    transitions:
      return: "done"
  exit:
    component: "System.CommonResponse"
    properties:
      metadata:
        responseItems:
        - type: "text"
          text: "Sorry, I am giving up!"
    transitions:
      return: "done"
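For completeness, here is a minimal sketch of the same printGreeting state written with System.Output, the older default output component mentioned above; the resource bundle reference works the same way:

# The same state using System.Output instead of System.CommonResponse.
printGreeting:
  component: "System.Output"
  properties:
    text: "${rb.Greeting01}"   # ODA resolves the key in the session language
  transitions:
    return: "done"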
Testing the Conversation
This is the principle behind a multilingual implementation: ODA gives us the tools to implement a true multilingual chatbot with minimal effort.
For more information, please ask Artie, the ODA documentation chatbot.
Thanks Daniel, this is exactly the post I was looking for to implement multiple languages in the same skill.
Glad to be of help 🙂
Hi Daniel,
I have implemented the POC, but the bot is not detecting the language automatically.
The good thing is that if I give an utterance in French, it resolves to the correct intent, but it gives the response in English!
Is there any other setting or configuration required to auto-detect the language?
You need to use resource bundles and add keys for both languages. It should work; no other configuration is required!
Yes, I did that. In brief, what I understand is that the language is detected at the beginning of the session.
For example, if the user starts the first conversation in French, the responses will be in French only, even if the user then asks a question in English.
The current ODA does not allow switching languages; the language is detected from the first utterance of the conversation and kept until the end of the session, even if the user switches languages.
This works fine in the ODA preview tester, but when I route it to the sample client, the language is not detected.
From the blogs, it seems to have something to do with profile.languageTag or profile.locale, which I am unable to figure out.
https://docs.oracle.com/en/cloud/paas/digital-assistant/use-chatbot/languages-and-skills1.html#GUID-7A5D3B8D-F429-408A-9701-87820C86C123
“The language is not automatically detected if the skill or digital assistant is accessed through a channel where the profile.languageTag or profile.locale variable has been set.”
Thanks
Regarding the first part, you are correct. One cannot change the session language once it has been identified.
The profile.languageTag is mainly used for detecting the language when using translation services. If you are using the native NLP languages, you don’t need those tags.
There are other alternatives, such as allowing the user to choose the language they want to use for the bot conversation (see the sketch below).
You can also think about using profile.locale to detect the user’s locale/language and send a hidden message to the bot in that language, so that the conversation starts in the target language without user interaction.
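As a rough, hypothetical sketch of the first alternative (the state name, option labels, and writing the choice to profile.languageTag are my assumptions, not something from the post), the flow could open with a language picker:

# Hypothetical sketch: ask the user to pick the conversation language up front,
# then store the choice so the rest of the session sticks to it.
askLanguage:
  component: "System.List"
  properties:
    prompt: "Please choose a language / Por favor escolha um idioma"
    options:
    - label: "English"
      value: "en"
    - label: "Português"
      value: "pt"
    variable: "profile.languageTag"   # pins the session language to the choice
  transitions:
    next: "intent"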