
Orbita's NLP node is capable of performing named entity recognition (NER) on user-defined entities.

Named Entity Recognition (NER) involves identifying the parts of a text (utterances) that can be categorized into preset groups (user-defined list values/entities).

You might encounter missing slot values when using Google or Alexa NLP. Using Orbita NLP avoids such setbacks: it enables you to capture all the list values/entities and continue the flow based on the extracted values.

NLP node

The Orbita NLP node extracts the Entities(Lists) from the utterance.

Input

Utterance – The utterance from which you want to extract the entities.
You can enter static text or pass the utterance dynamically from the payload.

Dynamic utterances should be passed in msg.payload.utterance (see the function node in the Sample flow section below).

Entity(Lists) – All the Entities(Lists) from the project are listed in the dropdown.
Select the Entities(Lists) you want the NLP node to filter on.
If you leave this field empty, the node will try to match the words in the utterance against all the Entities(Lists) in the project.

Confidence – Enter a value between 0 and 1. The node's output contains the words that match the Entities(Lists) and meet this confidence threshold. The threshold is inclusive (for example, if you enter 0.3, you will get results with a confidence score of 0.3 and above).

Output

The output path of the node is msg.payload.data.nlp
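While building the flow, you can wire a function node (or a debug node) after the NLP node to inspect what it produced. The snippet below is a minimal sketch; it assumes only that the NLP node has already populated msg.payload.data.nlp as described above.

// Log the NLP node's output so it can be inspected in the debug sidebar.
if (msg.payload && msg.payload.data && msg.payload.data.nlp) {
    node.warn(JSON.stringify(msg.payload.data.nlp, null, 2));
}
return msg;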

Using NLP node in Experience Designer

Using the NLP node, you can filter the entities based on:

  1. User-defined Lists/Entities.

  2. Confidence score.

For illustrative purposes, we have used Google NLP to trigger the intent and then used the NLP node to filter the list values/entities.

An intent that has multiple user-defined entities/list values used in its utterances is shown below.

For more information on creating slots and using them in intents, see How do I create lists (slots) and How to create an intent.

Filter based on user-defined lists/entities

In the intent shown in the screenshot above, we have used four slots, three of which are user-defined and one is a system default. We can use the NLP node in Experience Designer to capture only the lists we are interested in.

For example, suppose you want to know which things and places the users are referring to in their utterances. You can use the NLP node to extract only the things and places.

Let's assume the user says “I saw two big trucks in the forest”. The extracted slots are stored along with confidence scores at msg.payload.data.nlp.
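If you want to act on those slots downstream, a function node placed after the NLP node can pick out just the values you care about. The sketch below is illustrative only: it assumes msg.payload.data.nlp is an array of objects with entity, value, and confidence fields, so adjust the field names to whatever you actually see in your own output.

// Hypothetical example: collect only the "things" and "places" values.
// The entity/value field names are assumptions - inspect your own output first.
var slots = (msg.payload.data && msg.payload.data.nlp) || [];
var wanted = ["things", "places"];
msg.payload.extracted = slots
    .filter(function (slot) { return wanted.indexOf(slot.entity) !== -1; })
    .map(function (slot) { return slot.value; });
return msg;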

Filter based on the confidence score

For the same example, if you would like to extract only the slot values that have a confidence score of more than 0.5, you have to enter 0.501 (because the threshold is inclusive, the value is raised to three decimal places so that a score of exactly 0.5 is excluded).

You can also view the parameters that were used to calculate the confidence score in the scoreCard object.
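If you prefer to apply (or double-check) the confidence cut-off in the flow itself rather than in the node's configuration, a function node after the NLP node can filter on the score. As above, the confidence field name is an assumption; match it to the structure you see at msg.payload.data.nlp.

// Hypothetical downstream check: keep only slots scoring above 0.5.
// The confidence field name is an assumption - verify it against your output.
var slots = (msg.payload.data && msg.payload.data.nlp) || [];
msg.payload.confidentSlots = slots.filter(function (slot) {
    return slot.confidence > 0.5;
});
return msg;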

Sample flow

In the above screenshot, a function node is used to capture the utterance from the intent node and set it on msg.payload.utterance.

The function node contains the following code.

// Copy the user's utterance from the Google NLP request into the field
// that the Orbita NLP node reads.
msg.payload.utterance = msg.payload.queryResult.queryText;
// Lower-case the utterance for consistent matching against the list values.
if (msg.payload.utterance) {
    msg.payload.utterance = msg.payload.utterance.toLowerCase();
}
return msg;

Attached is the sample flow.

