What is Flow Studio?
Orbita Flow Studio is a "no-code" way to build conversational experiences. It is an intuitive, graphical studio with a flowchart-like interface for creating rich, conversational applications. With Orbita Flow Studio's low-code environment, anyone can quickly build powerful voice and chatbot applications using ready-made controls and pre-built templates. Orbita Flow Studio adds to Orbita's already rich set of tools for accelerating the creation of voice- and chatbot-powered healthcare applications.
Flow Studio reduces bottlenecks and lets designers and content authors create and manage conversational applications through its intuitive drag-and-drop visual editor. The Flow Studio visual application builder was created with these users in mind and requires no development expertise.
When should you use Flow Studio?
Any time you are building decision-tree-style conversational dialogs. Typical situations include defining assessments, modeling care protocols, and, in general, any case where you need deterministic logic to drive the conversational flow.
If you are trying to manage conversational state, you should use Flow Studio.
In contrast, Experience Designer should be used when you need to integrate with third-party systems, write functions, or extend the base capabilities of Flow Studio.
Flow Studio and Experience Designer work together. Flow Studio is used to define the conversational decision trees, while Experience Designer hosts the execution. From a programming standpoint, this is similar to the separation of data and code: Flow Studio lets you define the business logic as data, and Experience Designer executes that logic. Experience Designer can also intercept a user intent and switch the flow that is being executed.
Getting started
Log in to Experience Manager.
Select a project.
From the project side navigation menu, select Create > Agents > Flow Studio.
Creating a new flow
Select a flow from the Flow Studio listing page, or click the Create a Flow button to add a new flow.
Enter a Title and Description in the Add a New Flow dialog box and click ADD to open the flow studio. Auto Generate Slots is enabled by default.
While creating or editing a Flow Studio flow, you can add utterances using the Add Utterances button.
These utterances are grouped in an autogenerated intent. The intent can be accessed using the Flow Group Intent.
To use these utterances to invoke the corresponding flow, add the Flow Group Intent node to the Flow Manager node in Experience Designer.
Auto Generate Slots stores the Choice Values and Choice Texts of the Choose Many and Choose One nodes of Flow Studio.
Note: Google Dialogflow limits the number of slots an intent can contain. Therefore, if you use Dialogflow for publishing your interaction model, you should restrict yourself to creating only 20 Flow Studio flows with Auto Generate Slots enabled, because each flow will have a dynamically created unique slot name that is used in the OrbitaFlowStudio intent. If you intend to create and use more than 20 flows, disable Auto Generate Slots and create custom slots for the Choices used in the flow.
Every flow has an ID at the top of the screen and a control panel on the right side of the page.
Note: Zoom options are in the bottom right corner.
Drag and drop controls from the control panel to the canvas to start creating a flow.
To save the flow, click Save.
Editing an existing flow
If a flow already exists, click the vertical ellipsis menu corresponding to the flow name.
The vertical ellipsis menu has the following options:
Flow Details. You can edit the Title and Description, enable or disable Auto Generate Slots, add or remove Utterances, and view the flow creator, Created Date, Last Updated, and the last user that modified the flow.
Edit. Edit the flow.
Delete. Delete the flow. This requires confirmation.
Import/export. The import/export dialog box has 2 tabs:
Export. Click Copy to Clipboard to copy the JSON code.
Import. Paste the JSON code in the space provided to import the flow.
Flow History. Changes made to the flow by a user are saved under the Flow History option.
The Flow History window contains the following details for every action performed on the flow:
Avatar/Initials.
Action. Update, Delete, or Restore.
Username. The user who performed the action.
Date. Created/Modified.
Menu icon:
View. To view the changes.
Restore. To restore the flow to an earlier point.
Controls
Use the following controls to create your flow.
Start/End
Single Input
Options (Choose Many)
Options (Choose One)
Say
Rating
Yes/No
Custom
Evaluate
Custom input field
Sub-flow
Link
Agent
Annotation
Start/End
Start is the beginning control, where your process begins. Only one Start control can be on the canvas.
Info: If multiple Start controls are used, the system selects any of the Start controls randomly to begin the flow with.
End controls are the terminators of the flow, and you can have as many as you would like. The End control triggers the second pin of the Flow Manager node in Experience Designer.
For End control, the Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Info: An End control is just like a Say control except that, with a Say control, you can continue the flow.
Single Input
Stores or processes unlisted values. For example, a user might be asked to record blood pressure. The user might say or type a number.
In the General tab, a button type dropdown is available for each button.
If you select Utterance, Href, or OnClick, the value will not be added to the auto-generated intent.
By default, the button type is set to Default.
Default - The value will be added to the auto-generated intent.
Href - You can give any valid URL in the Value field. The URL will open in a new tab.
Utterance - The utterance value will be triggered without considering the state.
OnClick - You can call any JavaScript function with this option.
For example, choose OnClick from the dropdown and use the following code in the Value field to dock the chatbot.
Code Block
(function(self){ self.orbitaChatBot.dockCollapse() })(window)
In the chatbot, clicking on the button will minimize the dock view chatbot.
Phrases are added to the Auto Generate Slots list.
The Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Input Type. With this option, you can restrict the way the chatbot user interacts with the chatbot. It contains the following types:
None. Displays the question without a predefined input type.
Date. Enables a date picker widget in the chatbot.
Time. Enables a time picker widget in the chatbot.
DateTime. Enables a combination of the date picker and time picker in one widget.
You can choose the Start Date, End Date, and Default date.
None. Choose none to select the current date.
# of days ago. Enter a number to pick a date from the past. (counts from the current date)
Static. You can pick a date from the widget.
In the chatbot, the DateTime picker is rendered as shown below.
Number. Select this if you want the user input to be numbers (and symbols). If the user gives any other input, it will trigger the none intent.
Phone Number. Restricts the user input to a number (& symbols) with an input box.
Email. Validates the user input to be an email.
Textbox. Using this option, you can have the user respond with utterances that will bypass the NLP. This input type will only work in the Orbita Chatbot.
Hidden. Select this if you don't want to render an input text box in the chatbot.
Single Line. Select this if you want a single-line input text box in the chatbot.
Multi Line. Select this if you want a multiple-line input text box in the chatbot.
You can choose the number of rows for the input text box and the number of visible text columns for each row:
5 rows and 12 columns
5 rows and 24 columns
5 rows and 36 columns
Options (Choose Many)
It generates a button with a checkbox for each choice in the chat experience.
Each choice has a dropdown with Default and Utterance types. If you select the Utterance type, the corresponding choice will be excluded from the auto-generated slots so that it can trigger an intent.
For example, consider the choices as in the screenshot below.
If the chatbot user selects Good when prompted with this question, the autogenerated slot will not get triggered. Instead, the intent (if any) that has Good as an utterance will be triggered.
The Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Checkboxes. By default, the multiple select card will be rendered. If you choose yes or no from the dropdown, the bot will say each option and expect a yes or no response from the user.
For the above example, the query text is available in the property msg.payload.queryResult.queryText, for example "10001". The value should be read excluding the initial placeholder digit (the leading 1).
The subsequent digits are a boolean representation of whether each option is checked. In this case, there are four options and only the last option is checked.
Note: If a 'None of the above' option is included in the Choices tab of the Options (Choose Many) control, it is treated as an additional option and its boolean digit is appended at the end. In this example, if the 'None of the above' option is checked, none of the other options is checked for that question.
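As an illustration, the following function node sketch parses such a result string in Experience Designer. It assumes the Dialogflow channel (so the result is in msg.payload.queryResult.queryText) and that the digits after the leading placeholder map to the choices in the order they were rendered; msg.checkedOptions is a name chosen here for illustration.
Code Block
// Minimal sketch: parse a Choose Many result string such as "10001".
const raw = msg.payload.queryResult.queryText;            // e.g. "10001"
const flags = raw.slice(1).split('').map(d => d === '1'); // drop the leading placeholder digit
// flags holds one boolean per rendered option, e.g. [false, false, false, true]
msg.checkedOptions = flags;
return msg;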
You can read the choices output from this control in Experience Designer using the following properties:
msg.alexaRequest.data.session.attributes.renderedButtons.choices
msg.payload.queryResult.outputContexts[0].parameters.renderedButtons.choices
msg.payload.session.attributes.renderedButtons.choices
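For example, a function node like the following sketch could pick whichever of these properties is populated for the current channel. The 'checked' and 'value' fields on each choice, and msg.selectedChoices, are assumptions made for illustration; adjust them to the actual structure of your choices.
Code Block
const _ = global.get('lodash');
// Pick whichever renderedButtons.choices property is populated for this channel.
const choices =
    _.get(msg, 'alexaRequest.data.session.attributes.renderedButtons.choices') ||
    _.get(msg, 'payload.queryResult.outputContexts[0].parameters.renderedButtons.choices') ||
    _.get(msg, 'payload.session.attributes.renderedButtons.choices') ||
    [];
// Assumption: each choice object exposes 'checked' and 'value' fields.
msg.selectedChoices = choices.filter(c => c.checked).map(c => c.value);
return msg;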
Options (Choose One)
It causes the bot to show all the options to the user. The user must pick one of the options provided.
Each choice has a dropdown with Default and Utterance types. If you select the Utterance type, the corresponding choice will be excluded from the auto-generated slots so that it can trigger an intent.
For example, consider the choices as in the screenshot below.
If the chatbot user selects Good when prompted with this question, the autogenerated slot will not get triggered. Instead, the intent (if any) that has Good as an utterance will be triggered.
Example
Bot: What taste do you prefer: Sweet, Sour, or Bitter?
User: Sweet
The choices and values are added to the Auto Generate Slots list if the type is Default.
The Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
Include choice options in the voice output. Disabled by default. Enable this option to have the voice assistant read out the options.
Say Number Before Option. Enable to let the voice assistant say the option number before the option.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Say
Generates content without asking a question; all the other controls that generate content for the user do so as part of a question. You can aggregate a series of Say controls before a question. For example, you could have a loop that gathers a person's schedule.
Example
Say1 > Expression1 > Say2 > Custom Control > Expression1 > Say3 > Question
Flow will aggregate: Say1 + Say2 + Say3 + Question
In chat, each Say is rendered as its own bubble; the aggregation applies when the flow responds to a question, that is, when it outputs to Alexa or Google.
The Say control in Flow Studio does not have the Screen tab in the Multi-Modal Content Editor because the Say control aggregates until a question or an End control occurs.
Rating
Captures a rating value from the user. You can create your own rating scale using the Rating control. The user can either input the Value, Text, or the index (one, two, three).
The Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Yes/No
Use the Yes/No control for any questions to which the users have to answer using Yes or No.
The Data tab contains the options Yes and No.
The Directives tab contains the following fields:
Microphone. Enabled by default. Uncheck to disable it.
Keyboard. Enabled by default. Uncheck to disable it.
WaitTimeAnimation. Enabled by default. Uncheck to disable it.
Wait Time. The time in milliseconds before the next bubble appears.
Custom
This control allows Flow Studio users to invoke custom code defined in Experience Designer. When execution reaches a Flow Studio flow's Custom control, Orbita directs control to the third pin of the Flow Manager node in Experience Designer. (See the Flow Manager Node & Experience Designer section below for more information.)
When you drag the Custom control to the Flow Studio canvas, you can see a name field and the control's ID. In Experience Designer, developers can write code that executes when the Custom control is reached.
In the Custom control, note the control's Name; in this case, "bmiCalculator".
By default, the "Hook Data" function node is attached to the Flow Manager as part of the base project.
The "Settings / Hooks Initialize" function node is needed if hooks are not already initialized in a global settings function.
Note: If you don't have the Hook Data node or the corresponding nodes:
In Experience Designer, create a new flow.
Click the hamburger menu in the top right corner and navigate to Import > Built-in > Orbita Flows-(BETA) > Flow Manager.
Place the flow on the canvas and deploy the flow.
Settings/Hooks Initialize function (may not be needed if already set in a settings function)
Code Block
// Initialize global settings and the hooks registry used by the Hook Data node.
const settings = global.get("settings") || {};
settings.emptyString = '';
global.set("settings", settings);
const hooks = global.get("hooks");
global.set("hooks", hooks || {});
return msg;
bmiCalculator hook function
Code Block
// Register the 'bmiCalculator' hook; when the Custom control named 'bmiCalculator'
// is reached, the Hook Data node invokes this function, which forwards the msg
// to the next node (the BMI Calculation function below).
global.get("hooks").bmiCalculator = msg => {
    node.send(msg);
};
BMI Calculation function
Code Block
var _ = global.get('lodash');
// Answers collected so far by the flow (assumes controls named 'Weight' and 'Height').
var answers = msg.orbita.session.flowInfo.answerArray;
var weight = _.result(_.find(answers, function (obj) { return obj.name === 'Weight'; }), 'value');
var height = _.result(_.find(answers, function (obj) { return obj.name === 'Height'; }), 'value');
var bmi = calcBMI(weight, height);
// Returning the result in msg.payload.externalHook.data logs it as the control's result.
msg.payload.externalHook.data = bmi;
return msg;

function calcBMI(weight, height) {
    return parseInt((weight * 703) / (height * height));
}
The BMI Calculation function sets msg.payload.externalHook.data with the calculated BMI; with return msg, the msg object is passed back to the Flow Manager. Although you can pass data back anywhere on the msg object, returning it in msg.payload.externalHook.data causes the data to be logged in the answers array as the control's result.
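As a hedged illustration, a later function node could read this logged value back from the answers array. The assumption here is that the result appears in answerArray under the Custom control's name ('bmiCalculator'), in the same name/value shape as the Weight and Height answers above; verify this against your own flow data.
Code Block
var _ = global.get('lodash');
var answers = msg.orbita.session.flowInfo.answerArray;
// Assumption: the Custom control result is stored as { name: 'bmiCalculator', value: <bmi> }.
var bmi = _.result(_.find(answers, function (obj) { return obj.name === 'bmiCalculator'; }), 'value');
node.warn('BMI logged by the Custom control: ' + bmi);
return msg;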
Hook Data function (for reference only; it should already be included in your project)
Code Block
try {
    const _ = global.get('lodash');
    // Identify the Custom control that triggered this pin and resolve the hook name.
    // A control name such as "myHook: arg1 | arg2" passes arguments to the hook.
    const controlId = msg.payload.session.attributes.orbitaSession.flowInfo.controlId,
        controlName = msg.payload.session.attributes.orbitaSession.flowInfo.controlName,
        hookName = controlName.indexOf(":") > -1 ? controlName.substr(0, controlName.indexOf(":")) : controlName,
        args = controlName.indexOf(":") > -1 ?
            [msg,
                ...controlName.substr(controlName.indexOf(":") + 1).split("|")
                    .map(arg => arg.trim())
                    .filter(arg => !!arg)
                    .map(arg => arg === "null" ? null : arg)] :
            [msg];
    msg.payload.externalHook = {
        data: hookName
    };
    if (hookName) {
        const hook = global.get("hooks")[hookName];
        msg.hookResult = {
            hookFound: !!hook
        };
        if (hook) {
            // The hook is responsible for sending the msg onward (for example, via node.send).
            hook.apply(null, args);
        } else {
            msg.hookResult.executionSuccess = false;
            msg.hookResult.error = `Could not find hook '${hookName}'`;
            node.warn(`Could not find hook '${hookName}'`);
            return msg;
        }
    }
} catch (error) {
    msg.hookResult = msg.hookResult || {};
    msg.hookResult.executionSuccess = false;
    msg.hookResult.error = error;
    node.warn(`Error executing hook`);
    node.error(error);
    return msg;
}
Evaluate
Acts as a switch command. You can create multiple pins based on the number of outputs you require.
Info: You have access to the entire msg object by selecting custom. You can use {{ mustache tags }} on Custom Input and with Evaluate.
The Evaluate control can evaluate the value of any previous control. The best practice is to name each control; do not use the default name.
The pins refer to each of the switch cases respectively. The Default pin refers to the else case.
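As a conceptual illustration only (the cases are configured in the Evaluate control's UI, not in code), the control behaves like a switch statement in which each case routes to its own pin and the Default pin acts as the else branch. The variable and values below are hypothetical:
Code Block
// Conceptual sketch of the Evaluate control's routing behavior.
const previousControlValue = "Sweet"; // hypothetical stand-in for the evaluated value
switch (previousControlValue) {
    case "Sweet": /* routed to pin 1 */ break;
    case "Sour":  /* routed to pin 2 */ break;
    default:      /* routed to the Default pin */ break;
}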
Custom input field
Customizes the input field so you can reference any value from the msg object. You can create a condition on that value and route it to the appropriate pins.
Link flow
The link flow control connects two flows in the flow studio. It also gives you an option to jump to the exact control node in the target flow.
Name. Give a name to the control.
Jump to Flow. The flow names in the project are listed here.
Select Control. All the control ids of the target flow are listed here.
Agent
This control enables the VUX designer to use any agent in Flow Studio flows. For now, only Knowledge Answers is available in the Select Agent dropdown.
Select Agent: Select the agent you need from the dropdown. For now, only Knowledge Answers is available.
Select Answers: Select the kgraph from the current project list.
Select Topics: Search for and select a topic from the selected kgraph. Mustache tags are also supported in the topic selection field.
Select Relationship: Search for and select a relationship from the selected kgraph. Mustache tags are also supported in the relationship selection field.
Select Article: Select a specific multiscreen field from the kgraph schema, for schemas other than the default.
Success: This pin is triggered if a fact is found for the selected Topic and Relationship.
No Response: This pin is triggered if the topic and relationship pair don't have a fact.
Annotation
This control is used for labeling.
Flow Manager Node & Experience Designer
You must set up the Experience Designer to invoke Flow Studio Flows using the Flow Manager node. This is a one-time setup.
The following image shows a flow in Experience Designer that uses the flow from Flow Studio. The Flow Manager node calls the Flow Studio flow shown in the previous image. A Hook Data node is used to call hook event functions.
Repeat
AMAZON.RepeatIntent
Causes the current question to be repeated. This is useful if you want to interrupt a question being asked by the Flow Manager, for example when the user asks a question that Orbita Answers can handle. Pass the answer back within the session and set msg.payload.request.intent.name = 'AMAZON.RepeatIntent', and the question that was being asked will be repeated.
Answers Flow function node:
Code Block
// Inject the Answers content into the session so the Flow Manager Say node can render it,
// then trigger a repeat of the interrupted question.
msg.orbita.session.injectedMessagechatText = msg.payload.orbita.answers.articleContent.faqInfo.chat.chatText + '<hr />';
msg.orbita.session.injectedMessageVoice = msg.payload.orbita.answers.articleContent.faqInfo.voice.sayText + '<hr />';
msg.payload.request.intent.name = 'AMAZON.RepeatIntent';
return msg; // pass the modified msg on to the Flow Manager
Flow Manager Say node
Code Block
{{msg.orbita.session.injectedMessageVoice}}
{{msg.payload.multiagent.voice.sayText}}
Previous
AMAZON.PreviousIntent
Causes the Flow Manager to back up to the previous question.
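A minimal sketch, mirroring the Repeat example above: a function node can trigger this behavior programmatically by setting the intent name before the msg reaches the Flow Manager.
Code Block
// Hypothetical function node: step back to the previous question.
msg.payload.request.intent.name = 'AMAZON.PreviousIntent';
return msg;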
To get started, download this JSON file, copy it to your clipboard, then import it into your project's Experience Designer using the hamburger menu in the top right corner.
To configure Experience Designer to handle your Flow Studio Flow:
Copy the Flow ID from Flow Studio.
Paste the Flow Id in the Edit Flow Manager node settings tab.
Best practice: Use {{msg.payload.flowId}} (that is, a mustache tag) so you can dynamically load the flow; see the sketch after these steps. In most cases, you have only one Flow Manager in a project, but there is no restriction on the number of Flow Managers.
Enable No Save to not save data to the Orbita database. Disable No Save to send the data to the Orbita database. By default, Orbita saves the flow result data.
To access the data that is saved to Orbita:
Go to the Flow Studio list.
Click the vertical ellipsis button next to the preferred flow.
Click Flow Data. The saved flow result data for this flow appears.
In the Options (Single) field, specify the template to read out the choices to the user.
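A minimal sketch of the best practice mentioned above, assuming a function node placed before the Flow Manager node: it sets msg.payload.flowId so that the {{msg.payload.flowId}} mustache tag in the Flow Manager settings resolves dynamically. The flow ID below is the example ID used later in this article.
Code Block
// Hypothetical function node: choose which Flow Studio flow to run.
msg.payload.flowId = "5cbabd63d4561b6100a10ec6"; // example Flow Studio flow ID
return msg;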
Programmatically jumping to anywhere in the flow
When you do not want to start at the beginning, you can jump to any control in the flow, or jump back in after an external event that captured more information.
Function node
Code Block
// Jump directly to a specific control by setting the flow state in the session.
msg.orbita.session.flowInfo = {
    "flowId": "5cbabd63d4561b6100a10ec6",
    "controlId": "602191193647793721",
    "state": "IN_PROGRESS",
    "messageArray": [],
    "answerArray": [],
    "checkboxPntr": 0,
    "multiOptionMode": [],
    "items": {}
};
return msg;
flowId. The ID you find when you open a flow in the Flow Studio.
controlId. The ID you find on any control in a flow; use the ID of the control you want the flow to start from as it receives input from the intent.
See the following video.
The same approach could be used to restore a previous session or return to the original calling flow.
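For example, before switching from flow A to flow B, a function node could save the current flow state so the restore snippet below can bring it back. This is a hedged sketch; msg.orbita.session.saveFlowInfo is simply a holding property that matches the restore code that follows.
Code Block
// Save the current flow state (flow A) before jumping to flow B.
msg.orbita.session.saveFlowInfo = msg.orbita.session.flowInfo;
// Then point msg.orbita.session.flowInfo at flow B's flowId and controlId, as shown above.
return msg;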
Code Block
// Control ID (this case sits inside a larger switch statement in the function node)
case "520495394870413630": // return back to flow A from flow B
    msg.orbita.session.flowInfo = msg.orbita.session.saveFlowInfo;
Sample Flow
The following image shows a sample Flow studio flow.