Input components

For inputting text and other data

Inputs are useful for asking your users for information. There are different types of inputs to choose from depending on the information you need. Inputs range in complexity from the simple input_string, to input_pattern, which matches a specific regex pattern, to richer requests such as audio or video inputs.

Input support matrix

Only certain messaging integrations support inputs other than text. See the support matrix below for more detail.

📘

Inputs vs. intents

Most inputs have a corresponding intent with the same behavior and messaging integration compatibility. For example, input_image is paired with the image trigger.

[Image: Input support matrix]

meya.input_string

Optionally outputs text and waits for a response from the user. Matches any string.

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the data for subsequent steps in the flow | Optional. Default: value |
| scope | Where to store the data. One of flow, user, bot. | Optional. Default: flow |
| detect_language | If true, will set flow.language to the language the input was written in. | Optional. Default: false |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

Transitions

next: the default transition for an answer (if not present, the flow will transition to the next state in sequence)

component: meya.input_string
properties:
  text: "What's your middle name?"
  output: middle_name
  scope: user
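
Because the example stores the answer on the user scope under the key middle_name, a later step can read it back with templating. A minimal sketch of such a follow-up state (this extra state is an assumption, not part of the example above):

component: meya.text
properties:
  # Reads the value saved by meya.input_string on the user scope
  text: "Nice to meet you, {{ user.middle_name }}!"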

Example - Language Detection

You can use the input_string component to detect a user's language. Start by creating a new flow and pasting in the following code.

states:
    first:
        component: meya.text
        properties:
            text: "Your language is {{ flow.language }}"

    input_state:
        component: meya.input_string
        properties:
            text: "Enter some text"
            detect_language: true
            output: value
            scope: flow

    greeting:
        component: meya.text
        properties:
            text: "Your language is {{ flow.language }}"

Test it out in the test chat window. Entering "hello" will match the catchall trigger and start the flow. The bot will say "Your language is " since it hasn't detected your language yet. Enter "hello" again and the bot will respond "Your language is en". Clear the messages to start over. This time try entering "bonjour"; the bot will respond "Your language is fr".


meya.input_pattern

Matches a regex pattern for input. Properties are the same as meya.input_string with a few additions.

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| pattern | The regex pattern to match the input to | Required |
| require_match | If false, the flow will return an action no_match that you can use to transition to another state | Optional. Default: true |
| output | The key used to store the data for subsequent steps in the flow | Optional. Default: value |
| scope | Where to store the data. One of flow, user, bot. | Optional. Default: flow |
| error_message | Message sent to user if pattern is not matched | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

Transitions

next: the default transition for an answer (if not present, the flow will transition to the next state in sequence)
no_match: the state to transition to if require_match is false and the user inputs something that doesn't match the pattern.

component: meya.input_pattern
properties:
    text: "What's your API key?"
    pattern: ^(?P<capture_key>[0-9a-f]{32})$
    output: api_key
    require_match: true
    scope: user
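
If you set require_match to false instead, you can route input that doesn't match the pattern through the no_match transition rather than re-prompting with error_message. A minimal sketch, assuming hypothetical state names ask_key, confirm_key, and invalid_key:

states:
    ask_key:
        component: meya.input_pattern
        properties:
            text: "What's your API key?"
            pattern: ^(?P<capture_key>[0-9a-f]{32})$
            output: api_key
            require_match: false
            scope: user
        transitions:
            next: confirm_key
            no_match: invalid_key

    confirm_key:
        component: meya.text
        properties:
            text: "Thanks, your key is saved."

    invalid_key:
        component: meya.text
        properties:
            text: "That doesn't look like a valid API key."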

📘

Regex capture groups

Capture groups are supported with meya.input_pattern. For example, pattern: ^(?P<capture_key>[0-9a-f]{32})$ will make capture_key available on the specified scope. Ex: {{ flow.capture_key }}

If a named capture group does not match, flow.<named_group> will equal the string 'None'.

The entire utterance can be accessed by reading the value of the output property.
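
Continuing the example above (which uses scope: user), both the capture group and the full utterance can be referenced in a later state. A minimal sketch of such a follow-up (this state is an assumption, not part of the original example):

component: meya.text
properties:
    # capture_key comes from the named group; api_key holds the full utterance
    text: "Key: {{ user.capture_key }} (full input: {{ user.api_key }})"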

meya.input_integer

Matches an integer value

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| require_match | If false, the flow will return an action no_match that you can use to transition to another state | Optional. Default: true |
| output | The key used to store the data for subsequent steps in the flow | Optional. Default: value |
| scope | Where to store the data. One of flow, user, bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not an integer | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

Transitions

next: the default transition for an answer (if not present, the flow will transition to the next state in sequence)
no_match: the state to transition to if require_match is false and the user inputs something ambiguous

component: meya.input_integer
properties:
    text: "How old are you?"
    output: age
    require_match: false
    scope: user
transitions:
    next: proper_answer
    no_match: ambiguous_answer
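
Within the same flow, the two target states might simply acknowledge the result. A minimal sketch (the wording of these states is an assumption; only their names come from the example above):

proper_answer:
    component: meya.text
    properties:
        # age was stored on the user scope by meya.input_integer
        text: "Got it, you're {{ user.age }} years old."

ambiguous_answer:
    component: meya.text
    properties:
        text: "Sorry, I need a whole number. How old are you?"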

meya.input_datetime

Matches a "human-readable" date. Some examples that will match: "now", "in 15 minutes", "tomorrow", "next wednesday at 10:30am", "9pm tonight", "October 20, 1978", "Oct. 20 at 12pm". The result is stored in Unix time (seconds).

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output. | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| timezone | Which timezone to use for interpreting times, entered in Olson database format. If not set, the component will attempt to read the timezone from the datastore. | Optional. Default: GMT |
| timezone_scope | The datastore scope to read the timezone from. | Optional. Default: user |
| parser | If V2.0, uses an NLU date parsing library that supports languages other than English. | Optional. Default: V1.0 |
| require_match | If false, the flow will return an action no_match that you can use to transition to another state. | Optional. Default: true |
| output | The key used to store the data for subsequent steps in the flow. | Optional. Default: value |
| scope | Where to store the data. One of flow, user, bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not a date or time. | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

Transitions

next: the default transition for an answer (if not present, the flow will transition to the next state in sequence)
no_match: the state to transition to if require_match is false and the user inputs something ambiguous

📘

A note about timezones

If you don't specify a timezone and there is no timezone set for the user, Meya assumes GMT. Keep this in mind when using the data at a later date. Meya will do its best to populate the user's timezone for you, but we can't guarantee that it's available.

component: meya.input_datetime
properties:
    text: "When is your birthday?"
    output: birthdate
    timezone: Canada/Eastern
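
Since no scope is set, the matched date is stored on the flow scope as a Unix timestamp (seconds). A follow-up state could echo the raw value, for example (a sketch, not part of the original example):

component: meya.text
properties:
    # birthdate holds the parsed date in Unix time (seconds)
    text: "Thanks! I stored your birthday as the timestamp {{ flow.birthdate }}."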

meya.input_image

Gets an image url uploaded by the user.

📘

Supported on Messenger, Telegram, Kik and Smooch.

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the data | Optional. Default: value |
| scope | Where to store the data. One of flow, user, or bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not an image | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

component: meya.input_image
properties:
  text: "Send me a picture of your favorite food!"
  output: food_image
  scope: flow
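
The stored value is the image url, so a later state can reference it directly. A minimal sketch of such a follow-up (an assumption, not part of the example above):

component: meya.text
properties:
  # food_image holds the url of the uploaded picture
  text: "Looks delicious! I saved your photo at {{ flow.food_image }}"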

meya.input_video

Gets a video url uploaded by the user.

📘

Supported on Messenger, Telegram, Kik and Twilio

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output to the user | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the video url | Optional. Default: value |
| scope | Where to store the data. One of flow, user, or bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not a video | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

component: meya.input_video
properties:
  text: "Send a video of your view!"
  output: view_video
  scope: flow

meya.input_audio

Gets an audio url uploaded by the user.

📘

Supported on Messenger, Telegram and Twilio.

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output to the user | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the audio url | Optional. Default: value |
| scope | Where to store the data. One of flow, user, or bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not an audio file | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

component: meya.input_audio
properties:
  text: "Send a recorded message of your order :)"
  output: audio_url
  scope: flow

meya.input_file

Gets a file url uploaded by the user.

📘

Supported on Messenger, Telegram and Twilio.

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output to the user | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the file url | Optional. Default: value |
| scope | Where to store the data. One of flow, user, or bot. | Optional. Default: flow |
| error_message | Message sent to user if input was not a file | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

component: meya.input_file
properties:
  text: "Send a file of the doc"
  output: file_url
  scope: flow

meya.input_location

Gets and stores the location information about the user. Works with a text-based answer (e.g. Sunnyvale, CA) on all integrations, or a location pin 📍 on the integrations listed in the note below.

📘

Support

Location pins 📍 are only supported for Telegram, Messenger and Smooch

| Property | Description | Required |
| --- | --- | --- |
| text | The text to output | Required |
| speech | Text to speak to the user. This field also accepts SSML markup to customize pronunciation. | Optional |
| output | The key used to store the data entered by the user | Default: value |
| confidence | The assumed confidence when matching. | Optional. Default: 0.95 |
| require_match | If false, the flow will return an action no_match that you can use to transition to another state | Optional. Default: true |
| error_message | Message sent to user if input was not a location | Optional. Default: Sorry, I don't understand. Try again. |
| encrypt | If sensitive, will encrypt the input in the bot logs as well as at the /messages API endpoint. More info: Encryption | Optional. Default: None |

The data is stored in the user datastore. You can access it using the keys in the table below.

| Datastore key | Description |
| --- | --- |
| user.lat | The latitude of the location. |
| user.lng | The longitude of the location. |
| user.timezone | The timezone of the location. |
| user.city | The city of the location. |
| user.state | The state of the location. |
| user.country | The country of the location. |

Example of a flow using input_location:

component: meya.input_location
properties:
  text: "Where are you?"
  output: location
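
Once the component has run, the user datastore keys listed above are available in templates. A minimal sketch of a follow-up state (an assumption, not part of the example above):

component: meya.text
properties:
  # city, country, lat and lng are populated by meya.input_location
  text: "Looks like you're in {{ user.city }}, {{ user.country }} ({{ user.lat }}, {{ user.lng }})"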