Dialogflow is used for building conversational interfaces. It can analyze text or audio and respond to a human in a natural, chatty way. Dialogflow is an authoring platform, not just an API, but it is possible to manage your chatbot through its API. In most cases, you would create a chatbot in the Dialogflow console and embed it into your telephony or text application using the prebuilt integrations. But there are some use cases where you might use the API instead, for example in app development or on a device.

Before using the API, you should understand some basics of how Dialogflow and chatbots work. The building blocks of any chatbot are intents and entities. Intents are the meaning behind something that the user types or speaks to a chatbot. So if I say, "Aren't you open yet?" and the chatbot replies, "We open at 6," the intent is "get opening times." Of course, there are many different ways you might ask that question, so a chatbot has to be trained by giving it example phrases. You don't have to think of every possible way of framing the question; given enough samples, the machine learning algorithm gets very good at working out the intent. You also need to feed the chatbot its responses: it needs to be told what to say once it understands the intent. Just one sample response is enough, but having several creates more natural conversations.

A Dialogflow conversation takes place in a session. You create the session at the beginning of the conversation and discontinue it when it has ended. Session data is retained for only 20 minutes and is then deleted. As with a human conversation, a Dialogflow chat works in turns: the user types something, Dialogflow detects its intent and returns a response, which in turn may generate another user input.

Contexts help the conversational flow. They are used to pass parameter values from one intent to the next. This is similar to how, in a human conversation, we infer what is meant from what was said previously. When the user asks, "Who wrote it?", a context stores the value from the "get music title" intent and passes it as a parameter to the next intent, DoComposerSearch. This makes the conversation more natural. Contexts last for the duration of a conversation session and then expire.

Entities are objects that your application acts on. The chatbot seeks to extract entities from the intents in the conversation and work out what to do with them. So if I type, "Is it possible to book a table?", the entities might be "booking" and "table type." The actions that an entity initiates might be to ask the user another question, pass an input value to another application, or return an output value to the user. Entities have values known as parameters. You need to train Dialogflow to anticipate all the likely ways a human might reference an entity parameter from within an intent. For example, saying "I want the corner table" might reference an entity "table type." But you might also reference the same entity by saying, "Can you get me a quiet table?" You have to train Dialogflow with all of the synonyms that reference the same entity.

Dialogflow is a complex and powerful authoring tool, but if you understand intents and entities, contexts and parameters, you have made a good start. All of Dialogflow's functionality is accessible through the console, and the most common way to present the chatbot to the end user is through the prebuilt telephony and text integrations with third-party applications, also configurable in the console.
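To make intents, training phrases, and responses concrete, here is a minimal sketch of creating our "get opening times" intent through the API with the Python client library. The project ID and the exact phrases are placeholders for illustration, not values from a real agent:

```python
# A minimal sketch: creating an intent with training phrases and a
# response via the Dialogflow API (Python client, v2).
# "my-project" and the example phrases are hypothetical placeholders.
from google.cloud import dialogflow

intents_client = dialogflow.IntentsClient()
parent = dialogflow.AgentsClient.agent_path("my-project")

# Example phrases train the ML model to recognize the intent.
training_phrases = [
    dialogflow.Intent.TrainingPhrase(
        parts=[dialogflow.Intent.TrainingPhrase.Part(text=text)]
    )
    for text in ["Aren't you open yet?", "What time do you open?"]
]

# One response message is enough; several make conversations more natural.
message = dialogflow.Intent.Message(
    text=dialogflow.Intent.Message.Text(text=["We are open at 6."])
)

intent = dialogflow.Intent(
    display_name="get opening times",
    training_phrases=training_phrases,
    messages=[message],
)
intents_client.create_intent(request={"parent": parent, "intent": intent})
```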
But sometimes you may want to access your chatbot in your own application through the API. If you are not using an integration, you must write code to interact with end users. For each conversational turn, your code calls the Dialogflow API to query your agent. You send the user's message to Dialogflow by calling the detect intent or streaming detect intent methods of the session type. Detect intent is used in conversational, challenge-and-response-style conversations. Streaming detect intent is used for listening and responding concurrently. For example, with streaming detect intent, you could stream audio to Dialogflow and listen for a response at the same time. You could configure this further by having Dialogflow listen for pauses in the audio stream and interrupt the flow of the conversation, just like a chatty human. Whichever method you use, Dialogflow responds with information about the matched intent, the action, the parameters, and the response defined for the intent. Your service then performs actions as needed, for example, database queries or external API calls, and sends a message to the end user. This process continues until the conversation has ended.

A detect intent request can be configured in powerful ways. For example, it can be configured so that Dialogflow recognizes speech: you specify in the detect intent request that the input is audio, and Dialogflow processes the audio and converts it to text before attempting an intent match. And if you need your bot to talk back to the user, it can do that too. This is all done by setting parameters in the JSON body of your detect intent request to the API.

By default, Dialogflow is configured with speech adaptation on, so that your agent will use entities, training phrases, and conversation state as hints to the meaning. But you can extend this further in your detect intent request by providing speech context parameters as phrases with boost values to prefer one meaning over another. If the boost value is positive, Dialogflow will increase the probability that the phrases in this context are recognized over similar-sounding phrases. This might be useful in a contact center use case, where there tends to be highly repetitive and domain-specific use of language.

You can even configure sentiment analysis to detect whether a user is expressing a positive, negative, or neutral opinion. To set this up, you set the analyzeQueryTextSentiment parameter to true, and in the response you get back a positive, negative, or zero sentiment score. This enables a more targeted message to be returned in the agent's response. You can trigger sentiment analysis per detect intent request, or you can configure your agent to always return sentiment analysis results.

In some cases, for example with a web-based FAQ where you might already have all questions and answers in pre-existing documents, recreating them in Dialogflow might not make sense. In this case, you can configure Dialogflow to parse your documents, HTML and CSV, for a suitable response using knowledge connectors. With knowledge connectors enabled, all detect intent requests will try to find an automated response in your knowledge bases. Alternatively, you can specify one or more knowledge bases in individual detect intent requests. To do this using the API, you would specify the knowledge base as a query parameter in your input JSON packet. Knowledge connectors offer less response precision and control than intents, so the user-agent conversation might be a bit stilted.
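Here is what a single conversational turn looks like as a detect intent call with text input, again as a minimal sketch with the Python client library; the project and session IDs are hypothetical placeholders:

```python
# A minimal sketch of one conversational turn: send the user's text to
# Dialogflow and read back the matched intent and its response.
# "my-project" and "session-123" are hypothetical placeholders.
from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "session-123")

text_input = dialogflow.TextInput(text="Aren't you open yet?", language_code="en-US")
query_input = dialogflow.QueryInput(text=text_input)

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input}
)
result = response.query_result
print("Matched intent:", result.intent.display_name)
print("Response:", result.fulfillment_text)
```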
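Streaming detect intent follows the same pattern, except the request is a stream: the first message carries the session and audio configuration, and subsequent messages carry chunks of audio. A sketch, assuming a hypothetical file of raw LINEAR16 audio at 16 kHz:

```python
# A sketch of streaming detect intent: configuration first, then audio
# chunks. "audio.raw" is a hypothetical 16 kHz LINEAR16 audio file.
from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "session-123")

audio_config = dialogflow.InputAudioConfig(
    audio_encoding=dialogflow.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
    language_code="en-US",
    sample_rate_hertz=16000,
)

def request_generator():
    # First message: session and query configuration, no audio yet.
    yield dialogflow.StreamingDetectIntentRequest(
        session=session,
        query_input=dialogflow.QueryInput(audio_config=audio_config),
    )
    # Following messages: raw audio chunks.
    with open("audio.raw", "rb") as f:
        while chunk := f.read(4096):
            yield dialogflow.StreamingDetectIntentRequest(input_audio=chunk)

# Responses arrive while audio is still being sent, which is what lets
# you listen and respond concurrently.
for response in session_client.streaming_detect_intent(requests=request_generator()):
    if response.query_result.intent.display_name:
        print("Matched intent:", response.query_result.intent.display_name)
```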
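Speech contexts with boost values are supplied on an audio request through the input audio configuration. A sketch; the phrases and the boost value of 20.0 are purely illustrative:

```python
# A sketch of detect intent with audio input and a speech context.
# A positive boost makes these phrases more likely to be recognized
# over similar-sounding ones. All names and values are illustrative.
from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "session-123")

audio_config = dialogflow.InputAudioConfig(
    audio_encoding=dialogflow.AudioEncoding.AUDIO_ENCODING_LINEAR_16,
    language_code="en-US",
    sample_rate_hertz=16000,
    speech_contexts=[
        dialogflow.SpeechContext(phrases=["corner table", "quiet table"], boost=20.0)
    ],
)
query_input = dialogflow.QueryInput(audio_config=audio_config)

with open("audio.raw", "rb") as f:  # hypothetical 16 kHz LINEAR16 audio
    audio = f.read()

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input, "input_audio": audio}
)
print("Recognized text:", response.query_result.query_text)
```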
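Per-request sentiment analysis is switched on through the query parameters, and the score comes back on the query result. A sketch:

```python
# A sketch of a detect intent request with sentiment analysis enabled.
# The score runs from -1.0 (negative) through 0 (neutral) to 1.0 (positive).
from google.cloud import dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "session-123")

query_input = dialogflow.QueryInput(
    text=dialogflow.TextInput(text="This is taking far too long", language_code="en-US")
)
query_params = dialogflow.QueryParameters(
    sentiment_analysis_request_config=dialogflow.SentimentAnalysisRequestConfig(
        analyze_query_text_sentiment=True
    )
)

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input, "query_params": query_params}
)
sentiment = response.query_result.sentiment_analysis_result.query_text_sentiment
print("Sentiment score:", sentiment.score)
```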
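Scoping a request to particular knowledge bases also goes through the query parameters. Knowledge connectors are a beta feature, so this sketch uses the v2beta1 client; the knowledge base ID and question are hypothetical placeholders:

```python
# A sketch of a detect intent request scoped to a specific knowledge base.
# Knowledge connectors are beta, hence the v2beta1 client.
# "my-project" and "KB_ID" are hypothetical placeholders.
from google.cloud import dialogflow_v2beta1 as dialogflow

session_client = dialogflow.SessionsClient()
session = session_client.session_path("my-project", "session-123")

knowledge_base_path = dialogflow.KnowledgeBasesClient.knowledge_base_path(
    "my-project", "KB_ID"
)
query_input = dialogflow.QueryInput(
    text=dialogflow.TextInput(text="How do I reset my password?", language_code="en-US")
)
query_params = dialogflow.QueryParameters(knowledge_base_names=[knowledge_base_path])

response = session_client.detect_intent(
    request={"session": session, "query_input": query_input, "query_params": query_params}
)
for answer in response.query_result.knowledge_answers.answers:
    print("Answer:", answer.answer)
```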
It's common for an agent using knowledge connectors to also use defined intents. In general, use intents to handle complex user requests and let knowledge connectors handle simple requests. Otto is a Dialogflow API-driven chatbot developed by Learning Pool for a European travel operator. It combines the open-ended questions of the chatbot with structured questions to a knowledge base to inform staff about corporate brand, culture, and values.