June 7, 2018 6:03 pm
Remember the early '90s, those phone calls where a computer tried to understand your voice and played back some taped response ("…press 0 if you want to speak to someone")? You could easily distinguish a computer from a human, right?
Remember when you typed a request and the chatbot didn't understand you because you didn't phrase it exactly the way the chatbot was programmed? In those days artificial intelligence was mostly decision trees: if-else statements.
If the user asked "What's the weather today?", it either returned the weather information or it replied "I'm sorry, I can't understand."
This is exactly what can make or break a chatbot and ultimately the experience. So how do we fix this?
Get ready for the Dutch Google Assistant
In May 2016, Google announced the Google Assistant, a personal assistant available on multiple platforms to help you with your daily tasks wherever you are. The Google Assistant is built into Android devices (phones, tablets, wearables) and available for iPhones and iPads through apps.
Google also released a voice-activated speaker, which can help you control your smart home (lights, sensors, cameras), play music, and handle tasks like setting alarms and reminders, making calls, or sending messages.
In June this year, the Google Assistant will be launched in Dutch: first on Android devices, then later in the year on voice-activated speakers.
Many large organizations are getting ready to implement their own custom actions (apps) on the Google Assistant and to have their use cases available through voice.
Think of internal business-to-employee agents, public-facing chatbots for customer service or sales departments, or controlling IoT devices.
With the end-to-end development suite Dialogflow it is possible to build conversational interfaces for websites, mobile applications, popular messaging or social media platforms, IoT devices and voice activated speakers/assistants (like Google Assistant).
This is all powered by machine learning (Natural Language Processing and Speech Recognition) that helps recognize the intent and context of what a user says. This enables your conversational UI to provide highly efficient and accurate responses.
Executing conversational actions with APIs
The power of conversational UIs is not just to understand what you are saying and to have a conversation that feels real. The actual value comes from the ability to execute your questions and commands. These can be anything from getting the weather forecast to turning on the lights in your living room, or checking what your calendar looks like for today.
Whatever it is, APIs are the basis for connecting this to the conversational UI. APIs bring meaning and content to the conversation by having access to data, services and IoT devices. They connect the talk to the walk.
Dialogflow can make an API call once it has collected and understood all the data it needs from the conversation. You can configure Dialogflow to call the weather service API on the web directly, or the API of your Hue lamp.
But, by introducing an API management platform like Apigee you can make it a whole lot easier and more secure.
Apigee lets you configure the API that Dialogflow needs for its so-called webhook: a POST call that accepts all data from the conversation and expects a text response containing the reply that will be spoken to the user.
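As a rough illustration, a webhook handler receives the matched intent and its parameters and returns the spoken reply. The sketch below assumes the Dialogflow v2 webhook field names (queryResult, fulfillmentText); the "GetWeather" intent and "city" parameter are hypothetical examples, not part of any real agent.

```python
import json

def handle_webhook(request_body: str) -> str:
    """Handle a Dialogflow webhook POST: read the matched intent and
    parameters, and return the JSON body with the reply that will be
    spoken to the user (Dialogflow v2 field names assumed)."""
    request = json.loads(request_body)
    intent = request["queryResult"]["intent"]["displayName"]
    params = request["queryResult"].get("parameters", {})

    if intent == "GetWeather":
        city = params.get("city", "Amsterdam")
        # In a real action, this is where you would call the weather API.
        reply = f"The forecast for {city} is sunny."
    else:
        reply = "Sorry, I can't help with that yet."

    # Dialogflow expects the spoken reply in the fulfillmentText field.
    return json.dumps({"fulfillmentText": reply})
```

In production this function would sit behind an HTTPS endpoint that Dialogflow calls as its fulfillment webhook.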
Your existing API on an IoT device or a public service probably does not match this format, so you'll need some sort of transformation. Apigee helps with this, making it easy to connect any API to your Dialogflow project without hassle.
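Such a transformation boils down to reshaping the backend's payload into the reply format the webhook expects. A minimal sketch, assuming a hypothetical weather backend that returns temp_c and condition fields (in Apigee this mapping would typically run as a policy inside the API proxy):

```python
import json

def to_webhook_response(backend_json: str) -> str:
    """Transform a (hypothetical) backend weather payload into the
    text-reply format the Dialogflow webhook expects."""
    data = json.loads(backend_json)
    reply = f"It is {data['temp_c']} degrees and {data['condition']} right now."
    return json.dumps({"fulfillmentText": reply})
```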
Apigee can also act as a security layer, accepting an API key or a JWT from Dialogflow and translating it into the credentials the backend service needs, which might be a Basic Authorization header or a different token.
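The idea of that credential translation can be sketched as follows. The API key and backend credentials below are made up for illustration; in Apigee they would live in a key/value map or vault and be handled by built-in policies, not application code:

```python
import base64

# Hypothetical credentials for illustration only.
VALID_API_KEYS = {"dialogflow-key-123"}
BACKEND_USER, BACKEND_PASS = "svc-assistant", "s3cret"

def translate_credentials(incoming_headers: dict) -> dict:
    """Verify the API key sent by Dialogflow and translate it into the
    Basic Authorization header a backend service might expect."""
    if incoming_headers.get("x-api-key") not in VALID_API_KEYS:
        raise PermissionError("unknown API key")
    token = base64.b64encode(f"{BACKEND_USER}:{BACKEND_PASS}".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```

The caller never sees the backend credentials; only the gateway holds them.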
An additional value of the Apigee platform is analytics on the API calls fired by all the conversations: you'll be able to see how many conversations are happening, whether they succeed, and whether the latency is acceptable.
During the Beyond Banking event there will be various activities on building conversational UIs with Dialogflow and the Google Assistant, for example the "Build your own Google Assistant action" Learningthon or sessions about machine learning.
Visit the Google Cloud booth to get access to Dialogflow or play around with the AIY Voice Kit, made of cardboard.
Kevin Bouwmeester, Customer Engineer Apigee
Lee Boonstra, Customer Engineer Google Cloud
Categorised in: News
This post was written by BasDV