Build Intelligent Robots with EZ-Robot and Microsoft Cognitive Services

What if a robot could tell how you are feeling?

Microsoft Cognitive Services is a set of APIs and SDKs that enables developers to easily add features such as emotion detection, object identification, and language understanding to their applications. Imagine utilizing that same set of features in a robot.

Enter Calgary-based EZ-Robot, which has created a solution that lets anyone interested in building robots create applications for them using different SDKs. Its core product, EZ-Builder, allows anybody, regardless of programming knowledge, to bring robots to life. The software is available from the Windows Store and can incorporate Microsoft Cognitive Services to add further depth to an EZ-Robot's capabilities.

This post covers the use of three plugins compatible with EZ-Builder that are based on Microsoft Cognitive Services:

- Bing Speech Recognition
- Cognitive Emotion
- Cognitive Vision

To set up any of these plugins, you simply need to install them on your computer and add them to your EZ-Builder project. Once the plugins are in your project, configuring each of them is easy. For example, if you want to use voice commands in your project, you need to provide some parameters using the Config window shown below:

[Screenshot: the voice command plugin's Config window in EZ-Builder]

From the plugin's configuration window you can invoke the Script Editor to define robot actions based on the input. The text recognized from a voice command is stored in the BingSpeech variable, which you can rename at any time. Finally, you have to provide an API key to get access to the Microsoft service; this is the only place where you need to deal with the service directly. Just visit https://azure.microsoft.com/en-us/services/cognitive-services/ and create a free account to generate a key. The plugin does everything else for you automatically, so you can program the robot's behavior as if the robot were receiving text commands rather than voice commands.
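For example, a simple EZ-Script attached to the plugin could branch on the recognized phrase. Treat this as a sketch rather than a ready-made script: it assumes the variable is referenced as $BingSpeech in EZ-Script, and the Forward() and Stop() movement commands assume your project has a movement panel configured:

    # Sketch: react to the text recognized by the speech plugin.
    # $BingSpeech holds the last recognized phrase (renameable in the Config window).
    if ($BingSpeech = "move forward")
      Forward()
    elseif ($BingSpeech = "stop")
      Stop()
    else
      Say("I heard: " + $BingSpeech)
    endif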

The same applies to the Cognitive Emotion and Cognitive Vision plugins: provide an API key, and you can go ahead and start using the plugins simply by working with some predefined variables. In the case of the Emotion plugin, you can use the EmotionDescription and EmotionConfidence variables. In the case of the Vision plugin, you can get access to the image description and tags using the VisionDescription, VisionConfidence, and VisionReadText variables.
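As a rough illustration, a script could combine those variables to make the robot comment on what it detects. This sketch assumes the confidence value is reported on a 0-to-1 scale and that the emotion labels match those returned by the Emotion service (for example, "happiness"); check the plugin's documentation for the exact values:

    # Sketch: respond to the emotion detected by the Cognitive Emotion plugin.
    if ($EmotionConfidence > 0.7)
      if ($EmotionDescription = "happiness")
        Say("You look happy today!")
      elseif ($EmotionDescription = "sadness")
        Say("Why so sad?")
      else
        Say("I see " + $EmotionDescription)
      endif
    endif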

So it's really easy to start using Cognitive Services in your EZ-Robot applications, and I can promise some fun.

If you want to learn more and watch some videos, EZ-Robot has begun publishing its lessons on YouTube, and the videos about the Emotion and Vision services are already available.

Start using these plugins now and add some intelligence to your robots!