6 Minutes to Learn How to Get a Cloud-Based IoT Solution Running!

This post is by Santosh Balasubramanian, a Program Manager in the Microsoft Azure Stream Analytics team

Have you ever wondered how to connect cheap devices like Arduino boards or Raspberry Pis with off-the-shelf sensors (which measure such things as temperature, light, motion, sound, etc.) to create your own monitoring solutions? Have you assumed that it might get pretty complicated or overwhelming to efficiently collect such sensor data, analyze it, and then visualize it on dashboards or set up notifications?

Well, it’s time to think again.

This problem – collecting, analyzing and acting on high volumes of IoT or sensor data in real time – is relevant not just for the hobbyists among us, but is also of critical importance to many large enterprises involved in myriad activities such as manufacturing, energy-efficient buildings, smart meters, connected cars and more.

In the short 6-minute video below, you will see how simple we have made it to connect sensor data to the cloud and run sophisticated data analytics on it, using a set of Microsoft Azure services including Event Hubs, Stream Analytics and Machine Learning.

At the end of it, you will have created custom “live” dashboards and notifications on data emitted by a weather shield sensor.

And, best of all, you don’t have to be a data scientist to do any of this.

Getting Data into Azure and Performing Analytics on it

We started building the IoT solution using Azure Event Hubs, a highly scalable publish-subscribe ingestor. It can take in millions of events per second, so you can process and analyze massive amounts of data produced by your connected devices or applications. Code running on the Arduino boards and Raspberry Pis takes sensor readings and streams them in real time to Event Hubs. Once this is done, you are ready to create live dashboards and view your current sensor data, such as the temperature and humidity charts shown in the video.
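To make the device side concrete, here is a minimal Python sketch of packaging one sensor reading as a JSON event of the kind you would stream to Event Hubs. The field names (`deviceId`, `temperature`, `humidity`) are illustrative assumptions, not the schema used in the sample, and the commented-out send step shows the shape of today's `azure-eventhub` SDK rather than the exact code running on the boards.

```python
import json
from datetime import datetime, timezone

def make_event(device_id: str, temperature: float, humidity: float) -> str:
    """Serialize one sensor reading as a JSON event payload."""
    reading = {
        "deviceId": device_id,
        "temperature": temperature,
        "humidity": humidity,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(reading)

# Sending it with the azure-eventhub SDK would look roughly like this
# (requires a real connection string, so it is left commented out here):
#   from azure.eventhub import EventHubProducerClient, EventData
#   producer = EventHubProducerClient.from_connection_string(
#       CONN_STR, eventhub_name="sensors")
#   batch = producer.create_batch()
#   batch.add(EventData(make_event("rpi-01", 72.5, 0.41)))
#   producer.send_batch(batch)

print(make_event("rpi-01", 72.5, 0.41))
```

Each device emits one such small JSON event per reading; Event Hubs handles the fan-in at scale.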

Now say you have thousands of temperature sensors – in a large building, for instance – and, rather than seeing each sensor's data individually, you wish to see aggregated information such as the average, maximum or minimum temperature for the building each hour. To do this, you can use Azure Stream Analytics, our fully managed stream processing solution, which seamlessly connects to Event Hubs and allows you to write stream processing logic in a SQL-like language. It includes several temporal functions such as TumblingWindow, SlidingWindow and HoppingWindow, and lets you join multiple streams, detect patterns and build out your stream processing logic. It provides enterprise-grade SLAs and makes it easy to scale your resources up or down based on incoming throughput. You can create the cheapest stream-processing jobs for as little as $25/month (currently at half that price while the service is in public preview). With Azure Stream Analytics, there is no writing or debugging of complex temporal logic in Java or .NET – if you know SQL, you are ready.
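To see what a tumbling window actually computes, here is a toy Python sketch (a stand-in for illustration, not Stream Analytics itself) of an hourly TumblingWindow average: readings fall into fixed, non-overlapping one-hour buckets, and each bucket yields one aggregate. In the Stream Analytics language the equivalent grouping is roughly `GROUP BY TumblingWindow(hour, 1)`.

```python
from collections import defaultdict

def tumbling_avg(readings, window_seconds=3600):
    """Group (timestamp_seconds, value) pairs into fixed, non-overlapping
    windows and return the average per window start time -- the semantics
    of a TumblingWindow aggregate."""
    buckets = defaultdict(list)
    for ts, value in readings:
        buckets[ts // window_seconds].append(value)
    return {w * window_seconds: sum(v) / len(v)
            for w, v in sorted(buckets.items())}

# Two readings in hour 0, two in hour 1:
readings = [(0, 70.0), (1800, 74.0), (3600, 80.0), (5400, 82.0)]
print(tumbling_avg(readings))  # {0: 72.0, 3600: 81.0}
```

Stream Analytics evaluates this continuously over the live stream, emitting one result per window as it closes.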

Real-Time Notifications and Alerts

Once you see live or aggregated data in your dashboards, you will likely want to set up rules or conditions under which you get notified about issues in real time. For this, you can set up alert thresholds in Azure Stream Analytics. These alerts can range from simple, such as “show me alerts when the temperature is over 79 degrees,” to complex, such as “alert me when the average humidity in the last second is 20% greater than the average humidity in the previous 5 seconds.”
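That second, more complex rule can be sketched in plain Python to show exactly what it compares (this is an illustration of the condition, not the actual Stream Analytics query from the demo):

```python
def humidity_alert(last_second, previous_five_seconds, threshold=0.20):
    """Fire when the average humidity over the last second exceeds the
    average over the previous 5 seconds by more than `threshold` (20%)."""
    recent_avg = sum(last_second) / len(last_second)
    baseline_avg = sum(previous_five_seconds) / len(previous_five_seconds)
    return recent_avg > baseline_avg * (1 + threshold)

# Recent average 0.56 vs. baseline 0.42: more than 20% higher -> alert.
print(humidity_alert([0.55, 0.57], [0.40, 0.42, 0.41, 0.43, 0.44]))  # True
# Recent average 0.42 is within 20% of the baseline -> no alert.
print(humidity_alert([0.42], [0.40, 0.42, 0.41, 0.43, 0.44]))        # False
```

In Stream Analytics, the two averages would come from two windowed aggregates over the same stream rather than from in-memory lists.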

As a system grows more complex and involves an ever-increasing number and variety of sensors, alert thresholds often need to be adjusted periodically. This is usually a manual and cumbersome process. Hence the need for systems that recognize “normal” data patterns as opposed to “outlier” situations where something unusual may be happening. Better yet, such a system should teach itself to recognize anomalous patterns, so that rules need not be manually or continuously adjusted. This is where Azure Machine Learning comes in. In our example, we use an Azure ML model to detect anomalies and raise alerts – specifically, a pre-existing Anomaly Detection API available from the Azure Marketplace. Our streaming sensor data is sent to this model, where anomalies are detected in real time and displayed in our alerts. An ML model like this, made available as an API on the Azure Marketplace, allows folks such as myself who are not data scientists to easily consume “best of breed” models, even without knowing their full inner workings.
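The inner workings of the Anomaly Detection API are not shown here, but the core idea – flag readings that deviate sharply from the recent norm, with no hand-tuned thresholds – can be sketched with a simple rolling z-score. This toy Python stand-in is an assumption for illustration, not the Azure ML model itself:

```python
import statistics

def anomalies(values, window=10, z_threshold=3.0):
    """Return indices of readings that lie more than z_threshold standard
    deviations from the mean of the preceding `window` readings."""
    flagged = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent)
        if stdev > 0 and abs(values[i] - mean) / stdev > z_threshold:
            flagged.append(i)
    return flagged

# Ten steady readings around 70 degrees, then a sudden spike to 85:
temps = [70.1, 70.3, 69.9, 70.0, 70.2, 70.1, 69.8, 70.0, 70.2, 70.1, 85.0]
print(anomalies(temps))  # [10]
```

The appeal of the marketplace model is that it learns what “normal” looks like from the data itself, whereas this sketch still hard-codes a window size and z-score cutoff.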

If you wish to create your own weather sensor on a Raspberry Pi, Arduino board, etc., please see the following blog post by Microsoft Open Technologies. There you can download the code and provision your sensors easily. The code not only reads data from the sensors, but also streams it in real time to Event Hubs as described above. You will also get the code for creating Stream Analytics queries and custom real-time dashboards.

I hope you found this post useful.

Subscribe to this blog. Follow us on Twitter.