Senior Manager, Worldwide Marketing and Operations
The key to unlocking future predictions has always been data. Today, with the ever-increasing storage and computational capabilities of the cloud, combined with the sheer number of devices available to help people harness information, scientists around the world are learning to interpret Mother Nature like never before. The depth of analysis and the types of prediction models that researchers can now create simply would not have been possible just a few years ago, and Microsoft is honored to play a role in some of these scenarios.
One special initiative has been the VENUS-C (Virtual multidisciplinary EnviroNments USing Cloud infrastructures) series of projects, which has brought together the European Commission and a range of universities, research organizations, and companies, including Microsoft, to better understand and leverage the capabilities of the cloud. One of those projects is the VENUS-C Fire application, which, paired with Windows Azure and open source software, has helped firefighters on the Greek island of Lesvos become more aware of fire risks and provide faster, more effective responses.
We recently caught up with Dennis Gannon, Director of Cloud Research Strategy for Microsoft Research, to learn more about VENUS-C and how the projects are embracing both commercial and open source software:
Provide a brief overview of VENUS-C and what it, along with Windows Azure, has been able to do:
VENUS-C was started as a collaboration with the European Commission (EC) about two years ago, and was an attempt to understand the role of cloud computing in the larger research infrastructure. The EC wanted to know what the cloud could add to that infrastructure and how the cloud could interoperate with the EC’s existing setups – something that is extremely important, since the EC is a strong supporter of open source software (OSS). We structured the whole VENUS-C project around a group of pilot opportunities to stress-test ideas about how to make a cloud like Windows Azure interoperate in the most seamless way possible. There are other OSS-based cloud platforms already running in Europe, and this project was really an excellent one for us to start with: to think about what it would actually take for an application to run on Windows Azure, and then how that same application could run on other cloud platforms (and what we would need to make it work).
The Fire Application in particular was an excellent one because it involved a specific activity that was important to a large community of users. In this case, it was the firefighters in Greece, who, in the years prior, had been dealing with a number of devastating forest fires in the region. What the cloud allows us to do is to have mobile and web-based applications where the cloud forms a sort of collection point for many different kinds of data – from various sensors, observations, and simulations that predict the weather or the path of a fire as it evolves. Integrating all of this data and knowledge in one place gives first responders and government policymakers an excellent view of what is really going on, what the threats are, and how people are responding. That’s a really significant change, and you can see the cloud’s potential to be a hub where data is integrated with more global knowledge for all kinds of disasters. This was something that was not really easy to do with existing high performance computing structures, such as what the EC had been using previously.
We developed a tool called the Generic Worker that simplifies the process of taking an application from a desktop, pushing it out to the cloud, and scaling it up to do more analysis and computation. It is already being used widely by our team on a number of VENUS-C projects because we built the Generic Worker tool with a standard interface that others could then implement on other open source platforms. Additionally, we built it so that it conformed to certain data movement standards that are widely accepted by the EC, the Open Grid Forum, other open data standards bodies, and so on. The EC considered all of the VENUS-C projects quite a success, but the fire one got people especially excited by the prospects.
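To make the scale-out idea concrete, the pattern Gannon describes – taking a desktop application and fanning its work out across cloud workers – can be sketched as a simple queue-driven worker loop. This is only an illustration of the general pattern, not the actual VENUS-C Generic Worker interface; all names here (`run_job`, `scale_out`, and so on) are hypothetical.

```python
from queue import Queue
from threading import Thread

def run_job(job):
    """Stand-in for staging inputs, running the application, and
    uploading results; here it just applies the job's function."""
    func, args = job
    return func(*args)

def worker(jobs: Queue, results: list):
    """Pull jobs off the shared queue until a sentinel arrives."""
    while True:
        job = jobs.get()
        if job is None:          # sentinel: no more work
            jobs.task_done()
            break
        results.append(run_job(job))
        jobs.task_done()

def scale_out(job_list, num_workers=4):
    """Process all jobs with a pool of workers; adding workers
    is what 'scaling up' amounts to in this sketch."""
    jobs, results = Queue(), []
    threads = [Thread(target=worker, args=(jobs, results))
               for _ in range(num_workers)]
    for t in threads:
        t.start()
    for job in job_list:
        jobs.put(job)
    for _ in threads:
        jobs.put(None)           # one sentinel per worker
    for t in threads:
        t.join()
    return results

# Example: run a simple "analysis" over several inputs in parallel.
outputs = scale_out([(pow, (2, n)) for n in range(5)])
```

In the real system the queue would be a cloud-hosted service and each worker a cloud virtual machine, but the shape of the pattern – a standard job interface in front, interchangeable execution back ends behind – is what made it portable across platforms.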
Why was the choice made to utilize OSS for this project? What benefits stemmed from the pairing of Microsoft and OSS?
With all of the VENUS-C projects and related applications, we supported this effort with the intention of bringing all of the different research groups together to solve problems – some 20 groups took part in the European projects alone. In total, we had 75 projects worldwide and, in each case, we wanted the research teams to bring what they use on a daily basis, and what’s important to them, up into the cloud. If they needed to use OSS, that was great – we wanted to make sure it ran well in the cloud on Windows Azure. Allowing the community to tell us what is important to them – again, open standards were a guiding principle of VENUS-C – that’s what mattered for us. We wanted to ensure our tools embraced the standards that were widely used by these 20 groups. Ultimately, the teams determined which tools they wanted to use and which were best for them to achieve success, and a combination of proprietary and open source solutions delivered strong results.
What types of projects do you envision in the future between Microsoft and OSS?
Getting the open source tools working well on our cloud, such as Hadoop and Linux images running on Windows Azure, is great to see. In terms of my own work with the research community in the cloud and with Windows Azure, I’m putting more of an emphasis on life science technologies, an area traditionally known for using Linux software. I am encouraging people to try some of the open source genomics software on Windows Azure, and am even funding some of those efforts right now.
Can you see similar applications being used to prevent or combat other threats from natural disasters?
The opportunities are really huge. In terms of natural disasters, the really interesting thing happening right now is that we’re becoming such a sensor-connected world, with sensors on vehicles, GPS devices on things like firefighting equipment and snowplows broadcasting their locations, and more. Now we can take almost any city and completely instrument it, pairing all of that instrumentation with the cloud, which becomes the sink where the data is gathered. Responses to a severe storm in the future can be monitored and understood through a variety of instruments that collect the data, and then there are all the tools we can now use to analyze it. The key is coupling live data with historical data: analytical tools can drive physical simulations, and machine learning techniques trained on past events can establish predictive monitoring. When current conditions in an environment resemble past events, you can quickly channel all of this information through simulation models and conclude that danger may be ahead.