Hello … I’m Steve Clayton and this is the first in a series of posts you’ll see from me spotlighting the trends, ideas, products and people behind the technology as well as our vision for the future of technology at Microsoft. I’m passionate about uncovering the invention, creativity and interesting projects happening around Microsoft, so expect to see me back here soon.
Later this week at the ACM Symposium on User Interface Software and Technology (UIST), Andy Wilson from Microsoft Research will present a new research paper on LightSpace that could revolutionize the way we handle virtual documents and objects. In his presentation, he will show how entire rooms can become computers and physical surfaces can evolve into interactive displays. With LightSpace, you can manipulate virtual objects, moving them from one surface to another simply by touching both surfaces at once. You can even “pick up” a virtual object, walk to the other side of the room, and then place it on another screen or surface. As often happens with technology like this, it’s much easier to see it than to imagine it, so check out the video below.
If you’re like me, you’re already thinking about what this could mean in the real world. Imagine boardroom Post-It note exercises becoming completely digital, or in classrooms, students working in teams around a table and then “carrying” their virtual document to the front of the classroom to present. Fashion editors could piece together a magazine from virtual “mood boards” spanning tables and walls and easily make last-minute changes before their print runs. On the coffee table at home you could be reading a digital book and as you touch images on the page, another surface shows related information and videos. It’s exciting to think about the impact this could potentially have on the way we work and live, and even more exciting to think that this is only one of many advancements that are part of the emerging trend of Natural User Interfaces, or NUI.
NUI employs touch, face and voice recognition, movement sensors, our location, context and even our mood to deliver more natural, human interfaces. You may have seen Kinect, which uses depth-sensing cameras similar to LightSpace to enable gesture-driven gaming where you literally are the controller. There are several other NUI products on the market today, but those are really just the tip of the iceberg when it comes to what’s possible. As researchers and developers continue to make advancements in this area, we’ll begin to see products and ideas we never thought possible. For example, another idea shown at UIST comes from Florian Müller, a Microsoft Research Fellowship winner affiliated with the University of Melbourne: “Jogging Over a Distance.” It uses spatial audio to let joggers in Europe and Australia virtually run together in open terrain, giving you the feeling that your running partner is right next to you, or slightly behind or ahead, depending on their real-time heart rate.
NUI is so much more than touch: it removes mental and physical barriers to enable a whole new way of experiencing technology. Stay tuned, as we have lots more coming. You can find more stories like this on the Microsoft News Center’s Next in Tech Newsroom.