More Like Us

Earlier this week you may have seen my post highlighting LightSpace, a project from Microsoft Research that we believe has the potential to advance the idea of Natural User Interfaces (NUI). This week, Craig Mundie, chief research and strategy officer at Microsoft, gave a glimpse of more NUI technologies during speeches to students at Duke University and MIT.

Craig’s demo was a large stereoscopic 3D display that showed some of the possibilities of human-scale computer interaction. In it, he showed what it might be like to interact with your computer when it is bigger than you — like the size of a room. Sadly, unless we send each of you a set of 3D glasses it’s hard to recreate the experience, so you’ll have to trust me on this one when I say it was pretty incredible. Craig was able to walk into a 3D world where he could shop and play games, interacting with people in both the physical and virtual environment in a very natural way. I hope to be able to show you more of the short clip from “The Spy from the 2080s” video that he used, so stay tuned on that one.

As display prices continue to trend downward, it’s conceivable that we’ll see wall- and room-sized displays popping up in homes, offices, doctors’ offices, retail stores, schools and more. They will encourage us to interact with computers in new ways. What happens when the computer no longer fits in the palm of our hand? When tables, walls, mirrors and other objects are connected and have the ability to respond to our voice and touch, or interpret our gestures and motives? We’ll be able to interact with technology in much the same way we interact with each other.

This shift is already underway and we see it in our personal and productivity devices, such as mobile phones, entertainment devices, touch-enabled laptops and the forthcoming Xbox Kinect. But this is just the start. Future interfaces will go beyond touch-based interaction, adding and combining voice, vision, gestures and immersive experiences, and leveraging technology that is contextually aware. Computer systems will understand things like our preferences, the environment we’re in, and what we are trying to achieve. Our devices will know where we are and what nearby technology can help us achieve what we’re trying to do.

This means it will be possible to visit anywhere in the world virtually — to hike the Grand Canyon or shop in Paris on the Champs-Elysées for example — from the comfort of your living room. It’s the start of an exciting journey toward computing that is more like us.

See also: Microsoft’s Next in Tech Newsroom

Posted by Steve Clayton
Microsoft Storyteller