Microsoft Research shows Holodesk

In the last few months we’ve seen some pretty great applications of natural user interfaces (NUI). It seems fair to say that NUI is becoming more widely used and accepted in its various forms. But not being ones to sit back and relax, the folks at Microsoft Research (MSR) have been toying with a few of the ideas they have up their sleeves.

My favorite is Holodesk, a research project out of the Sensors and Devices group at Microsoft Research Cambridge. I won’t attempt to describe what it does in great detail, except to say that with Holodesk you can manipulate 3-D virtual images with your hands. Whilst this is only a research project at this stage, I can envisage future applications in areas such as board gaming, rapid prototyping, or perhaps even telepresence, where users would share a single 3-D scene viewed from different perspectives. I know it sounds very Star Trek, but this is not science fiction.

For the record, Holodesk isn’t the only 3-D interaction experiment out there. But what sets it apart from the rest is its use of beam-splitters and a graphics processing algorithm, which work together to provide a more life-like experience. The video provides a much better explanation, so I’ll leave it at that.

As if that weren’t enough, MSR this week published several research papers at the User Interface Software and Technology (UIST) conference. All of the papers relate to some aspect of NUI. For example, there’s a project called OmniTouch that uses a pico projector and a 3-D camera to turn virtually any surface into a multi-touch interface. Eventually, OmniTouch could be designed to be no larger than a pendant or a watch.

Another example is PocketTouch, which is focused on creating multi-touch sensors that will allow you to interact with a smartphone or other device through the fabric of your shirt pocket or jeans. Conceivably, you could pause a song or listen to voicemail without fishing out your phone. Both of these projects are part of a larger effort by MSR to explore more unconventional uses of touch interfaces, rather than settling for the status quo.

Both projects have seen plenty of coverage in the last day and you can read more about them, and see videos, on the Microsoft Research site.

Comments
  • I'd love to slow down physics and use it to learn to juggle!

  • Superb... sounds cool... awaiting this tech's commercial use

  • Super innovation...

  • Looking fwd to telepresence!!! Great going MSR!

  • 3D or not 3D? That is the question. Your hands are manipulating the objects in 3D space but is the image itself 3D with separate images being directed at each eye, or is it simulated 3D a la Johnny Chung Lee's famous Wii hack?

    It would be cool to project the image using something like MicroVision's laser picoprojector. The laser light source would allow you to polarize the images and create a true 3D image using nothing but polarized glasses. If you take off the glasses, you simply revert to a non-stereo image.

    Very cool tech. This is the kind of stuff I'd love to work on. My dream job is going from lab to lab or meeting room to meeting room as a consultant solving problems and offering creative ideas.

  • Scratch that. I suspect they're using Microsoft's "Wedge" to create a stereoscopic image. I don't know if this works for multiple viewers but for a single viewer, it's a cool solution.

  • Looks like a patent I saw that Apple got earlier this year.

  • @Rob: I believe it's called Musion Eyeliner. No special projection needed. Just needs to be at least 2k I believe.

  • @rob: This was introduced to the market 4 or 5 years ago as holographic projection for live audiences. It was used for holographically projected live conferences on stage. So yes, the projection is 3-D without goggles. That's also why the projection surface is tilted at a 45-degree angle.

    About MSR: great work, but still a little way to go. When I look at the teapot, it looks like they need at least one more camera to define the opening between the fingers more precisely. But otherwise quite great. I could imagine this being interesting as a UI for nano-surgery robots, once they get rid of the jittering.

  • *40 years later*

    Back in my days, we had to TURN ON our devices... with buttons.