Posted by Rob Knies
Steve Hodges and his colleagues in the Sensors and Devices group at Microsoft Research Cambridge spend their time pursuing novel sensing technologies and new devices that make it easier for people to interact with computer systems and digital content.
The team’s successes have been many, and among the most notable have been SenseCam—a wearable camera that takes photos automatically, thereby enabling users to review a series of snapshots and recall events as they transpired—and .NET Gadgeteer, a rapid prototyping platform for small electronic gadgets and embedded hardware devices.
Now, these creative researchers have unveiled their latest concept in a note titled "An Interactive Belt-worn Badge with a Retractable String-based Input Mechanism," presented during the Association for Computing Machinery's 2013 SIGCHI Conference on Human Factors in Computing Systems, being held in Paris through May 2.
The note was co-written by Norman Pohl of the University of Stuttgart; Hodges and his Microsoft Research Cambridge colleagues Nicolas Villar and John Helmes; and Tim Paek of Microsoft Research Redmond.
“We interact with digital content more and more—such as electronic diaries, emails, traffic status, and weather info,” says Hodges, principal hardware engineer at the Cambridge lab. “But even if you have your mobile phone in your pocket, it can be a pain to interact with this content in many cases.
“The badge is always on hand and lets you navigate to the content you want simply by moving it to the right place relative to your body, using your spatial muscle memory. It’s much easier for quick ‘snacking’ on small amounts of digital content.”
The lightweight, interactive badge prototype includes an embedded LCD that presents dynamic information to the wearer. Sensing-based input capabilities are built into the badge’s retractable string, enabling single-hand interaction.
“When you pull the badge away from your body,” Hodges explains, “the sensor detects how far the badge is pulled out and at what angle, enabling the system to know where the badge is in relation to your body. Depending on this location, it displays different content. If the content to be displayed is too big to fit on the badge display, it’s possible to pan around it.”
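The interaction Hodges describes can be thought of as mapping the string's extension length and pull angle to body-relative content zones. The sketch below is purely illustrative and not the authors' implementation; the zone names, boundaries, and units are hypothetical assumptions chosen to make the idea concrete.

```python
# Illustrative sketch only: map the badge string's extension (cm) and pull
# angle (degrees, 0 = straight out from the body) to a content zone.
# Zone names and boundaries are hypothetical, not from the research note.
ZONES = {
    "calendar": {"angle": (-90, -30), "length": (10, 40)},  # pulled toward upper left
    "weather":  {"angle": (-30, 30),  "length": (10, 40)},  # pulled straight out
    "email":    {"angle": (30, 90),   "length": (10, 40)},  # pulled toward upper right
}

def zone_for(length_cm, angle_deg):
    """Return the content zone for a given string extension, or None when
    the badge is retracted or held outside any defined zone."""
    for name, z in ZONES.items():
        lo_a, hi_a = z["angle"]
        lo_l, hi_l = z["length"]
        if lo_a <= angle_deg < hi_a and lo_l <= length_cm <= hi_l:
            return name
    return None
```

With a layout like this, pulling the badge straight out by 20 cm would select the "weather" zone, while leaving it near the body (under 10 cm here) selects nothing; panning within a zone would then be driven by smaller movements of the badge.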
Paek, in particular, served as a catalyst for moving the research project to where it is today.
“Our work on augmented-reality systems—like HoloDesk and the mobile projector,” Hodges says, “got us thinking about a lightweight display which could sense its location relative to the body and which could act like a ‘lens’ onto virtual digital content.
“But it was a conversation with Tim that actually spurred us to turn all these thoughts into a full-fledged research project, when we realized that his vision of displaying automatically mined context information on a low-power, wearable device like a badge was similar to ours.”
Of course, the exploration of a new, intriguing research project has its own attractions.
“This research still has many open questions on design, form factor, and content to be explored, but the challenge of working on technology that has promising, unexplored potential is what makes this exciting,” Hodges says. “So many people already wear a badge on a regular basis—in offices, hospitals, or schools and universities—yet those badges are currently just pieces of plastic with static images on them.
“Let’s turn them into interactive devices!”