Learn How Microsoft Researchers Are Using Wearable Technology to Read Your Moods


Posted by Kelly Berschauer

Mary Czerwinski (left) and Asta Roseway 

You can feel the stress building—you’re on deadline, your computer has stalled to a standstill, you’re pounding keys in frustration, and your blood is boiling. You’re about to explode.
 
And at that exact moment, your computer tells you to take a deep breath and go for a walk.
 
Thanks to a team of researchers in Microsoft Research's VIBE group (Visualization and Interaction for Business and Entertainment), the technology that would make that intervention possible is a work in progress, drawing on human-computer interaction and clinical psychology. Three years ago, the team began working in the area of affective computing: designing systems, some including wearable computing devices, that attempt to identify your mood and react accordingly, helping you reflect on your own state.
 
On Nov. 20, Mary Czerwinski, principal researcher in the VIBE group, will deliver the closing keynote of the AMIA 2013 Annual Symposium, being held in Washington, D.C., where she’ll share her team’s innovative research to advance the field of affective computing with the health community.

Intrigued? Can't be there in person and want to know more? We did too, so we turned to Channel 9, where the latest installment of the Microsoft Research Luminaries video series shines a spotlight on two of the researchers making affective computing a reality: Czerwinski and Asta Roseway, principal research designer.
 
A key tenet of the team’s work is understanding and aiding emotional health to improve the quality of life. Czerwinski says: “Our research goes beyond traditional fitness. It’s about emotional fitness.”

There are all kinds of ways a system could detect what you're feeling, such as using sensors that monitor your facial features, how quickly you are typing, the intensity of each keystroke, or the stress in your voice. Machine learning and data analytics could then tie all this data together to predict accurately how you are feeling.
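To make that idea concrete, here is a minimal illustrative sketch of how several normalized sensor signals might be fused into a single stress estimate. The feature names, weights, and logistic model are hypothetical placeholders, not details from the Microsoft research; a real system would learn its weights from labeled sessions.

```python
# Sketch: fuse normalized sensor features (0..1) into a stress score
# via a weighted sum squashed through a logistic function.
# All names and weights below are illustrative assumptions.
import math

WEIGHTS = {
    "typing_speed": 0.8,      # keystrokes/sec above personal baseline
    "keystroke_force": 1.2,   # normalized key-press intensity
    "voice_tension": 1.5,     # normalized vocal-stress measure
    "brow_furrow": 1.0,       # normalized facial-feature signal
}
BIAS = -2.0  # shifts where the model considers you "neutral"

def stress_probability(features):
    """Combine sensor features into a 0..1 stress probability."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

calm = {"typing_speed": 0.1, "keystroke_force": 0.2,
        "voice_tension": 0.1, "brow_furrow": 0.0}
frazzled = {"typing_speed": 0.9, "keystroke_force": 0.9,
            "voice_tension": 0.8, "brow_furrow": 0.9}

if stress_probability(frazzled) > 0.7:
    print("Time to take a deep breath and a short walk.")
```

In practice the interesting work is in the feature extraction and in learning the weights per user, since a typing pace that signals stress for one person may be routine for another.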

Your computer may not be able to read you—yet—but research in affective computing could make that a reality soon.
 
Be sure to tune into Channel 9 to hear more about the novel ways this research is extending the boundaries of affective computing.

Comments
  • Your work is much needed and there is no doubt in my mind that you will succeed.

  • Enjoyed very much your study and exploration of supporting emotional balance in our world. Congratulations on your research!

  • I'm a real novice. However, the concept is amazing and I'm sure doable. I wish you a successful exploration, which in turn will help us understand our feelings.

  • zeiss glass lens in sonys new wearable. zeiss can set place fit locate sensors transmitters micromotors go nano. sony can customize the wearable go fibre go mouldable Lithium-Ion polymer batteries go wireless power or wireless energy transmission. ask ed meitner if sony will allow sharing of his brain go dsd converters go frequencies go filters. positronic brain is far off.