Posted by Rob Knies
Three months ago, a post appeared in this space about a SIGGRAPH 2013 paper called Dynamic Hair Manipulation in Images and Videos, written in part by Lvdi Wang of Microsoft Research, along with a few academic collaborators. It was fun, fascinating work that enabled the creation of a 3-D hair model from a single photograph with just a bit of user interaction.

In reality, this seemingly lighthearted project rested on some extremely challenging graphics manipulations required to produce lifelike results. But it was hard to get past the project’s ability to let users take photos of themselves or others and try different, occasionally whimsical, hairstyles on for size.
On Oct. 30, during Innovation Day 2013 at Microsoft Research Asia, Wang and colleague Fan Yang took things a step further.
Their demo, Digital Barber: 3D Hair Manipulation on Mobile Phone, extends what had previously been achieved to a mobile platform, but, more importantly, it now renders the results in three dimensions, giving the project much more real-world applicability.
“We have been thinking about how we could use the hair-model technology to give users a more interesting experience,” Yang says. “For this application, the basic idea is that the users could change their hair with a single click, as before, but the cool thing in our technology is that all the hair scenes are 3-D.
“Now, we supply the user with a real 3-D experience, in which you can turn a head, do some animations, and cut your hair with some simple interactions.”
Haircutting can be an art, as is understood by any of us who have emerged from a barber shop or a hair salon a bit underwhelmed with the results. It’s not easy, and neither is rendering digital hair accurately and artfully.
“Humans are really sensitive about their faces and what they look like,” observes Yang, who has spent his share of time as a digital barber. “That means that if there are any small artifacts in the results, people will feel bad. They won’t like that. For example, many times in photos, people’s hair will cover their eyebrows, and in that case, it’s hard to reconstruct the eyebrows. We might need to change long-hair styles to short-hair styles. The hair might be really close to the eyebrows, or even the eyes. It’s hard to get a clean forehead.

“We need to use some technologies to detect the positions of the eyes and eyebrows. We might need to create some fake eyebrows but make them look like the real eyebrows, make the left side and the right side look accurate—all these small tricks. People really, really care about their own appearance, so it’s hard to reconstruct all this.”
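The eyebrow trick Yang describes—reconstructing a hidden eyebrow by mirroring the visible one across the face’s axis of symmetry—can be sketched in a few lines. This is not the team’s actual code; it is a minimal, hypothetical illustration assuming a grayscale image as a NumPy array, a bounding box around the visible left eyebrow, and a known vertical symmetry axis (in practice these would come from a facial-landmark detector).

```python
import numpy as np

def mirror_eyebrow(image, left_box, face_axis_x):
    """Copy the visible left-eyebrow patch, flip it horizontally,
    and paste it at the mirrored position across the face axis.

    image       -- 2-D grayscale array (illustrative; real pipelines use color)
    left_box    -- (top, bottom, left, right) of the visible eyebrow
    face_axis_x -- x-coordinate of the face's vertical symmetry axis
    """
    top, bottom, left, right = left_box
    patch = image[top:bottom, left:right]
    flipped = patch[:, ::-1]          # horizontal flip of the eyebrow patch
    # Reflect the box across the vertical axis: x -> 2*axis - x
    new_left = 2 * face_axis_x - right
    new_right = 2 * face_axis_x - left
    out = image.copy()
    out[top:bottom, new_left:new_right] = flipped
    return out
```

A real system would also blend the pasted patch into the surrounding skin and warp it to the detected brow curve, but the reflection step above is the geometric core of the idea.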
Still, despite the challenges, the work is rewarding—and could prove a harbinger of the future in the coiffure craft.
“Hair changing itself is really popular in the marketplace,” says Yang, noting that there are other apps that purport to enable users to try out alternative ’dos. “Sometimes, people just go to the barber and show the results to the barber and say: ‘Hey, I like this hairstyle. I want it cut like this.’

“For 2-D images, it’s hard to get results that reflect what you wanted. But with our 3-D results, we have your 3-D head model and 3-D hair model, which means that all this technology, all the geometries, are real. When you’re using our application and select a hairstyle, it’s really good. You can just tell the barber, ‘Cut this for me,’ and he can do that.”