Facial recognition technology is finding its way into an ever-wider range of applications, from urban surveillance cameras to payment processing. Some of the most interesting advances, however, lie in applications that go beyond mere identity verification and attempt to detect and analyze a person's emotions: in other words, sentiment analysis performed on human faces rather than tweets.

For example, the Wall Street Journal recently reported on software developed by a finance professor that aims to detect the emotions of CEOs and, with that information in hand, predict how their companies will perform financially:

Computer programs that scan facial expressions have been used to detect whether people respond positively to commercials or whether hospital patients are in pain. Can they also read a CEO’s mind?

James Cicon thinks they can. A finance professor at the University of Central Missouri, Cicon built software that analyzed video of the faces of Fortune 500 executives for signs of emotions like fear, anger, disgust, and surprise. The emotions, he found, correlated with profit margins, returns on assets, stock price moves, and other measures of performance at the associated companies.

Companies that have developed emotion-recognition software are using it to gauge the attitude of shoppers as they walk into stores and market-research subjects as they watch advertisements. Apple Inc. in a 2014 patent application described a software system that could analyze and identify people’s moods based on variables including facial expression, perspiration, and tone of voice. The company recently acquired Emotient, a startup that makes software purporting to detect emotions in facial imagery.

Analysis: Emoting In Enterprise Apps

Now consider facial recognition in the context of enterprise software, particularly for productivity and collaboration applications. There is vast potential, notes Constellation Research VP and principal analyst Alan Lepofsky.

"The idea would be to provide sentiment analysis to an employee's mood," he says. "Are they frustrated, taxed or tired?"

Take email, for instance. "If my camera on my laptop can inform my inbox that I just sat there ripping my hair out, it's probably not the best time to put more information in front of me."
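
The mechanism Lepofsky describes could be sketched roughly as follows. This is a hypothetical illustration only: `detect_emotion` is a stand-in for whatever emotion-recognition model a vendor might supply (here it is just a stub), and the labels, threshold, and `should_defer_notifications` helper are all assumptions, not any real product's API.

```python
# Hypothetical sketch: hold back inbox notifications when the webcam
# feed suggests the user is frustrated or tired.

from dataclasses import dataclass

@dataclass
class EmotionReading:
    label: str    # e.g. "neutral", "frustrated", "tired" (assumed labels)
    score: float  # model confidence in [0, 1]

def detect_emotion(frame) -> EmotionReading:
    """Stub standing in for a real emotion-recognition model."""
    return EmotionReading(label="frustrated", score=0.92)

def should_defer_notifications(reading: EmotionReading,
                               threshold: float = 0.8) -> bool:
    # Defer new mail only when a negative state is detected
    # with high confidence.
    negative_states = {"frustrated", "tired"}
    return reading.label in negative_states and reading.score >= threshold

reading = detect_emotion(frame=None)  # a real system would pass a camera frame
if should_defer_notifications(reading):
    print("Deferring new messages until the user's mood improves.")
```

Note that the confidence threshold matters here: acting on low-confidence readings would make the inbox behave erratically, which is arguably worse than no mood awareness at all.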

Or a webinar: Could cameras on participants' laptops help presenters know when the audience is bored or confused, or, by the same token, excited and interested in the content?

Of course, there's an obvious obstacle to all of this, namely the specter of Big Brother. "Nobody wants their camera pointing at them 24 hours a day when they're working," Lepofsky says. "The biggest hurdle is, am I going to allow this to happen as an employee? People don't like someone or something always listening; how would they feel about something always watching? It needs to be implemented as a user-initiated feature."

"And it's not just sentiment," he adds. "Think about making commands via facial recognition. As my eyes dart around the screen, can I move things and be more productive?"

Facial recognition is just part of an even bigger potential use case for sentiment analysis in enterprise apps, Lepofsky adds. Heart rates, breathing patterns, even blinking could be detected and analyzed. 

"It's really about filtering, sorting, prioritizing and understanding new facets of information," Lepofsky says.
