What do the author of The Future of Happiness, a senior director of autonomous driving ecosystems at NVIDIA, and an 18-year-old high school student from Kansas all have in common? They all appeared recently at the first Emotion AI Summit, held at the world-renowned MIT Media Lab.

Sponsored by Affectiva, an emotion AI software development company, the summit featured more than two dozen experts speaking on issues surrounding the burgeoning ability of AI to recognize human emotions. How will it affect our work, play, health, and yes, even our happiness and wellbeing?

Amy Blankson, the aforementioned author, spoke of testing 400—yes four hundred—different devices and apps to monitor mood, as part of her research on the future of happiness.
“The market is huge,” she asserted, “and it’s useful to have somebody who is knowledgeable to sort it out, to tell you what’s real and what’s useful.”

If that surprises you, consider Erin Smith, a high school senior from Overland Park, Kansas. She stood up in front of 300 business and technology thought leaders and presented with remarkable confidence and authority. It’s not surprising she won a BioGenius award for her app, which reads human facial expressions to predict the onset of Parkinson’s disease years before overt symptoms appear.

But Affectiva co-founder Rosalind Picard summed it up best. “I thought this was an inspiring day,” she said. “It brought together many of the people who are working hard to build the future of AI in a way that shows more respect for humanity.”

Emotion AI Synthesis

Would you rather steal or starve? The answer to that existential question is obvious to any sane human being; just ask Jean Valjean. But Emoshape CEO and founder Patrick Levy-Rosenthal sees the question in a far different light than the protagonist of Les Misérables.

“By the end of this century,” Levy-Rosenthal asserts, “humans will talk more to sentient machines than to other humans.” For anything close to that to happen, artificial intelligence will need to do more than just identify human emotion. It will need to understand it and output appropriate affective responses. For a question like steal or starve, it will need to grasp the moral and emotional nuances involved in things like guilt and existential angst. And that’s just what Emoshape aims to do with its new EPU II chip (EPU stands for Emotion Processing Unit), which the company touts as the world’s first emotion synthesis engine. Emoshape claims it can process 64 trillion emotional states in a fraction of a second and deliver an appropriate affective response with 86 percent accuracy. I think I want one of those; that’s almost fast enough to figure out my wife.


This has been an excerpt from the Nov-Dec 2017 issue of the Age of Robots magazine.