AI’s Next Big Step: Detecting Human Emotion and Expression

Emotionally intelligent bots may soon understand how you feel when you talk to them and reply with empathy. Is this the future?

Alex Kantrowitz at The Big Technology: The AI field has made remarkable progress with incomplete data. Leading generative models like Claude, Gemini, GPT-4, and Llama can understand text but not emotion. These models can’t process your tone of voice, rhythm of speech, or emphasis on words. They can’t read your facial expressions. They are effectively unable to process any of the non-verbal information at the heart of communication. And to advance further, they’ll need to learn.

Though much of the AI sector is currently focused on making generative models larger via more data, compute, and energy, the field’s next leap may come from teaching emotional intelligence to the models. The problem is already captivating Mark Zuckerberg and attracting millions in startup funding, and there’s good reason to believe progress may be close.

“So much of the human brain is just dedicated to understanding people and understanding your expressions and emotions, and that’s its own whole modality, right?” Zuckerberg told podcaster Dwarkesh Patel last month. “You could say, okay, maybe it’s just video or an image. But it’s clearly a very specialized version of those two.”

One of Zuckerberg’s former employees might be the furthest along in teaching emotion to AI. Alan Cowen, CEO of Hume AI, is a former Meta and Google researcher who’s built AI technology that can read the tune, timbre, and rhythm of your voice, as well as your facial expressions, to discern your emotions.

More here.
