Rana el Kaliouby is on a mission. She wants to bring emotions into the digital technologies that have become so pervasive in our lives. That is the topic of her recently released TED Talk.
“Today’s technology has lots of IQ, but no EQ; lots of cognitive intelligence, but no emotional intelligence,” she says in her opening remarks. “That got me thinking, what if our technology could sense our emotions?”
Kaliouby holds a PhD and specializes in facial recognition and human emotions. She and her team have developed technology that runs on any mobile device and tracks facial expressions, drawing on a vast database of 12 billion data points to determine what you're feeling. The possibilities for this technology are huge, and in her TED Talk she demonstrates how it works by calling an audience member to the stage.
The audience member holds a tablet running the emotion-reading software up to her face. At Kaliouby's command, she smiles, frowns, and furrows her brow, and the software identifies each emotion with pinpoint accuracy.
The program is the product of Kaliouby's 15-year career. What started as an MIT research project is now a startup company called Affectiva. But there are so many possible applications for this software that her company alone cannot address them all, so she is making the technology available to third-party companies. The potential for misuse is high, but the potential benefits far outweigh the dangers.
The data her team collected came from people in 75 countries, and Kaliouby and her team gleaned some fascinating insights from it. In general, women are more expressive than men; according to the data, they smile longer and more often. In the USA, women are 40 percent more expressive than men, yet in the UK, curiously, men and women express emotions in equal measure.
“Perhaps what surprised us the most about this data is that we happen to be expressive all the time, even when we’re sitting in front of our devices alone, and it’s not just when we’re watching cat videos on Facebook,” she says. “We’re expressive when we’re emailing, texting, shopping online, or even doing our taxes.”
Kaliouby imagines a future where all of our devices have emotion chips in them. "What if your watch tracks your mood, or your car knows you're tired?" she asks. "What if your fridge knows when you're stressed so it autolocks to prevent you from binge eating?" This last suggestion earns rousing laughter from the audience.
In the end, Kaliouby says, this technology could give us a new way to connect with machines and with each other. It would let us connect emotionally in an increasingly digital world, an ability that has faded with the rise of social media and the internet.
“As more and more of our lives become digital, we’re fighting a losing battle in trying to curb our usage of devices in order to reclaim our emotions,” she says. “What I’m trying to do instead is to bring emotions into our technology and make our devices more responsive. I want those devices that separated us to bring us back together.”