Saturday, July 27, 2024

Affectiva: Experience the emotion AI

Rana el Kaliouby - CEO and Co-Founder, Affectiva

Efforts to give computers the capacity to read our feelings and react in ways that seem startlingly human began in the 1990s. Today, machines get better every day at digesting vast gulps of information, yet they remain as emotionally inert as ever. Voice researchers have trained computers to identify deep patterns in vocal pitch, rhythm, and intensity, so that their software can scan a conversation between a woman and a child and determine whether the woman is the child's mother, whether she is looking the child in the eye, and whether she is angry, frustrated, or joyful. Other machines measure sentiment by assessing the arrangement of our words or by reading our gestures. Still others do so from facial expressions.

Our faces are organs of emotional communication; by some estimates, we transmit more data with our expressions than with what we say, and a few pioneers are dedicated to decoding this information. Perhaps the most successful is Rana el Kaliouby, an Egyptian scientist living near Boston and the CEO and Co-Founder of Affectiva. At MIT, Rana spearheaded applications of emotion sensing and facial coding. Affectiva is a pioneer in artificial intelligence on a mission to humanize digital interactions by building artificial emotional intelligence, or Emotion AI for short. As our interactions with technology become more conversational and relational, Emotion AI has become a critical component in transforming industries ranging from advertising and automotive to social robotics and healthcare. Ultimately, the company envisions an emotion-aware world, redefining not only how we interact with technology but also how we connect and communicate with one another.

Affdex, her company’s signature software, was simplified to track just four emotional “classifiers”: happy, confused, surprised, and disgusted. The software scans for a face; if there are multiple faces, it isolates each one. It then identifies the face’s main regions (mouth, nose, eyes, eyebrows) and ascribes points to each, rendering the features in simple geometries. Affectiva’s Emotion AI unobtrusively measures unfiltered and unbiased facial expressions of emotion, using any optical sensor or just a standard webcam. The technology first identifies a human face, in real time or in an image or video. Computer vision algorithms locate key landmarks on the face, for example the corners of the eyebrows, the tip of the nose, and the corners of the mouth. Machine learning algorithms then analyze the pixels in those regions to classify facial expressions, and combinations of these facial expressions are mapped to emotions. Affectiva’s products measure seven emotion metrics: anger, contempt, disgust, fear, joy, sadness, and surprise, along with 20 facial expression metrics. Their SDK and API also provide emojis, gender, age, ethnicity, and many other metrics. The algorithms are trained on an emotion data repository that has now grown to more than 5 million faces analyzed in 75 countries, and they are continuously tested to provide the most reliable and accurate emotion metrics; the key emotion classifiers achieve accuracy in the high 90s.
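
The paragraph above outlines a generic Emotion AI pipeline: detect a face, isolate it, analyze the pixels around facial landmarks to classify expressions, and map expression combinations to emotion metrics. The sketch below illustrates that flow in Python with OpenCV. It is not Affectiva's SDK; the classify_expressions stand-in and the uniform scores it returns are hypothetical placeholders, included only to make the steps concrete.

```python
# Minimal sketch of a face-to-emotion pipeline (not Affectiva's actual SDK).
import cv2
import numpy as np

# Step 1: a standard face detector (OpenCV's bundled Haar cascade).
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

# The seven emotion metrics named in the article.
EMOTIONS = ["anger", "contempt", "disgust", "fear", "joy", "sadness", "surprise"]

def classify_expressions(face_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained machine-learning classifier that
    scores expressions from the pixels around facial landmarks."""
    # A real system would run a trained model here; we return uniform scores.
    return np.full(len(EMOTIONS), 1.0 / len(EMOTIONS))

def analyze_frame(frame_bgr: np.ndarray) -> list[dict]:
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    # Step 2: isolate each face found in the frame.
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = gray[y:y + h, x:x + w]
        # Steps 3-4: score expressions and map them to the emotion metrics.
        scores = classify_expressions(face)
        results.append({
            "box": (x, y, w, h),
            "emotions": dict(zip(EMOTIONS, scores.tolist())),
        })
    return results

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)   # any standard webcam, as the article notes
    ok, frame = cap.read()
    if ok:
        print(analyze_frame(frame))
    cap.release()
```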

Like the face, speech contains strong signals of human emotion. Their science team is now working on technology for analyzing a speaker’s tone of voice, which they plan to make available in their products soon.


Company: Affectiva

Management: CEO & Co-Founder – Rana el Kaliouby

Description: Affectiva, an MIT Media Lab spin-off, is the pioneer in Emotion AI, the next frontier of artificial intelligence.
