Friday, June 17, 2016

The Technology of the Future Understands How You’re Feeling

by Arige Shrouf
Spring 2016 Intern

While shopping for a computer, I fell in love with a laptop that had facial recognition software. Naturally that laptop went home with me, and the face recognition, which works with the built-in camera, was one of the first things I set up. Instead of asking me for a password or passcode when I turned it on, my laptop would scan my face; within seconds the software would recognize me and the laptop would unlock. In addition to making me feel like the lead in a spy movie, this technology makes logging in much easier when the laptop is in tablet mode. Unfortunately, the software is quite literal in its face recognition. When the lighting is poor, or I make a strange facial expression, or I’m having a bad hair day, my laptop has a hard time recognizing me and I have to resort to typing in my password.

My laptop’s facial recognition failures were the first thing I thought of when I came across the term affective computing. Affective computing enables machines to better understand, and even influence, human emotion, which would in turn make them better at supporting people. The field takes its name from “affect,” the term used to describe the physical signs of a person’s emotions. With this technology, my laptop would recognize the emotions, such as anger or frustration, registering on my face; it would no longer be confused by my changing expressions or refuse to acknowledge that I am actually myself rather than a lookalike trying to break into my computer.

Affective computing could be used for more than just enhancing facial recognition software. Machines with affective computing capabilities could also help humans understand affect. In a classroom, for instance, the technology could analyze students’ affect and alert the teacher as students lose or gain interest in the topic being discussed. It could also be useful to people with autism who have a difficult time deciphering emotions or reactions in others, picking up on social cues that might otherwise be missed or misinterpreted. Thanks to the efforts of researchers around the world, new developments are quickly making affective computing practical in science and medicine, going well beyond simply translating emotions into binary code.

Did You Know?

Affective computing researchers at MIT have worked on several projects, including automatic stress recognition sensors. Worn on the wrist, these biosensors were able to determine the stress levels of nine call center employees in their real-life work environment. While still in its early stages, this technology could potentially help prevent chronic stress, and the risks associated with it, by tracking early indicators.
