Scent of Anxiety and the Arousal of Empathy

I was reading New Scientist earlier today, and two different articles caught my attention. The first, called "Emotional robots: Will we love them or hate them?", was about the development of gadgets that could read human emotions by interpreting facial cues, monitoring the quality and speed of voices, or tracking things like heart and breathing rate. It goes on to describe car alarms that jolt sleepy drivers, monitors that diagnose depression, and a computerized tutor that could monitor student frustration and slow down instruction.

The article also describes how computers can be programmed to read facial expressions accurately enough to recognize six basic emotions (disgust, happiness, sadness, anger, fear, and surprise) nine times out of ten. To read emotions even more accurately, computers will need extra cues such as head motion and upper body position. So-called facial tracking technology has already been used to analyze the differences between real smiles and fake smiles, and facial expression software has proven more accurate than actual humans at determining whether someone is in pain: in one study, the software could tell whether someone was really in pain 88 percent of the time, while the untrained volunteers asked to participate were right only 49 percent of the time.
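Out of curiosity, I tried to imagine in the roughest possible terms what that kind of classification might look like. The little Python sketch below is purely hypothetical: the feature names, the prototype values, and the nearest-prototype rule are stand-ins I made up for illustration, not anything from the article or from real facial tracking software, which would learn its model from thousands of labeled examples.

    # Toy sketch: assign a facial "feature vector" to one of the six basic emotions.
    # Everything here is made up for illustration -- the features, the prototype
    # values, and the nearest-prototype rule stand in for what a trained system
    # would learn from labeled data.

    import math

    # Hypothetical prototypes: illustrative intensities of
    # (brow_raise, brow_furrow, eye_widen, nose_wrinkle, lip_corner_pull, jaw_drop)
    PROTOTYPES = {
        "happiness": (0.2, 0.0, 0.1, 0.0, 0.9, 0.2),
        "sadness":   (0.1, 0.6, 0.0, 0.0, 0.0, 0.1),
        "anger":     (0.0, 0.9, 0.4, 0.2, 0.0, 0.1),
        "fear":      (0.8, 0.3, 0.9, 0.0, 0.0, 0.5),
        "surprise":  (0.9, 0.0, 0.9, 0.0, 0.1, 0.8),
        "disgust":   (0.0, 0.4, 0.0, 0.9, 0.0, 0.1),
    }

    def classify(features):
        """Return the emotion whose prototype is closest in Euclidean distance."""
        def distance(proto):
            return math.sqrt(sum((f - p) ** 2 for f, p in zip(features, proto)))
        return min(PROTOTYPES, key=lambda emotion: distance(PROTOTYPES[emotion]))

    # Example: wide eyes, raised brows, dropped jaw -- closest prototype is "surprise"
    print(classify((0.85, 0.1, 0.95, 0.0, 0.05, 0.7)))

Even a toy like this makes the point that the computer is only matching numbers to labels; there is no understanding, and certainly no empathy, behind the answer it gives.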

The second article I read was about how the scent of anxiety affects the human brain, lighting up areas that process social and emotional signals and are thought to be involved in empathy. The study was done on students taking exams.

As I was reading these two articles, I began to think. Emotion-sensing software has some frightening possible applications, and the computers will not have the benefit of empathy. What if the untrained volunteers who guessed wrong about whether someone was in pain chose to assume the person was in pain so that they could procure treatment for them? What if they were attempting to alleviate suffering by going with the safer bet and saying the person was in pain so they could get help? The study doesn’t give the details of how the question was posed.

Further, the computer will be able to report a person’s emotional state to whoever wants the information. What if this technology were used in airports under the guise of stopping terrorism? What if it were installed in classrooms to ensure that another Columbine or Virginia Tech type of massacre did not happen? Would having this technology in place be beneficial? Or would it violate the idea of someone being innocent until proven guilty? Do we want not only to be spied upon but to have our emotions read without the benefit of empathy or context? What use for this technology could not be justified? Stores could have it installed to prevent shoplifting. Workplaces, to ensure productivity. Homes, under the guise of protecting our health and mental well-being. And once that information was collected, where would it go?

Our ability to smell another person’s anxiety makes us empathetic to their situation, and it possibly evolved as a way for one person to subtly cue others to the possibility of danger: a complex mechanism that very well might have come about to bring humans closer together and ensure our survival. How will we ensure that the development and application of computers that can read our emotions and relay this information will have similar benefits?