Scientists at Carnegie Mellon University say they have figured out how to identify people's emotions through their brain activity. It is, they say, the first time such identification has been possible.
In their new study, they combined functional magnetic resonance imaging, or fMRI, with machine learning, a set of techniques for finding patterns in large data sets, to measure brain signals that pinpoint which emotions a subject is experiencing.
"The research introduces a new method with potential to identify emotions without relying on people's ability to self-report," said Karim S. Kassam, the study's lead author and an assistant professor who holds a Ph.D. in psychology from Harvard University.
The study was co-authored by Amanda R. Markey and George Loewenstein of the Department of Social and Decision Sciences, and Vladimir L. Cherkassky and Marcel Adam Just of the Department of Psychology. It was funded by the National Institute of Mental Health and was published Wednesday in PLOS ONE, a peer-reviewed, open-access journal.
The participants were 10 actors from CMU's drama school, who, presumably, would be good at running the gamut of emotions. Researchers scanned their brains while the subjects viewed the words for various feelings -- anger, disgust, envy, fear, happiness, lust, pride, sadness and shame.
Inside the fMRI scanner, the actors were told to enter each of these emotional states multiple times in random order. Researchers used the subjects' neural activation patterns in early scans to gauge the emotions felt by the same participants in later scans.
In all cases, the computer models identified the correct emotion far better than random guessing, and in some cases were correct as much as 60 to 80 percent of the time.
In a second phase, to ensure they were measuring real emotion and not the act of trying to induce it, researchers gave participants photos of neutral and disgusting content they hadn't seen before. The computer model used patterns it learned in the earlier phase to correctly identify the emotional content of the photos based on the viewers' brain activity.
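In machine-learning terms, the procedure the researchers describe is supervised classification: activation patterns from early scans serve as labeled training data, and later scans are matched against what the model learned. A minimal sketch of one simple version of this idea, nearest-centroid classification, is below; the three-value "activation patterns" and labels are invented for illustration and are not the study's actual data or pipeline:

```python
import math

def train_centroids(patterns, labels):
    """Training phase: average the activation vectors recorded for each emotion label."""
    sums, counts = {}, {}
    for vec, label in zip(patterns, labels):
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(centroids, vec):
    """Testing phase: assign a new scan to the emotion whose average pattern is nearest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Toy 3-voxel "activation patterns" from early scans, labeled by the induced emotion.
train_vecs = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1], [0.1, 0.9, 0.2], [0.0, 0.8, 0.3]]
train_labels = ["happiness", "happiness", "anger", "anger"]

centroids = train_centroids(train_vecs, train_labels)
# A later scan whose pattern resembles the "happiness" training scans:
print(classify(centroids, [0.85, 0.15, 0.05]))
```

Real fMRI decoding uses far higher-dimensional data and more sophisticated classifiers, but the train-then-test logic is the same.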
Among the researchers' findings:
• Even though people differ psychologically, they tend to encode emotions neurally in remarkably similar ways.
• Emotional signatures aren't limited to specific brain regions but produce characteristic patterns throughout a number of brain regions.
• The model rarely confused negative and positive emotions. It was best at identifying happiness and least accurate at pinpointing envy.
• It was least likely to confuse lust with any other emotion, suggesting that lust produces a distinct pattern of neural activity.
Researchers undertook the study with two goals in mind, Mr. Kassam said.
"One, to try to understand the nature of emotion. When one person feels happy, does it look in the brain the same way as when another is happy?"
The answer in this study, he said, is yes.
"There is an assumption that emotions like anger are hard-wired. Our studies can't say that, but they do say there's a common neural response that is supportive of evolved emotional response."
The second goal, he said, was finding a more reliable way to gauge emotion.
"It's hard to measure people's emotions. You can do it with facial expressions, physiological reactions like blood pressure and heart rate, or just by asking them. But they all have shortcomings. Sometimes people don't want to say how they feel, or they're not able to. This sets up a new technique for judging emotional reaction to any type of stimulus, for example a flag, a brand name or a political candidate."
So it's primarily a marketing tool? Is this the next step in companies reading our minds in order to sell us stuff?
"Well," he said, "people have to voluntarily go through a scanner. We're not talking about someone walking under a plate that tells them how they're feeling. We're nowhere close to that at this point. Could it go there? It's not happening in the next 10 years."
What about reading emotions that people are trying to suppress or hide?
"We're looking to go there next," he said.
Sally Kalson can be reached at firstname.lastname@example.org or 412-263-1610. First Published June 19, 2013 5:30 PM