The excerpt below is from an article in the online journal Medical Express reviewing an intriguing study on emotion processing and identification.
“Machine learning technology is getting really good at recognizing the content of images—of deciphering what kind of object it is,” said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder.
“We wanted to ask: Could it do the same with emotions? The answer is yes.”
Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of “neural networks”—computer systems modeled after the human brain—to the study of emotion.
It also sheds new light on how and where images are represented in the human brain, suggesting that what we see—even briefly—could have a greater, swifter impact on our emotions than we might assume.
“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system,” said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. “We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”