A computer that understands how you feel
by Staff Writers
Boulder CO (SPX) Jul 31, 2019

Kragel is combining machine learning with brain imaging to learn more about how images impact emotions.

Could a computer, at a glance, tell the difference between a joyful image and a depressing one?

Could it distinguish, in a few milliseconds, a romantic comedy from a horror film?

Yes, and so can your brain, according to research published this week by University of Colorado Boulder neuroscientists.

"Machine learning technology is getting really good at recognizing the content of images - of deciphering what kind of object it is," said senior author Tor Wager, who worked on the study while a professor of psychology and neuroscience at CU Boulder. "We wanted to ask: Could it do the same with emotions? The answer is yes."

Part machine-learning innovation, part human brain-imaging study, the paper, published Wednesday in the journal Science Advances, marks an important step forward in the application of "neural networks" - computer systems modeled after the human brain - to the study of emotion.

It also sheds new light on how and where images are represented in the human brain, suggesting that what we see - even briefly - could have a greater and swifter impact on our emotions than we might assume.

"A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system," said lead author Philip Kragel, a postdoctoral research associate at the Institute of Cognitive Science. "We found that the visual cortex itself also plays an important role in the processing and perception of emotion."

The Birth of EmoNet
For the study, Kragel started with an existing neural network, called AlexNet, which enables computers to recognize objects. Using prior research that identified stereotypical emotional responses to images, he retooled the network to predict how a person would feel when they see a certain image.

He then "showed" the new network, dubbed EmoNet, 25,000 images ranging from erotic photos to nature scenes and asked it to categorize them into 20 categories such as craving, sexual desire, horror, awe and surprise.

EmoNet could accurately and consistently categorize 11 of the emotion types. But it was better at recognizing some than others. For instance, it identified photos that evoke craving or sexual desire with more than 95 percent accuracy. But it had a harder time with more nuanced emotions like confusion, awe and surprise.

Even a simple color elicited a prediction of an emotion: When EmoNet saw a black screen, it registered anxiety. Red conjured craving. Puppies evoked amusement. If there were two of them, it picked romance. EmoNet was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong it might be.

When the researchers showed EmoNet brief movie clips and asked it to categorize them as romantic comedies, action films or horror movies, it got it right three-quarters of the time.
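One plausible way to run a test like that - offered here only as an illustration, not the published method - is to average per-frame emotion predictions from an EmoNet-style model over a clip and let a simple classifier map that averaged "emotion profile" to a genre. The data below are random stand-ins.

    # Hypothetical sketch: clip-level genre prediction from frame-level
    # emotion probabilities. All data here are random placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    NUM_EMOTIONS = 20
    GENRES = ["romantic comedy", "action", "horror"]

    def clip_emotion_profile(frame_probs: np.ndarray) -> np.ndarray:
        """Average per-frame emotion probabilities (frames x 20) into one vector."""
        return frame_probs.mean(axis=0)

    # Stand-in data: 60 clips, each 30 frames of softmax-like outputs.
    X = np.stack([clip_emotion_profile(rng.dirichlet(np.ones(NUM_EMOTIONS), size=30))
                  for _ in range(60)])
    y = rng.integers(0, len(GENRES), size=60)

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("toy accuracy on random data:", clf.score(X, y))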

What You See Is How You Feel
To further test and refine EmoNet, the researchers then brought in 18 human subjects.

As a functional magnetic resonance imaging (fMRI) machine measured their brain activity, they were shown 4-second flashes of 112 images. EmoNet saw the same pictures, essentially serving as the 19th subject.

When activity in the neural network was compared to that in the subjects' brains, the patterns matched up.

"We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so," said Kragel.

The brain imaging itself also yielded some surprising findings. Even a brief, basic image - an object or a face - could ignite emotion-related activity in the visual cortex of the brain. And different kinds of emotions lit up different regions.

"This shows that emotions are not just add-ons that happen later in different areas of the brain," said Wager, now a professor at Dartmouth College. "Our brains are recognizing them, categorizing them and responding to them very early on."

Ultimately, the researchers say, neural networks like EmoNet could be used in technologies that help people digitally screen out negative images or find positive ones. They could also be applied to improve human-computer interaction and to advance emotion research.

The takeaway for now, says Kragel:

"What you see and what your surroundings are can make a big difference in your emotional life."



