Emotional Attention: From Eye Tracking to Computational Modeling

Shaojing Fan, Zhiqi Shen, Ming Jiang, Bryan L. Koenig, Mohan S. Kankanhalli, Qi Zhao

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Attending selectively to emotion-eliciting stimuli is intrinsic to human vision. In this research, we investigate how the emotion-eliciting features of images relate to human selective attention. We create the EMOtional attention dataset (EMOd), a set of diverse emotion-eliciting images, each with (1) eye-tracking data from 16 subjects and (2) image context labels at both the object and scene level. Based on analyses of human perceptions of EMOd, we report an emotion prioritization effect: emotion-eliciting content draws stronger and earlier human attention than neutral content, but this advantage diminishes dramatically after the initial fixation. We also find that human attention is more focused on awe-eliciting and aesthetic vehicle and animal scenes in EMOd. Aiming to model the above attention behavior computationally, we design a deep neural network (CASNet II) that includes a channel weighting subnetwork, which prioritizes emotion-eliciting objects, and an Atrous Spatial Pyramid Pooling (ASPP) structure, which learns the relative importance of image regions at multiple scales. Visualizations and quantitative analyses demonstrate the model's ability to simulate human attention behavior, especially on emotion-eliciting content.
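For readers unfamiliar with the two components named above, the following PyTorch sketch illustrates the general ideas: a channel weighting module (approximated here with a squeeze-and-excitation-style gate) that can emphasize emotion-relevant feature channels, and an ASPP block whose parallel atrous convolutions capture image regions at multiple scales. The layer sizes, dilation rates, and class names are illustrative assumptions, not the published CASNet II configuration.

    # Minimal sketch of channel weighting + ASPP, assuming a generic CNN backbone.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ChannelWeighting(nn.Module):
        """Reweights feature channels so that emotion-relevant channels can be emphasized."""
        def __init__(self, channels: int, reduction: int = 16):
            super().__init__()
            self.fc = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
                nn.Sigmoid(),
            )

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, _, _ = x.shape
            weights = self.fc(x.mean(dim=(2, 3)))   # global average pool -> per-channel weights
            return x * weights.view(b, c, 1, 1)     # rescale each channel

    class ASPP(nn.Module):
        """Parallel atrous convolutions pool context from image regions at multiple scales."""
        def __init__(self, in_channels: int, out_channels: int, rates=(1, 6, 12, 18)):
            super().__init__()
            self.branches = nn.ModuleList(
                nn.Conv2d(in_channels, out_channels,
                          kernel_size=3 if r > 1 else 1,
                          padding=r if r > 1 else 0,
                          dilation=r)
                for r in rates
            )
            self.project = nn.Conv2d(out_channels * len(rates), out_channels, kernel_size=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            features = [F.relu(branch(x)) for branch in self.branches]
            return self.project(torch.cat(features, dim=1))

    # Example: turn backbone features into a single-channel saliency map.
    features = torch.randn(1, 256, 32, 32)          # stand-in for CNN backbone output
    x = ChannelWeighting(256)(features)
    x = ASPP(256, 64)(x)
    saliency = torch.sigmoid(nn.Conv2d(64, 1, kernel_size=1)(x))
    print(saliency.shape)                           # torch.Size([1, 1, 32, 32])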

Original language: English (US)
Pages (from-to): 1682-1699
Number of pages: 18
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 45
Issue number: 2
DOIs
State: Published - Feb 1 2023

Bibliographical note

Publisher Copyright:
IEEE

Keywords

  • Analytical models
  • Benchmark testing
  • Computational modeling
  • Data models
  • Deep learning
  • Human attention
  • Observers
  • Visualization
  • Convolutional neural network
  • Human psychophysics
  • Image sentiment
  • Neuroscience
  • Visual saliency

PubMed: MeSH publication types

  • Journal Article
  • Research Support, Non-U.S. Gov't
