
Facial expression, size, and clutter: Inferences from movie structure to emotion judgments and back

Overview of attention for article published in Attention, Perception & Psychophysics, January 2016

About this Attention Score

  • In the top 5% of all research outputs scored by Altmetric
  • One of the highest-scoring outputs from this source (#3 of 1,312)
  • High Attention Score compared to outputs of the same age (99th percentile)
  • High Attention Score compared to outputs of the same age and source (97th percentile)

Mentioned by

  • News: 19 news outlets
  • Blogs: 3 blogs
  • Twitter: 3 tweeters
  • Peer review sites: 1 peer review site
  • Google+: 1 Google+ user

Citations

  • Dimensions: 11 citations

Readers on

  • Mendeley: 30 readers
Title: Facial expression, size, and clutter: Inferences from movie structure to emotion judgments and back
Published in: Attention, Perception & Psychophysics, January 2016
DOI: 10.3758/s13414-015-1003-5
Pubmed ID:
Authors: James E. Cutting, Kacie L. Armstrong

Abstract

The perception of facial expressions and objects at a distance are entrenched psychological research venues, but their intersection is not. We were motivated to study them together because of their joint importance in the physical composition of popular movies: shots that show a larger image of a face typically have shorter durations than those in which the face is smaller. For static images, we explore the time it takes viewers to categorize the valence of different facial expressions as a function of their visual size. In two studies, we find that smaller faces take longer to categorize than those that are larger, and this pattern interacts with local background clutter. More clutter creates crowding and impedes the interpretation of expressions for more distant faces but not proximal ones. Filmmakers at least tacitly know this. In two other studies, we show that contemporary movies lengthen shots that show smaller faces, and even more so with increased clutter.

Twitter Demographics

The data shown below were collected from the profiles of 3 tweeters who shared this research output.

Mendeley readers

The data shown below were compiled from readership statistics for 30 Mendeley readers of this research output.

Geographical breakdown

Country    Count   As %
Unknown       30   100%

Demographic breakdown

Readers by professional status   Count   As %
Researcher                           7    23%
Student > Bachelor                   3    10%
Lecturer                             3    10%
Student > Ph.D. Student              3    10%
Other                                3    10%
Other                               11    37%

Readers by discipline            Count   As %
Psychology                          11    37%
Computer Science                     5    17%
Unspecified                          4    13%
Neuroscience                         3    10%
Social Sciences                      2     7%
Other                                5    17%

Attention Score in Context

This research output has an Altmetric Attention Score of 178. This is our high-level measure of the quality and quantity of online attention that it has received. This Attention Score, as well as the ranking and number of research outputs shown below, was calculated when the research output was last mentioned on 02 June 2019.
  • All research outputs: #73,693 of 13,172,054 outputs
  • Outputs from Attention, Perception & Psychophysics: #3 of 1,312 outputs
  • Outputs of similar age: #2,735 of 358,980 outputs
  • Outputs of similar age from Attention, Perception & Psychophysics: #1 of 48 outputs
Altmetric has tracked 13,172,054 research outputs across all sources so far. Compared to these, this one has done particularly well and is in the 99th percentile: it's in the top 5% of all research outputs ever tracked by Altmetric.
So far Altmetric has tracked 1,312 research outputs from this source. They receive a mean Attention Score of 3.4. This one has done particularly well, scoring higher than 99% of its peers.
Older research outputs will score higher simply because they've had more time to accumulate mentions. To account for age, we can compare this Altmetric Attention Score to the 358,980 tracked outputs that were published within six weeks on either side of this one in any source. This one has done particularly well, scoring higher than 99% of its contemporaries.
We're also able to compare this research output to 48 others from the same source and published within six weeks on either side of this one. This one has done particularly well, scoring higher than 97% of its contemporaries.