Dual Development of Affective-Speech-Based Emotion Perception.

Published on Aug 23, 2021 in Journal of Genetic Psychology
DOI: 10.1080/00221325.2021.1967270
Shinnosuke Ikeda (Kyoto University)
Abstract
Studies have shown that when interpreting emotions from speech, adults focus on prosody, whereas young children focus on lexical content. However, it remains unclear what kind of socio-emotional processing underlies such emotion perception and how it develops. The present study examined the development of a dual process in emotion perception induced by affective speech in 3- and 5-year-old children. Previous studies have suggested that unconscious emotion perception at the gaze level and conscious judgment of speakers' emotions develop differently. Children were presented with affective speech in which lexical content and prosody were inconsistent (e.g., saying 'thank you' in an angry tone) and were asked to report the speaker's emotion by pointing to the corresponding facial expression (happy or angry). The duration for which children gazed at each facial expression was also measured. The results showed that 3-year-olds judged the speaker's emotion from lexical content more than 5-year-olds, who relied on prosody. At the gaze level, however, both 3- and 5-year-olds looked longer at the facial expression that matched the prosody. These results suggest two distinct processes: unconscious emotion perception, which matches prosody to facial expression, and conscious assessment of the speaker's emotion, which weights lexical content against prosody.