Bae, Rhee, Hwang, Son, Bae, and Han: Correlations Between Psychological Status and Perception of Facial Expression

Abstract

Objective

Facial affect recognition is associated with neuropsychological status and psychiatric diseases. We hypothesized that facial affect recognition is associated with psychological status and perception of other affects.

Methods

A total of 80 images depicting facial affect, including 20 Neutral, 20 Angry, 20 Fear, and 20 Sad images, were screened for use in our research. A total of 100 healthy individuals were asked to rate these images using a 10-point Likert scale and to complete psychological scales assessing emotional status and cognitive function.

Results

Aggression, attention, and impulsivity may have been associated with the participants’ ratings of the Angry facial expressions, which were often rated as Fear. The Fear images were frequently rated as Angry or Sad. For the Sad facial expressions, attention and impulsivity were associated with the ratings, and the Sad expressions were often rated as Angry or Fear.

Conclusion

The participants’ psychological status was significantly correlated with their interpretation of facial affects. In particular, attention was often correlated with incorrect affect ratings, and attention and impulsivity could affect the rating of the Sad facial expressions.

INTRODUCTION

Studies on facial affect recognition have been conducted for several decades and have been instrumental in gaining insights into cognition and emotion, as well as in influencing the design of computational models and perceptual interfaces [1-3]. Historically, many studies have employed six facial expressions, namely happiness, sadness, anger, disgust, fear, and surprise, when testing human emotional perception [4,5]. Other studies have used more or fewer facial expressions [6,7]. Of these, happiness was recognized more easily than the other emotions [8]. In a meta-analysis of emotional expressions, McKasy [9] reported that anger did not have a significant effect on depth of information processing when compared to other emotions, including neutrality, sadness, happiness, and fear. Anger is defined as a strong unpleasant emotion due to interfering obstacles or disparaging offenses against oneself or another [10,11]. Compared to sadness, anger has a more obvious target of blame and accountability [12]. Fearful facial expressions have been reported to be more visually salient than other facial expressions [13]. Based on the results of these studies, we hypothesized that the differentiation among Angry, Fear, and Sad facial expressions could provide insight into human cognition and emotion.
A facial affect recognition deficit is thought to be due to individual emotional statuses such as depression, anxiety, and aggression [1,14,15], as well as to cognitive factors of attention and impulsivity. Demenescu et al. [14] reported that adults with anxiety disorders or major depressive disorder found it difficult to recognize facial expressions. Alharbi et al. [1] suggested that affective factors, including depression and anxiety, could predict individual differences in emotional recognition. In a multicohort longitudinal study, Acland et al. [15] reported that negative emotion recognition was associated with higher concurrent aggression.
In addition to emotional factors, difficulties in facial affect recognition are associated with cognitive impairments, including attention and impulsivity [16-20]. In a review of facial emotional recognition in adolescents with attention deficit hyperactivity disorder (ADHD), Dan [16] reported that adolescents with ADHD found the recognition of facial expressions difficult due to differences in their brain activity. Löytömäki et al. [17] stated that delayed emotional recognition in patients with ADHD is associated with the linguistic and cognitive skills required for selective intervention procedures. Faces provide multidimensional visual stimuli and a broad range of information, including identity, gender, age, race, mood, and intentions [21]. Several studies have suggested that impulsivity could affect the response to emotional face stimuli, including happy, angry, and sad faces [15]. However, few studies have examined the correlation between cognitive function and emotional perception in healthy individuals, making the present study one of the first to do so.
We hypothesized that facial affect recognition would be affected by participants’ emotional status, including depression, anxiety, and aggression, as well as by cognitive functions, including attention and impulsivity. Additionally, we hypothesized that one facial affect could be perceived as another, influenced by individual emotional and cognitive factors.

METHODS

Participants and study procedure

The target sample size was determined using Cohen’s d [22], with an effect size of 0.20 and a power of 0.95. We planned to recruit 100 participants using flyers and the web bulletin board service of Chung-Ang University. This study was approved by the Institutional Review Board of Chung-Ang University (IRB number: 1041078-202008-HRBM-231-01). All participants provided written informed consent.
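The paper does not report which test family or software was used for this calculation; purely as an illustration of an a priori power analysis based on Cohen’s d, a sketch in R using the pwr package (an assumption, not cited in the paper) could look like the following.

# Illustrative a priori power calculation only; the test family ("one.sample")
# and the pwr package are assumptions, so the resulting n is not the study's n.
library(pwr)
pwr.t.test(d = 0.20, power = 0.95, sig.level = 0.05,
           type = "one.sample", alternative = "two.sided")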
A total of 103 participants were recruited based on the following criteria: 1) participants must be at least 18 years of age and 2) must not have a history of psychiatric disease, such as schizophrenia or other psychotic disorders, intellectual disability, other mental disorders, or neurological disease. After screening with the Mini International Neuropsychiatric Interview (MINI) and an interview with a psychiatrist (DHH), three participants were excluded: two because of major depressive disorder and one because of substance dependence. Therefore, data from a total of 100 participants were used in the analyses (Figure 1).
After screening for psychiatric comorbidities and completing surveys for psychological status, all participants were asked to rate facial affects in response to images depicting facial expressions, including Neutral, Angry, Fear, and Sad.

Psychiatric comorbidity screening and psychological status assessment

Psychiatric comorbidities were screened using the Korean version of the MINI. The MINI is a semi-structured diagnostic interview that is generally used to assess the presence of co-occurring mental disorders [23,24].
Before rating the facial expression images, all participants were asked to complete psychological surveys in order to assess the emotional status of depression, anxiety, and aggression, as well as cognitive functions of attention and impulsivity (Figure 1).
Depression was assessed using the Beck Depression Inventory II (BDI-II) [25]. The BDI-II is a 21-item self-report inventory used to assess the severity of depression. Each item is rated on a 4-point Likert-type scale ranging from 0 to 3, with a total score ranging from 0 to 63. The Korean version of the BDI-II has good internal consistency (Cronbach’s alpha=0.89) [26]. Anxiety symptoms were assessed using the Beck Anxiety Inventory (BAI). The BAI is a 21-item self-report inventory used to assess anxiety severity. Each item is rated on a 4-point Likert-type scale ranging from 0 to 3, with a total score ranging from 0 to 63. The Korean version of the BAI has good internal consistency (Cronbach’s alpha=0.95) [27]. Impulsivity was assessed using the Barratt Impulsiveness Scale-11 (BIS-11), which consists of 30 items rated on a 4-point scale ranging from 1 to 4 [28]. The Korean version of the BIS-11 has good internal consistency (Cronbach’s alpha=0.78) [29].
Attention problems were assessed using the Korean version of the Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale (K-ASRS). The total K-ASRS score ranges from 0 (best) to 72 (worst). The K-ASRS questions are divided into two sections: A (six questions) and B (12 questions). Four or more positive answers in Section A indicate symptoms consistent with adult ADHD [30,31]. Aggression was measured using the Buss-Perry Aggression Questionnaire (AQ) [32]. The AQ consists of 29 items assessing overall aggression across four subscales: physical aggression, verbal aggression, anger, and hostility. The Korean version of the AQ also has good internal consistency (Cronbach’s alpha=0.87) [33].
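As a minimal sketch of the Section A screening rule described above (the data frame and column names are hypothetical, and which response levels count as "positive" is assumed to follow the ASRS scoring key, which is not reproduced here):

# Hypothetical Section A responses: one row per participant, logical columns a1..a6
# indicating whether each item was answered at or above its ASRS scoring-key threshold.
section_a <- data.frame(a1 = c(TRUE, FALSE), a2 = c(TRUE, TRUE), a3 = c(FALSE, TRUE),
                        a4 = c(TRUE, FALSE), a5 = c(TRUE, FALSE), a6 = c(FALSE, TRUE))

# Four or more positive answers in Section A flag a positive ADHD screen
section_a$n_positive <- rowSums(section_a[, paste0("a", 1:6)])
section_a$screen_positive <- section_a$n_positive >= 4
section_a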

Rating facial images of emotion

Eighty images depicting facial expressions were screened for our study: 20 Neutral (N), 20 Angry (A), 20 Fear (F), and 20 Sad (S) facial images. All facial affect images were randomly selected from the neutral, angry, fear, and sad categories of the 176 Korean facial expressions [34] and the 259 extended ChaeLee Korean facial expressions [35]. Using a 10-point Likert scale, the 100 healthy participants were asked to rate three emotions for each image depicting a facial expression. For example, participants rated anger, fear, and sadness in response to an “angry” facial affect image.
The facial expression images were presented in 20 blocks. Each block contained four facial expressions (N, A, F, and S), with one of the 20 images from each category assigned to each block. The presentation order of the categories varied across blocks as follows: N-A-F-S, N-A-S-F, N-F-A-S, N-S-A-F, N-F-S-A, N-S-F-A, A-F-S-N, A-S-F-N, A-N-S-F, A-N-F-S, A-S-N-F, A-F-N-S, F-N-A-S, F-N-S-A, F-A-N-S, F-A-S-N, F-S-A-N, S-A-F-N, A-F-S-N, and S-F-A-N. Each image (5×7 cm) was shown to the participant for three seconds, and the participant then had three seconds to rate it. A total of 480 seconds was required to rate all 80 images in the four categories.
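As a rough illustration of this block structure, the following R sketch builds a presentation schedule from the listed orders; the assignment of specific image numbers to blocks is an assumption made only for illustration.

# Hypothetical reconstruction of the presentation schedule: 20 blocks, each containing
# one Neutral (N), Angry (A), Fear (F), and Sad (S) image in the block-specific order.
category_names <- c(N = "Neutral", A = "Angry", F = "Fear", S = "Sad")
orders <- c("N-A-F-S", "N-A-S-F", "N-F-A-S", "N-S-A-F", "N-F-S-A", "N-S-F-A",
            "A-F-S-N", "A-S-F-N", "A-N-S-F", "A-N-F-S", "A-S-N-F", "A-F-N-S",
            "F-N-A-S", "F-N-S-A", "F-A-N-S", "F-A-S-N", "F-S-A-N", "S-A-F-N",
            "A-F-S-N", "S-F-A-N")

schedule <- do.call(rbind, lapply(seq_along(orders), function(b) {
  cats <- category_names[strsplit(orders[b], "-")[[1]]]
  data.frame(block = b, position = 1:4, category = cats,
             image = paste(cats, b),   # assumed mapping of image numbers to blocks
             display_s = 3, rating_s = 3)
}))

head(schedule)   # 80 rows in total; 80 x (3 s + 3 s) = 480 s of rating time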
If participants did not respond within three seconds, the trial timed out and was discarded from the analyses. Participants underwent 10 minutes of response training to reduce the percentage of discarded trials. Of the 8,000 trials (80 trials for each of the 100 participants), 38 (0.48%) timed out and were discarded from the analyses.

Data control and statistics

Linear mixed-effects models were used to estimate the effects of participants’ psychological status on the rating scores, with 95% confidence intervals, after adjusting for the effect of participants’ sex. Each facial expression category was compared with the neutral face, which served as the reference. In addition, the rating scores for each image were fitted using the estimated coefficients of the linear mixed-effects models. All tests were two-sided, and differences were considered statistically significant at a significance level of 0.05. All statistical analyses were performed using the lmer function of the lme4 package in R software (version 3.6.3; R Foundation for Statistical Computing, Vienna, Austria).
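A minimal sketch of such a model for the Angry rating dimension, assuming the lme4 package and hypothetical long-format variable names (the exact variable coding and random-effects structure are not reported in the paper); the Fear and Sad rating models would be analogous.

library(lme4)

# Hypothetical long-format data `ratings`: one row per participant x image, with
# angry_rating (10-point Likert response), sex, kasrs, bis11, aq, bdi, bai,
# category (image category, "Neutral" as reference), and subject (participant ID).
ratings$category <- relevel(factor(ratings$category), ref = "Neutral")

fit_angry <- lmer(angry_rating ~ sex + kasrs + bis11 + aq + bdi + bai + category +
                    (1 | subject),
                  data = ratings)

summary(fit_angry)   # fixed-effect estimates
confint(fit_angry)   # 95% confidence intervals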

RESULTS

Demographic and psychological characteristics of the participants

The clinical characteristics and psychological status of the participants are presented in Table 1. The sample comprised 78.0% males and 22.0% females. The mean age of the participants was 22.9±2.6 years, and the mean educational duration was 14.5±1.7 years.

Effects of psychological status on the rating of facial emotional expressions

In response to the fearful facial expressions, aggression (emotional status) and attention (cognitive function) were associated with participants’ ratings (Table 2). After controlling for psychological status, the fearful facial expressions were also rated as anger or sadness; that is, the participants tended to interpret the images depicting Fear as Angry or Sad.
In response to the Sad facial expressions, attention and impulsivity were associated with the participants’ ratings (Table 2). After controlling for psychological status, the Sad facial expressions were also rated as anger or fear; the participants tended to interpret the images depicting Sad as Angry or Fear. In summary, after controlling for psychological status, fearful and sad facial expressions could be interpreted as other emotions.

Fitted rating scores of facial emotion expression images

Among the neutral facial expression images, Neutral 11 had the lowest fitted rating scores, while Neutral 1 had the highest fitted rating scores in the Angry, Fear, and Sad ratings. Among the facial expression images depicting anger, Angry 13 had the highest fitted rating score and Angry 6 the lowest in the Angry group. Among the images depicting fear, Fear 18 had the highest fitted rating score and Fear 7 the lowest in the Fear group. Among the images depicting sad facial expressions, Sad 2 had the highest fitted rating score and Sad 11 the lowest in the Sad group (Table 3).
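A minimal sketch, continuing the hypothetical lmer fit from the Methods section, of how per-image fitted rating scores such as those in Table 3 could be summarized; the `image` column, the `fit_angry` object, and the assumption that no rows were dropped during fitting are all illustrative.

# Per-image mean fitted Angry rating, assuming ratings$image labels each picture
# (e.g., "Angry 13") and fit_angry is the model sketched above.
ratings$fitted_angry <- fitted(fit_angry)
angry_by_image <- aggregate(fitted_angry ~ image, data = ratings, FUN = mean)
angry_by_image[order(-angry_by_image$fitted_angry), ]   # highest fitted scores first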

DISCUSSION

In the present study, participants’ depressive mood and anxiety were not linked to the rating of facial emotional expressions. This differs from the results of previous studies [1,2,14]. Many studies have suggested that patients with depression and anxiety tend to gravitate toward depressive or anxious facial emotional expressions [1,2,14]. In a longitudinal study on recognition thresholds, Mei et al. [36] reported that individuals with subthreshold depression exhibited increased perceptual sensitivity toward sad expressions, which was associated with their current depressive states. Our study differs from previous studies in that we recruited healthy subjects after screening for psychiatric diseases, whereas previous studies recruited patients with depression, anxiety disorders, and ADHD. The emotional status of aggression could have affected the participants’ ratings of fearful expressions in the current study, which is consistent with a previous report: Acland et al. [15] found that negative emotions, including sadness and fear, were concurrent with an aggressive emotional status in healthy children.
The cognitive function of the participants was significantly correlated with their interpretation of the facial affects. Attention, in particular, was correlated with affect ratings. Attention and aggression levels may have affected the ratings of fearful facial expressions in the present study. Attention and impulsivity may have affected ratings of sad facial expressions.
These results are in line with those of previous studies on the correlation between facial expression and attention [37,38]. The attention mechanism is thought to play a crucial role in human emotion perception, including feature extraction and artifact removal [39]. The saliency and meaning of facial emotional expressions can facilitate conscious perception in healthy subjects [40].
Additionally, the emotional and motivational value of social signals derived from facial expressions may be associated with the attention system [41]. Faces are thought to be special objects with social significance and innate salience [42]. When several facial emotional expressions compete for awareness, fearful expressions have a sensory advantage and are the most salient to human vision [6]. Bertini and Làdavas [43] suggested that fear-related signals are prioritized in the visual system. In a previous systematic review, fear was the facial expression that patients with ADHD were least likely to recognize [44]. Pessoa et al. [13] stated that fearful facial expressions are more salient to human vision than other facial expressions.
The core deficits of facial expression recognition in ADHD might be caused by a failure to correctly interpret affects due to inattention or impulsivity [37]. Deficits in sustained attention and inhibition in ADHD are thought to dysregulate emotional facial perception processing [45]. In fact, aggression and impulsivity were associated with Fear and Sad facial expressions in the present study. In a review of emotional dysregulation in ADHD, van Stralen [46] stated that executive function deficits may be associated with inappropriate internalized (sadness) or externalized (aggression) emotional responses.
However, whether abnormal executive function in subjects with ADHD can cause deficits in emotional recognition remains controversial [47]. Petroni et al. [48] suggested that these two capabilities may be separate from each other at the clinical level; however, they are linked at the neural level.
In the present study, participants were more likely to interpret facial expressions as emotions that they had previously felt. However, images depicting Fear could be rated as Angry or Sad, while pictures depicting a Sad facial expression could be rated as Angry or Fear. Even after controlling for emotional status and cognitive function, healthy individuals could misinterpret facial expressions as other emotions. Shioiri et al. [49] reported misinterpretations of emotional facial expressions: sadness and anger were misinterpreted as disgust, and fear was misinterpreted as surprise. Misinterpretation of facial expressions has usually been reported to be associated with cultural background and emotional intensity [17]. However, the present study suggests that significant misinterpretation of facial expressions can occur even when cultural background and emotional intensity are held constant. Based on these results, we suggest that researchers consider participants’ psychological status, including emotional status and cognitive function, as well as the possibility of misinterpretation of facial expressions.
The present study has several limitations. First, the small number of participants and unbalanced sex distribution are insufficient to generalize the results, although we considered them in the statistical analyses. Second, in the present study, we did not perform thorough standardized cognitive function tests to assess attention and intelligence. Future studies should include a larger number of participants, a more balanced sex distribution, and cognitive function tests.
In conclusion, our findings suggest that the interpretation of facial expressions can be affected by psychological status and that one facial affect can be misinterpreted as another. Researchers should consider these factors when planning facial expression studies.

Notes

Availability of Data and Material

The datasets generated or analyzed during the study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors have no potential conflicts of interest to disclose.

Author Contributions

Conceptualization: Doug Hyun Han, Young Don Son. Data curation: Sujin Bae. Formal analysis: Beom Seuk Hwang, Eunhee Rhee. Funding acquisition: Doug Hyun Han, Young Don Son. Investigation: Doug Hyun Han. Methodology: Doug Hyun Han, Sujin Bae. Project administration: Eunhee Rhee. Validation: Sujin Bae, Ji Hyun Bae. Writing—original draft: Doug Hyun Han, Young Don Son. Writing—review & editing: Doug Hyun Han, Sujin Bae.

Funding Statement

This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (NRF-2020R1A4A1019623).

ACKNOWLEDGEMENTS

We acknowledge the contributions of colleagues, institutions, and agencies that aided the efforts of the authors.

Figure 1.
Diagram for research processing. MINI, Mini International Neuropsychiatric Interview; MDD, major depressive disorder; SUD, substance use disorder.
Table 1.
Demographic and psychological characteristics of the participants
Variable Value
Sex
 Male 78 (78.0)
 Female 22 (22.0)
Age (yr) 22.9±2.6
Education (yr) 14.5±1.7
Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale 7.3±6.4
Barratt Impulsiveness Scale-11 63.5±7.2
The Buss-Perry Aggression Questionnaire 51.9±13.3
Beck Depression Inventory II 10.7±8.8
Beck Anxiety Inventory 5.6±8.2

Values are presented as number (%) or mean±standard deviation

Table 2.
Effects of psychological status on the rating of facial emotions
Predictor | Angry face: estimate (95% CI), p | Fear face: estimate (95% CI), p | Sad face: estimate (95% CI), p
Intercept | -2.65 (-5.56 to -0.27), 0.075 | -3.86 (-7.56 to -0.15), 0.041 | -2.65 (-5.75 to 0.46), 0.032
Sex | 0.30 (-0.45 to 1.04), 0.434 | 0.32 (-0.64 to 1.27), 0.516 | 0.56 (-0.23 to 1.36), 0.165
K-ASRS | -0.05 (-0.10 to 0.01), 0.079 | -0.10 (-0.17 to 0.03), 0.006* | -0.07 (-0.13 to -0.01), 0.023*
BIS-11 | 0.04 (-0.00 to 0.09), 0.060 | 0.04 (-0.00 to 0.11), 0.051 | 0.05 (0.00 to 0.09), 0.045*
AQ | 0.02 (-0.00 to 0.05), 0.071 | 0.02 (0.00 to 0.07), 0.024* | 0.02 (-0.00 to 0.05), 0.065
BDI-II | 0.01 (-0.04 to 0.06), 0.602 | 0.01 (-0.06 to 0.07), 0.787 | 0.04 (-0.02 to 0.09), 0.202
BAI | 0.00 (-0.06 to 0.05), 0.927 | 0.00 (-0.04 to 0.10), 0.391 | 0.02 (-0.04 to 0.08), 0.604
Angry emotion | 6.78 (6.22 to 7.34), <0.001* | 1.25 (0.66 to 1.84), <0.001* | -0.74 (-1.36 to -0.45), 0.005*
Fear emotion | 1.95 (1.39 to 2.50), <0.001* | 4.37 (3.78 to 4.97), <0.001* | 0.15 (-0.47 to 0.77), 0.019*
Sad emotion | 0.37 (-0.19 to 0.93), 0.193 | 3.01 (2.41 to 3.60), <0.001* | 4.90 (4.28 to 5.52), <0.001*

Linear mixed-effects model adjusted for sex.

* statistically significant.

K-ASRS, the Korean version of the Adult Attention Deficit/Hyperactivity Disorder Self-Report Scale; BIS-11, Barratt Impulsiveness Scale-11; AQ, Buss-Perry Aggression Questionnaire; BDI-II, Beck Depression Inventory II; BAI, Beck Anxiety Inventory; CI, confidence interval

Table 3.
Fitted rating scores of the facial emotion expression pictures
Each row lists four pictures side by side (one Neutral, one Angry, one Fearful, and one Sad picture), each followed by its fitted scores for the perceived emotions Angry, Fear, and Sad:
Neutral picture: Angry Fear Sad | Angry picture: Angry Fear Sad | Fearful picture: Angry Fear Sad | Sad picture: Angry Fear Sad
Neutral 11 0.51 0.66 1.08 Angry 13 8.74 2.33 1.33 Fear 18 1.70 6.20 2.58 Sad 2 0.78 4.22 7.49
Neutral 9 0.61 1.23 2.52 Angry 14 8.59 2.65 1.23 Fear 20 2.00 6.86 3.03 Sad 4 0.94 4.38 6.38
Neutral 18 0.64 1.56 2.68 Angry 1 8.59 2.96 1.62 Fear 17 2.00 6.20 2.38 Sad 7 1.00 2.99 5.47
Neutral 3 0.79 0.91 1.72 Angry 11 8.56 2.51 1.07 Fear 19 2.06 6.38 2.22 Sad 10 1.01 3.59 5.77
Neutral 4 0.83 0.69 0.76 Angry 3 8.36 2.38 1.00 Fear 12 2.08 6.77 1.95 Sad 5 1.04 4.16 6.98
Neutral 16 0.94 2.36 2.81 Angry 17 8.30 2.64 1.64 Fear 15 2.32 6.72 2.19 Sad 9 1.06 3.96 6.27
Neutral 17 1.01 1.18 1.91 Angry 15 8.11 2.79 1.15 Fear 16 2.43 6.46 1.96 Sad 6 1.08 4.51 6.98
Neutral 13 1.05 0.99 1.86 Angry 18 8.08 2.67 1.87 Fear 10 2.54 5.58 2.65 Sad 3 1.11 3.84 6.34
Neutral 7 1.07 1.11 2.04 Angry 8 8.06 2.50 1.37 Fear 13 2.54 6.24 1.77 Sad 1 1.11 4.25 8.29
Neutral 2 1.17 2.36 3.83 Angry 19 8.04 2.41 1.18 Fear 11 2.72 5.98 1.85 Sad 8 1.11 3.89 7.00
Neutral 14 1.18 1.18 1.51 Angry 7 7.96 2.65 1.34 Fear 3 3.24 4.34 2.02 Sad 18 1.47 4.37 7.67
Neutral 19 1.21 0.75 1.36 Angry 20 7.94 2.98 1.43 Fear 1 3.41 4.04 1.29 Sad 13 1.52 3.98 7.40
Neutral 20 1.23 0.88 1.50 Angry 4 7.92 2.64 1.09 Fear 14 3.50 6.30 1.68 Sad 14 1.77 4.69 7.95
Neutral 10 1.25 1.76 2.82 Angry 9 7.80 2.37 1.14 Fear 8 3.73 5.64 1.95 Sad 12 1.95 4.49 7.56
Neutral 15 1.34 1.41 2.69 Angry 16 7.78 2.64 1.20 Fear 2 3.76 5.39 2.32 Sad 19 2.01 4.36 7.77
Neutral 6 1.46 1.43 1.69 Angry 12 7.76 2.66 1.21 Fear 4 4.03 5.24 2.30 Sad 16 2.05 4.11 7.47
Neutral 12 1.65 0.98 1.53 Angry 5 7.73 2.51 1.05 Fear 6 4.12 4.09 2.09 Sad 17 2.07 4.63 7.77
Neutral 8 1.65 1.89 2.49 Angry 2 7.45 2.23 1.41 Fear 9 4.39 4.64 2.47 Sad 20 2.25 6.77 2.74
Neutral 5 1.90 1.09 1.26 Angry 10 7.23 2.31 1.21 Fear 5 4.80 4.91 2.19 Sad 15 2.70 4.64 7.52
Neutral 1 2.23 1.84 3.01 Angry 6 6.38 2.38 1.68 Fear 7 5.32 5.70 3.21 Sad 11 3.08 4.57 8.20

Coefficients of the linear mixed-effects models

REFERENCES

1. Alharbi SAH, Button K, Zhang L, O’Shea KJ, Fasolt V, Lee AJ, et al. Are affective factors related to individual differences in facial expression recognition? R Soc Open Sci 2020;7:190699
2. Bistricky SL, Ingram RE, Atchley RA. Facial affect processing and depression susceptibility: cognitive biases and cognitive neuroscience. Psychol Bull 2011;137:998-1028.
3. Passarotti AM, Sweeney JA, Pavuluri MN. Emotion processing influences working memory circuits in pediatric bipolar disorder and attention-deficit/hyperactivity disorder. J Am Acad Child Adolesc Psychiatry 2010;49:1064-1080.
4. Darwin C. The expression of the emotions in man and animals. Chicago: University of Chicago Press; 1965.

5. Izard CE. Emotion theory and research: highlights, unanswered questions, and emerging issues. Annu Rev Psychol 2009;60:1-25.
6. Hedger N, Adams WJ, Garner M. Fearful faces have a sensory advantage in the competition for awareness. J Exp Psychol Hum Percept Perform 2015;41:1748-1757.
7. Lange J, Heerdink MW, van Kleef GA. Reading emotions, reading people: emotion perception and inferences drawn from perceived emotions. Curr Opin Psychol 2022;43:85-90.
8. Calvo MG, Avero P, Fernández-Martín A, Recio G. Recognition thresholds for static and dynamic emotional faces. Emotion 2016;16:1186-1200.
9. McKasy M. A discrete emotion with discrete effects: effects of anger on depth of information processing. Cogn Process 2020;21:555-573.
10. Kühne R, Schemer C. The emotional effects of news frames on information processing and opinion formation. Commun Res 2015;42:387-407.
11. Nabi RL. A cognitive‐functional model for the effects of discrete negative emotions on information processing, attitude change, and recall. Commun Theory 1999;9:292-320.
12. Lazarus RS. Progress on a cognitive-motivational-relational theory of emotion. Am Psychol 1991;46:819-834.
13. Pessoa L, Japee S, Ungerleider LG. Visual awareness and the detection of fearful faces. Emotion 2005;5:243-247.
14. Demenescu LR, Kortekaas R, den Boer JA, Aleman A. Impaired attribution of emotion to facial expressions in anxiety and major depression. PLoS One 2010;5:e15058
15. Acland EL, Jambon M, Malti T. Children’s emotion recognition and aggression: a multi-cohort longitudinal study. Aggress Behav 2021;47:646-658.
16. Dan O. Recognition of emotional facial expressions in adolescents with attention deficit/hyperactivity disorder. J Adolesc 2020;82:1-10.
17. Löytömäki J, Ohtonen P, Laakso ML, Huttunen K. The role of linguistic and cognitive factors in emotion recognition difficulties in children with ASD, ADHD or DLD. Int J Lang Commun Disord 2020;55:231-242.
18. Mancini C, Falciati L, Maioli C, Mirabella G. Happy facial expressions impair inhibitory control with respect to fearful facial expressions but only when task-relevant. Emotion 2022;22:142-152.
19. Xu J, Hao L, Chen M, He Y, Jiang M, Tian T, et al. Developmental sex differences in negative emotion decision-making dynamics: computational evidence and amygdala-prefrontal pathways. Cereb Cortex 2021;Oct 13 [Epub]. https://doi.org/10.1093/cercor/bhab359.
20. Buades-Rotger M, Solbakk AK, Liebrand M, Endestad T, Funderud I, Siegwardt P, et al. Patients with ventromedial prefrontal lesions show an implicit approach bias to angry faces. J Cogn Neurosci 2021;33:1069-1081.
21. Pascalis O, de Martin de Viviés X, Anzures G, Quinn PC, Slater AM, Tanaka JW, et al. Development of face processing. Wiley Interdiscip Rev Cogn Sci 2011;2:666-675.
22. Cohen J. A power primer. Psychol Bull 1992;112:155-159.
23. Sheehan DV, Lecrubier Y, Sheehan KH, Amorim P, Janavs J, Weiller E, et al. The mini-international neuropsychiatric interview (M.I.N.I.): the development and validation of a structured diagnostic psychiatric interview for DSM-IV and ICD-10. J Clin Psychiatry 1998;59 Suppl 20:22-33. quiz 34-57.
24. Yoo SW, Kim YS, Noh JS, Oh KS, Kim CH, NamKoong K, et al. Validity of Korean version of the mini-international neuropsychiatric interview. Anxiety Mood 2006;2:50-55.

25. Wang X, Wang Y, Xin T. The psychometric properties of the Chinese version of the Beck Depression Inventory-II with middle school teachers. Front Psychol 2020;11:548965
26. Lee EH, Lee SJ, Hwang ST, Hong SH, Kim JH. Reliability and validity of the Beck Depression Inventory-II among Korean adolescents. Psychiatry Investig 2017;14:30-36.
27. Lee HK, Lee EH, Hwang ST, Hong SH, Kim JH. Psychometric properties of the Beck Anxiety Inventory in the community-dwelling sample of Korean adults. Kor J Clin Psychol 2016;35:822-830.
28. Patton JH, Stanford MS, Barratt ES. Factor structure of the Barratt impulsiveness scale. J Clin Psychol 1995;51:768-774.
29. Lee SR, Lee WH, Park JS, Kim SM, Kim JW, Shim JH. The study on reliability and validity of Korean version of the Barratt impulsiveness scale-11-revised in nonclinical adult subjects. J Korean Neuropsychiatr Assoc 2012;51:378-386.
30. Kessler RC, Adler L, Ames M, Demler O, Faraone S, Hiripi E, et al. The World Health Organization adult ADHD self-report scale (ASRS): a short screening scale for use in the general population. Psychol Med 2005;35:245-256.
31. Kim JH, Lee EH, Joung YS. The WHO adult ADHD self-report scale: reliability and validity of the Korean version. Psychiatry Investig 2013;10:41-46.
32. Buss AH, Perry M. The aggression questionnaire. J Pers Soc Psychol 1992;63:452-459.
33. Kim E, Yim HW, Jeong H, Jo SJ, Lee HK, Son HJ, et al. The association between aggression and risk of Internet gaming disorder in Korean adolescents: the mediation effect of father-adolescent communication style. Epidemiol Health 2018;40:e2018039
34. Lee SB, Koo SJ, Song YY, Lee MK, Jeong YJ, Kwon C, et al. Theory of mind as a mediator of reasoning and facial emotion recognition: findings from 200 healthy people. Psychiatry Investig 2014;11:105-111.
35. Lee KU, Kim J, Yeon B, Kim SH, Chae JH. Development and standardization of extended ChaeLee Korean facial expressions of emotions. Psychiatry Investig 2013;10:155-163.
36. Mei G, Li Y, Chen S, Cen M, Bao M. Lower recognition thresholds for sad facial expressions in subthreshold depression: a longitudinal study. Psychiatry Res 2020;294:113499
37. Marsh PJ, Williams LM. ADHD and schizophrenia phenomenology: visual scanpaths to emotional faces as a potential psychophysiological marker? Neurosci Biobehav Rev 2006;30:651-665.
38. Pelc K, Kornreich C, Foisy ML, Dan B. Recognition of emotional facial expressions in attention-deficit hyperactivity disorder. Pediatr Neurol 2006;35:93-97.
39. Liu P, Han S, Meng Z, Tong Y. Facial expression recognition via a boosted deep belief network. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition; 2014 Jun 23-28; Columbus, OH, USA. IEEE. 2014;1805-1812.
40. Vuilleumier P. How brains beware: neural mechanisms of emotional attention. Trends Cogn Sci 2005;9:585-594.
41. Stein T, Verosky SC. No effect of value learning on awareness and attention for faces: evidence from continuous flash suppression and the attentional blink. J Exp Psychol Hum Percept Perform 2021;47:1043-1055.
42. Johnson MH. Face processing as a brain adaptation at multiple timescales. Q J Exp Psychol (Hove) 2011;64:1873-1888.
43. Bertini C, Làdavas E. Fear-related signals are prioritised in visual, somatosensory and spatial systems. Neuropsychologia 2021;150:107698
44. Borhani K, Nejati V. Emotional face recognition in individuals with attention-deficit/hyperactivity disorder: a review article. Dev Neuropsychol 2018;43:256-277.
45. Sinzig J, Morsch D, Lehmkuhl G. Do hyperactivity, impulsivity and inattention have an impact on the ability of facial affect recognition in children with autism and ADHD? Eur Child Adolesc Psychiatry 2008;17:63-72.
46. van Stralen J. Emotional dysregulation in children with attention-deficit/ hyperactivity disorder. Atten Defic Hyperact Disord 2016;8:175-187.
47. Bisch J, Kreifelts B, Bretscher J, Wildgruber D, Fallgatter A, Ethofer T. Emotion perception in adult attention-deficit hyperactivity disorder. J Neural Transm (Vienna) 2016;123:961-970.
48. Petroni A, Canales-Johnson A, Urquina H, Guex R, Hurtado E, Blenkmann A, et al. The cortical processing of facial emotional expression is associated with social cognition skills and executive functioning: a preliminary study. Neurosci Lett 2011;505:41-46.
49. Shioiri T, Someya T, Helmeste D, Tang SW. Misinterpretation of facial expression: a cross-cultural study. Psychiatry Clin Neurosci 1999;53:45-50.

