Published December 13, 2021 | Version v1
Publication | Open Access

Global-local attention for emotion recognition

  • 1. Ho Chi Minh City University of Science
  • 2. Vietnam National University Ho Chi Minh City

Description

Human emotion recognition is an active research area in artificial intelligence and has made substantial progress over the past few years. Many recent works focus mainly on facial regions to infer human affect, while the surrounding context information is not effectively utilized. In this paper, we propose a new deep network that recognizes human emotions using a novel global-local attention mechanism. Our network is designed to extract features from the facial and context regions independently, then learn them jointly through the attention module. In this way, both facial and contextual information are used to infer human emotions, enhancing the discriminative power of the classifier. Extensive experiments show that our method surpasses current state-of-the-art methods on recent emotion datasets by a fair margin. Qualitatively, our global-local attention module extracts more meaningful attention maps than previous methods. The source code and trained model of our network are available at https://github.com/minhnhatvt/glamor-net.
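
For readers who want a concrete picture of the fusion step described above, the sketch below illustrates the two-branch design with attention-based fusion in PyTorch. It is a minimal illustration, not the authors' released implementation (see the repository linked above for that); the tiny encoders, feature sizes, module names, and the 7-class output are assumptions made for the example.

```python
# Minimal sketch of global-local attention fusion. Assumptions throughout:
# encoder depth, feature sizes, and class count are illustrative, not the
# paper's exact GLAMOR-Net configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def tiny_encoder(out_channels: int) -> nn.Sequential:
    # Stand-in CNN backbone; a real model would use a deeper network.
    return nn.Sequential(
        nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(64, out_channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
    )


class GlobalLocalAttention(nn.Module):
    """Score each spatial location of the context (global) feature map,
    conditioned on the facial (local) feature vector, then pool the
    context map with the resulting softmax weights."""

    def __init__(self, face_dim: int, ctx_channels: int, hidden: int = 128):
        super().__init__()
        self.score = nn.Sequential(
            nn.Linear(face_dim + ctx_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 1),
        )

    def forward(self, face_feat, ctx_map):
        b, c, h, w = ctx_map.shape
        ctx = ctx_map.flatten(2).transpose(1, 2)             # (B, H*W, C)
        face = face_feat.unsqueeze(1).expand(-1, h * w, -1)  # (B, H*W, Df)
        scores = self.score(torch.cat([ctx, face], dim=-1))  # (B, H*W, 1)
        attn = F.softmax(scores, dim=1)                      # attention over locations
        pooled = (attn * ctx).sum(dim=1)                     # (B, C) attended context
        return pooled, attn.view(b, 1, h, w)                 # map kept for visualization


class TwoBranchEmotionNet(nn.Module):
    """Face and context encoded independently, fused by attention,
    classified jointly (7 emotion classes assumed for illustration)."""

    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.face_enc = tiny_encoder(128)
        self.ctx_enc = tiny_encoder(128)
        self.attention = GlobalLocalAttention(face_dim=128, ctx_channels=128)
        self.classifier = nn.Linear(128 + 128, num_classes)

    def forward(self, face_img, ctx_img):
        face_feat = self.face_enc(face_img).mean(dim=(2, 3))  # global average pool
        ctx_map = self.ctx_enc(ctx_img)                       # (B, 128, H, W)
        ctx_feat, attn_map = self.attention(face_feat, ctx_map)
        logits = self.classifier(torch.cat([face_feat, ctx_feat], dim=-1))
        return logits, attn_map


# Smoke test: a 96x96 face crop and its 224x224 context frame.
model = TwoBranchEmotionNet()
logits, attn = model(torch.randn(2, 3, 96, 96), torch.randn(2, 3, 224, 224))
print(logits.shape, attn.shape)  # torch.Size([2, 7]) torch.Size([2, 1, 28, 28])
```

The design choice worth noting is that the attention scores are conditioned on the face feature, so the network learns which context locations matter for that particular face; the returned attention map can be visualized directly, which is how qualitative attention-map comparisons like the ones mentioned in the abstract are typically produced.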

Files

s00521-021-06778-x.pdf (2.2 MB)
md5:60a89b29d2a094d79fea3b37ceff54df

Additional details

Additional titles

Translated title (Arabic): الاهتمام العالمي المحلي بالتعرف على المشاعر

Identifiers

Other: https://openalex.org/W4205497094
DOI: 10.1007/s00521-021-06778-x

GreSIS Basics Section

Is Global South Knowledge: Yes
Country: Vietnam
