
Impact Analysis of Nonverbal Multimodal Cues for Recognizing Emotions Expressed by Virtual Humans

  • Jin-Ok Kim (Division of Mobile Contents, College of International Culture and Information, Daegu Haany University)
  • Received : 2012.07.25
  • Accepted : 2012.09
  • Published : 2012.10.31

Abstract

Virtual humans used as HCI agents in digital contents express various emotions through modalities such as facial expression and body posture. However, few studies have considered how combinations of such nonverbal modalities affect emotion perception. To implement an emotionally expressive virtual human, a computational engine model must account for how users perceive combined nonverbal modalities such as facial expression and body posture. This paper therefore analyzes the impact of nonverbal multimodal cues on the design of emotionally expressive virtual humans. First, the relative impact of the different modalities is analyzed by evaluating emotion recognition for each modality of the virtual human. An experiment then evaluates the contribution of congruent facial and postural expressions to the recognition of basic emotion categories, as well as of the valence and activation dimensions. Finally, the effect of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life, is measured. The results show that congruence between the facial and postural expressions of a virtual human facilitates the perception of emotion categories, that categorical recognition is driven mainly by the facial modality, and that the postural modality is preferred for judging the level of the activation dimension. These results can be applied to the implementation of animation engine systems and behavior synchronization for emotionally expressive virtual humans.
