Affect Detection from Text-Based Virtual Improvisation and Emotional Gesture Recognition
We have previously developed an intelligent agent that engages users in virtual drama improvisation. The agent performs sentence-level affect detection from user inputs containing strong emotional indicators. However, we noticed that many inputs with weak or no affect indicators a...
| Main Authors: | Li Zhang, Bryan Yap |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2012-01-01 |
| Series: | Advances in Human-Computer Interaction |
| Online Access: | http://dx.doi.org/10.1155/2012/461247 |
Similar Items
- An Improved Gesture Segmentation Method for Gesture Recognition Based on CNN and YCbCr
  by: Yan Luo, et al.
  Published: (2021-01-01)
- Human Motion Gesture Recognition Based on Computer Vision
  by: Rui Ma, et al.
  Published: (2021-01-01)
- Gesture recognition approach based on learning sparse representation
  by: Ling XIAO, et al.
  Published: (2013-06-01)
- An environment adaptive gesture recognition system based on visible light
  by: Zhu WANG, et al.
  Published: (2023-06-01)
- Female athletes explicitly gesture in emotional situations
  by: Y. Adams, et al.
  Published: (2025-01-01)