Jonghwa Kim
Cheju Halla University, Korea / University of Augsburg, Germany
Emotion recognition based on physiological changes in music listening
J Kim, E André
IEEE transactions on pattern analysis and machine intelligence 30 (12), 2067 …, 2008
From physiological signals to emotions: Implementing and comparing selected methods for feature extraction and classification
J Wagner, J Kim, E André
2005 IEEE international conference on multimedia and expo, 940-943, 2005
EMG-based hand gesture recognition for realtime biosignal interfacing
J Kim, S Mastnik, E André
Proceedings of the 13th international conference on Intelligent user …, 2008
Bimodal emotion recognition using speech and physiological changes
J Kim
Robust speech recognition and understanding, 265-280, 2007
Exploring fusion methods for multimodal emotion recognition with missing data
J Wagner, E Andre, F Lingenfelser, J Kim
IEEE Transactions on Affective Computing 2 (4), 206-218, 2011
A brain–computer interface method combined with eye tracking for 3D interaction
EC Lee, JC Woo, JH Kim, M Whang, KR Park
Journal of neuroscience methods 190 (2), 289-298, 2010
Position estimation using linear hall sensors for permanent magnet linear motor systems
J Kim, S Choi, K Cho, K Nam
IEEE Transactions on Industrial Electronics 63 (12), 7644-7652, 2016
Integrating information from speech and physiological signals to achieve emotional sensitivity
J Kim, E André, M Rehm, T Vogt, J Wagner
Ninth european conference on speech communication and technology, 2005
Multimodal emotion classification in naturalistic user behavior
S Walter, S Scherer, M Schels, M Glodek, D Hrabal, M Schmidt, R Böck, ...
Human-Computer Interaction. Towards Mobile and Intelligent Interaction …, 2011
The SenseEmotion database: A multimodal database for the development and systematic validation of an automatic pain- and emotion-recognition system
M Velana, S Gruss, G Layher, P Thiam, Y Zhang, D Schork, V Kessler, ...
Multimodal Pattern Recognition of Social Signals in Human-Computer …, 2017
Physiological signals and their use in augmenting emotion recognition for human–machine interaction
RB Knapp, J Kim, E André
Emotion-oriented systems: The Humaine handbook, 133-159, 2010
Emotion recognition using physiological and speech signal in short-term observation
J Kim, E André
International Tutorial and Research Workshop on Perception and Interactive …, 2006
Multi-Modal Pain Intensity Recognition Based on the SenseEmotion Database
P Thiam, V Kessler, M Amirian, P Bellmann, G Layher, Y Zhang, M Velana, ...
IEEE Transactions on Affective Computing 12 (3), 743-760, 2019
Bi-channel sensor fusion for automatic sign language recognition
J Kim, J Wagner, M Rehm, E André
2008 8th IEEE International Conference on Automatic Face & Gesture …, 2008
Transsituational individual-specific biopsychological classification of emotions
S Walter, J Kim, D Hrabal, SC Crawcour, H Kessler, HC Traue
IEEE Transactions on Systems, Man, and Cybernetics: Systems 43 (4), 988-995, 2013
Emote to win: Affective interactions with a computer game agent
J Kim, N Bee, J Wagner, E André
Informatik 2004–Informatik verbindet–Band 1, Beiträge der 34. Jahrestagung …, 2004
Automatic path planning of industrial robots comparing sampling-based and computational intelligence methods
L Larsen, J Kim, M Kupke, A Schuster
Procedia Manufacturing 11, 241-248, 2017
Multimodal emotion recognition from low-level cues
M Pantic, G Caridakis, E André, J Kim, K Karpouzis, S Kollias
Emotion-oriented systems: The humaine handbook, 115-132, 2011
Stress Recognition in Daily Work
Y Nakashima, J Kim, S Flutura, A Seiderer, E André
CCIS: Communications in Computer and Information Science 604, 23-33, 2016
Age and gender classification from speech using decision level fusion and ensemble based techniques
F Lingenfelser, J Wagner, T Vogt, J Kim, E André
Eleventh Annual Conference of the International Speech Communication Association, 2010