Recognition of Emotions Conveyed by Touch Through Force-Sensitive Screens: Observational Study of Humans and Machine Learning Techniques

Authors:

Alicia Heraz, Manfred Clynes

Abstract

Background: Emotions affect our mental health: they influence our perception, alter our physical strength, and interfere with our reason. Emotions modulate our face, voice, and movements. When emotions are expressed through the voice or face, they are difficult to measure because, in real life, cameras and microphones are rarely used under the laboratory conditions in which emotion detection algorithms perform well. Given the increasing use of smartphones, the fact that we touch our phones, on average, thousands of times a day, and the fact that emotions modulate our movements, we have an opportunity to explore emotional patterns in passive expressive touches and detect emotions, enabling us to empower smartphone apps with emotional intelligence.

Objective: In this study, we asked 2 questions. (1) Given that emotions modulate our finger movements, can humans recognize emotions by looking only at passive expressive touches? (2) Can we teach machines to accurately recognize emotions from passive expressive touches?

Methods: We were interested in 8 emotions: anger, awe, desire, fear, hate, grief, laughter, and love (plus no emotion). We conducted 2 experiments with 2 groups of participants: good imagers who were also emotionally aware formed group A, and the remainder formed group B. In the first experiment, we video recorded, for a few seconds each, the expressive touches of group A, and we asked group B to guess the emotion behind each expressive touch. In the second experiment, we trained group A to express each emotion on a force-sensitive smartphone. We then collected hundreds of thousands of their touches and applied feature selection and machine learning techniques to detect emotions from the coordinates of participants' finger touches, the amount of force, and the skin contact area, all as functions of time.

Results: We recruited 117 volunteers: 15 were good imagers and emotionally aware (group A), and the other 102 formed group B. In the first experiment, group B recognized all emotions (and no emotion) with a high accuracy of 83.8% (769/918): 49.0% (50/102) of them were 100% (450/450) correct, and 25.5% (26/102) were 77.8% (182/234) correct. In the second experiment, we achieved a high classification accuracy of 91.11% (2110/2316) in detecting all emotions (and no emotion) from 9 spatiotemporal features of group A's touches.

Conclusions: Emotions modulate our touches on force-sensitive screens, and humans have a natural ability to recognize other people's emotions by watching prerecorded videos of their expressive touches. Machines can learn the same emotion recognition ability and can outperform humans if they are allowed to continue learning on new data. It is possible to enable force-sensitive screens to recognize users' emotions and share this emotional insight with them, increasing users' emotional awareness and allowing researchers to design better technologies for well-being.
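
The machine learning step in the Methods can be made concrete with a short sketch. The following Python example is a hypothetical reconstruction, not the authors' published code: the synthetic data generator, the summary features (force, contact area, speed, extent, and duration statistics), and the choice of univariate feature selection followed by a random forest are all assumptions. They merely mirror the two-step "feature selection and machine learning" approach and the 9-feature result described in the abstract; the paper's actual features and classifier may differ.

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# The 8 emotions studied, plus "no emotion".
EMOTIONS = ["anger", "awe", "desire", "fear", "hate",
            "grief", "laughter", "love", "no_emotion"]

def summarize_touch(trace):
    """Reduce one touch trace (x, y, force, area over time) to a fixed-length
    vector of candidate spatiotemporal features."""
    x, y = np.asarray(trace["x"]), np.asarray(trace["y"])
    force, area = np.asarray(trace["force"]), np.asarray(trace["area"])
    dt = np.maximum(np.diff(trace["t"]), 1e-6)      # guard against zero time gaps
    speed = np.hypot(np.diff(x), np.diff(y)) / dt   # finger speed per sample
    return np.array([
        force.mean(), force.max(), force.std(),     # force statistics
        area.mean(), area.max(), area.std(),        # skin contact area statistics
        speed.mean(), speed.max(),                  # movement statistics
        np.ptp(x), np.ptp(y),                       # spatial extent of the touch
        trace["t"][-1] - trace["t"][0],             # touch duration (s)
        float(len(x)),                              # number of samples
    ])

rng = np.random.default_rng(0)

def synthetic_trace(n=60):
    """Stand-in for a real recorded touch: timestamps, finger position,
    force, and contact area sampled over a few hundred milliseconds."""
    return {
        "t": np.cumsum(rng.uniform(0.005, 0.02, n)),
        "x": np.cumsum(rng.normal(0.0, 2.0, n)),
        "y": np.cumsum(rng.normal(0.0, 2.0, n)),
        "force": rng.uniform(0.1, 1.0, n),
        "area": rng.uniform(5.0, 40.0, n),
    }

# Synthetic placeholder data; real use would load labeled touch recordings.
traces = [synthetic_trace() for _ in range(900)]
labels = np.array([EMOTIONS[i % len(EMOTIONS)] for i in range(900)])
X = np.stack([summarize_touch(tr) for tr in traces])

X_train, X_test, y_train, y_test = train_test_split(
    X, labels, stratify=labels, random_state=0)

# Feature selection followed by a classifier, mirroring the two-step
# approach mentioned in the abstract: keep 9 of the 12 candidate features.
model = Pipeline([
    ("select", SelectKBest(f_classif, k=9)),
    ("clf", RandomForestClassifier(n_estimators=200, random_state=0)),
])
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.3f}")

With the synthetic traces above, accuracy stays near chance by construction; replacing them with real labeled touch recordings would let model.score estimate a held-out classification accuracy of the kind the study reports (91.11%).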

Publisher

JMIR Publications Inc.

Subject

Psychiatry and Mental Health

Cited by 15 articles.
