Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback.

INTERACT (1), 2023

Abstract
Smartphones are used in different contexts, including scenarios where visual and auditory modalities are limited (e.g., walking or driving). In this context, we introduce a new interaction concept, called Hap2Gest, that can give commands and retrieve information, both eyes-free. First, it uses a gesture as input for command invocation, and then output information is retrieved using haptic feedback perceived through an output gesture drawn by the user. We conducted an elicitation study with 12 participants to determine users’ preferences for the aforementioned gestures and the vibration patterns for 25 referents. Our findings indicate that users tend to use the same gesture for input and output, and there is a clear relationship between the type of gestures and vibration patterns users suggest and the type of output information. We show that the gesture’s speed profile agreement rate is significantly higher than the gesture’s shape agreement rate, and it can be used by the recognizer when the gesture shape agreement rate is low. Finally, we present a complete set of user-defined gestures and vibration patterns and address the gesture recognition problem.
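The abstract compares agreement rates between gesture shapes and gesture speed profiles. As a point of reference, elicitation studies commonly quantify consensus with the agreement-rate formula of Vatavu and Wobbrock; the sketch below assumes that standard formulation (the paper itself may use a variant):

```python
from collections import Counter

def agreement_rate(proposals):
    """Agreement rate AR(r) for one referent: the probability that two
    distinct participants proposed the same gesture. Follows the
    Vatavu & Wobbrock (2015) formulation (an assumption here)."""
    n = len(proposals)
    if n < 2:
        return 0.0  # AR is undefined for fewer than two proposals
    counts = Counter(proposals)  # group identical proposals
    return sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
```

For example, if 8 of 12 participants propose a circle and 4 propose a swipe for the same referent, AR = (8·7 + 4·3) / (12·11) = 68/132 ≈ 0.52. A finding that speed-profile agreement exceeds shape agreement would mean this value, computed over speed-profile categories, is higher than the value computed over shape categories.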
Keywords
gestures,smartphones,interaction,eyes-free