Computer Science
Thumb
Human-Computer Interaction
Artificial Intelligence
Virtual Reality
Medicine
Anatomy
Authors
Kenrick Kin, Chengde Wan, K.M. Koh, Andrei Marin, Necati Cihan Camgöz, Yubo Zhang, Yujun Cai, Fedor Kovalev, Moshe Ben-Zacharia, Shannon Hoople, Marcos Nunes-Ueno, Mariel Sanchez-Rodriguez, Ayush Bhargava, Robert Wang, Eric L. Sauser, Shugao Ma
Identifier
DOI:10.1145/3613904.3642702
Abstract
AR/VR devices have started to adopt hand tracking, in lieu of controllers, to support user interaction. However, today's hand input relies primarily on one gesture: pinch. Moreover, current mappings of hand motion to use cases like VR locomotion and content scrolling involve more complex and larger arm motions than joystick or trackpad usage. STMG increases the gesture space by recognizing additional small thumb-based microgestures from skeletal tracking running on a headset. We take a machine learning approach and achieve a 95.1% recognition accuracy across seven thumb gestures performed on the index finger surface: four directional thumb swipes (left, right, forward, backward), thumb tap, and fingertip pinch start and pinch end. We detail the components of our machine learning pipeline and highlight our design decisions and lessons learned in producing a well-generalized model. We then demonstrate how these microgestures simplify and reduce arm motions for hand-based locomotion and scrolling interactions.
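To make the task described in the abstract concrete, the sketch below shows one plausible way to frame seven-class thumb-microgesture recognition over windows of skeletal hand-tracking frames. It is a minimal illustration, not the STMG pipeline: the model architecture, joint count, window length, and all names (MicrogestureClassifier, GESTURES, the added "no_gesture" background class) are assumptions for demonstration only.

```python
# Minimal sketch (assumption, not the STMG pipeline): a recurrent classifier over
# fixed-length windows of skeletal hand-tracking keypoints that predicts one of the
# seven thumb microgestures named in the abstract, plus a background class.
import torch
import torch.nn as nn

GESTURES = [
    "swipe_left", "swipe_right", "swipe_forward", "swipe_backward",
    "thumb_tap", "pinch_start", "pinch_end", "no_gesture",
]

class MicrogestureClassifier(nn.Module):
    def __init__(self, num_joints=21, hidden=128, num_classes=len(GESTURES)):
        super().__init__()
        # Each frame is flattened to 3D positions of every tracked hand joint.
        self.encoder = nn.GRU(input_size=num_joints * 3,
                              hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, x):
        # x: (batch, frames, num_joints * 3) window of skeletal frames
        _, h = self.encoder(x)
        return self.head(h[-1])  # per-window class logits

# Usage: classify a 30-frame window of 21-joint hand poses (placeholder data).
model = MicrogestureClassifier()
window = torch.randn(1, 30, 21 * 3)
pred = model(window).argmax(dim=-1)
print(GESTURES[pred.item()])
```

In practice such a recognizer would run continuously on a stream of tracked frames, so a background "no_gesture" class (or a separate detection stage) is needed to decide when any microgesture is occurring at all; the paper's reported 95.1% accuracy refers to its own pipeline, not this sketch.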