Swipe
Touchscreen
Computer science
Gaze
Computer vision
Gesture
Reachability
Artificial intelligence
Eye movement
Calibration
Mobile device
Mobile interaction
BitTorrent tracker
Human–computer interaction
Theoretical computer science
Computer networks
Statistics
Mathematics
Operating systems
Authors
Zhuojiang Cai, Jingkai Hong, Zhimin Wang, Feng Lu
Identifier
DOI:10.1145/3706598.3713739
Abstract
Smartphones with large screens provide users with increased display and interaction space but pose challenges in reaching certain areas with the thumb when using the device with one hand. To address this, we introduce GazeSwipe, a multimodal interaction technique that combines eye gaze with finger-swipe gestures, enabling intuitive and low-friction reach on mobile touchscreens. Specifically, we design a gaze estimation method that eliminates the need for explicit gaze calibration. Our approach also avoids the use of additional eye-tracking hardware by leveraging the smartphone's built-in front-facing camera. Considering the potential decrease in gaze accuracy without dedicated eye trackers, we use finger-swipe gestures to compensate for any inaccuracies in gaze estimation. Additionally, we introduce a user-unaware auto-calibration method that improves gaze accuracy during interaction. Through extensive experiments on smartphones and tablets, we compare our technique with various methods for touchscreen reachability and evaluate the performance of our auto-calibration strategy. The results demonstrate that our method achieves high success rates and is preferred by users. The findings also validate the effectiveness of the auto-calibration strategy.
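The abstract describes two mechanisms: a coarse gaze estimate refined by a finger-swipe offset, and a user-unaware auto-calibration that improves gaze accuracy during interaction. The paper's actual algorithms are not reproduced here; the sketch below only illustrates the two ideas under simple assumptions (a constant swipe-to-screen gain, and auto-calibration as a mean-offset correction learned from silently collected gaze/tap pairs). All names and parameters are illustrative, not from the paper.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Point:
    x: float
    y: float

def fuse_gaze_and_swipe(gaze: Point, dx: float, dy: float,
                        gain: float = 1.0) -> Point:
    """Gaze gives the coarse target; the swipe offset refines it.

    `gain` maps swipe distance in the thumb-reachable zone to
    screen distance (hypothetical constant mapping).
    """
    return Point(gaze.x + gain * dx, gaze.y + gain * dy)

def bias_correction(samples: List[Tuple[Point, Point]]) -> Tuple[float, float]:
    """One possible 'user-unaware' auto-calibration: pair each raw gaze
    estimate with the tap the user eventually confirmed, collected
    silently during normal use, and return the mean offset."""
    n = len(samples)
    bx = sum(tap.x - gaze.x for gaze, tap in samples) / n
    by = sum(tap.y - gaze.y for gaze, tap in samples) / n
    return bx, by

# Coarse gaze near the top-left corner, nudged by a short swipe.
target = fuse_gaze_and_swipe(Point(120.0, 80.0), 15.0, -10.0, gain=2.0)
# target == Point(x=150.0, y=60.0)

# Correction learned from two silently collected gaze/tap pairs.
bx, by = bias_correction([(Point(100.0, 100.0), Point(110.0, 96.0)),
                          (Point(200.0, 50.0), Point(210.0, 46.0))])
# (bx, by) == (10.0, -4.0); later estimates shift by this offset.
```

In this toy model, applying the learned `(bx, by)` to future raw gaze estimates plays the role of the calibration update; the real system presumably fits a richer correction than a constant offset.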