HandSee: Enabling Full Hand Interaction on Smartphone with Front Camera-based Stereo Vision
Chun Yu, Xiaoying Wei, Shubh Vachher, Yue Qin, Chen Liang, Yueting Weng, Yizheng Gu, Yuanchun Shi
CHI '19: ACM CHI Conference on Human Factors in Computing Systems
Session: Gesture Sensing
Abstract
We present HandSee, a novel sensing technique that captures the state and movement of the user's hands touching or gripping a smartphone. We place a right-angle prism mirror on the front camera to achieve stereo vision of the scene above the touchscreen surface. We develop a three-component pipeline that extracts a depth image of the hands from the single RGB frame captured through the prism: a stereo matching algorithm that estimates the pixel-wise depth of the scene, a CNN-based online calibration algorithm that detects hand skin, and a merging algorithm that outputs the depth image of the hands. Building on this output, a substantial set of valuable interaction information, such as the fingers' 3D locations, gripping posture, and finger identity, can be recognized concurrently. With this unique sensing ability, HandSee enables a variety of novel interaction techniques and expands the design space for full hand interaction on smartphones.
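To make the stereo-depth step concrete, the sketch below assumes the prism places two views of the scene side by side in one camera frame (the actual HandSee optics and calibration differ and are not specified here). It splits the frame, recovers per-pixel disparity with naive block matching (sum of absolute differences), and converts disparity to metric depth via z = f·b/d. All function names, the focal length, and the baseline are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def split_stereo(frame):
    # Assumption for illustration: the prism yields two side-by-side
    # views in a single frame; split it into left/right halves.
    h, w = frame.shape[:2]
    return frame[:, : w // 2], frame[:, w // 2 :]

def disparity_sad(left, right, max_disp=8, block=3):
    # Naive block-matching disparity: for each left pixel, find the
    # horizontal shift d that minimizes sum-of-absolute-differences
    # between a small patch in the left view and the shifted patch
    # in the right view. Real systems use more robust matchers.
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            patch = left[y - half : y + half + 1,
                         x - half : x + half + 1].astype(np.float32)
            best_cost, best_d = np.inf, 0
            for d in range(max_disp):
                cand = right[y - half : y + half + 1,
                             x - d - half : x - d + half + 1].astype(np.float32)
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

def depth_from_disparity(disp, focal_px=500.0, baseline_mm=10.0):
    # Pinhole stereo relation z = f * b / d; zero disparity (no match
    # or point at infinity) is returned as 0 here. Focal length and
    # baseline are placeholder values.
    with np.errstate(divide="ignore"):
        return np.where(disp > 0, focal_px * baseline_mm / disp, 0.0)
```

A skin mask (the paper's second pipeline stage) would then be applied to the depth map so that only hand pixels survive, yielding the hand depth image the interaction techniques build on.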
DOI: [ Link ]
WEB: [ Link ]