Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking
Overview

Handy AR is a vision-based user interface that tracks a user's outstretched hand and uses it as the reference pattern for augmented reality (AR) inspection, providing 6-DOF camera pose estimation from the tracked fingertip configuration. A hand pose model is constructed in a one-time calibration step by measuring the fingertip positions relative to one another in the presence of ground-truth scale information. By reconstructing the camera pose relative to the hand frame by frame, 3D graphical annotations can be stabilized on top of the hand, allowing the user to conveniently inspect such virtual objects from different viewing angles in AR.
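The one-time calibration can be illustrated with a minimal sketch. Assuming the flat hand is photographed next to a reference object of known size, a pixels-per-millimeter factor (`px_per_mm`, a hypothetical parameter, not necessarily how the system derives scale) converts measured fingertip pixel positions into a metric hand model anchored at the thumb:

```python
def calibrate_hand_model(fingertip_px, px_per_mm):
    """One-time calibration sketch: convert measured fingertip pixel
    positions into a metric hand model expressed relative to the thumb.

    fingertip_px: five (x, y) pixel positions, thumb first.
    px_per_mm:    scale factor from an object of known size in the image.
    """
    tx, ty = fingertip_px[0]  # thumb tip serves as the model origin
    return [((x - tx) / px_per_mm, (y - ty) / px_per_mm)
            for x, y in fingertip_px]
```

Once stored, this model stays fixed: at runtime only the camera pose relative to it changes.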
Fingertip Detection

Fingertips are detected using a curvature-based algorithm on the contour of the user's hand. Contour points with high curvature values are selected as candidate fingertip points, and an ellipse is then fitted to each candidate to locate the fingertip accurately. Five fingertips are detected and ordered based on the position of the thumb, so that the fingertips can serve as point correspondences for the camera pose estimation algorithm.

Camera Pose Estimation
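The point correspondences that drive the pose estimation come from the curvature test on the hand contour described above. One common formulation is k-curvature: the angle at a contour point between the vectors to the points k steps back and k steps ahead, with sharp features giving small angles. A sketch on synthetic data (the values of k and the angle threshold are illustrative, not the paper's):

```python
import math

def k_curvature(contour, i, k):
    """Angle at contour[i] between the vectors to the points k steps
    back and k steps ahead; sharp features give small angles."""
    n = len(contour)
    px, py = contour[i]
    ax, ay = contour[(i - k) % n]
    bx, by = contour[(i + k) % n]
    v1, v2 = (ax - px, ay - py), (bx - px, by - py)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    denom = math.hypot(*v1) * math.hypot(*v2)
    return math.acos(max(-1.0, min(1.0, dot / denom)))

def fingertip_candidates(contour, k=2, angle_thresh=math.radians(60)):
    """Indices of contour points whose k-curvature angle is sharp enough."""
    return [i for i in range(len(contour))
            if k_curvature(contour, i, k) < angle_thresh]

# Synthetic demo contour: a square with one thin "finger" on top.
# Only the finger's tip should pass the curvature test.
contour = ([(x, 0) for x in range(11)] +          # bottom edge
           [(10, y) for y in range(1, 11)] +      # right edge
           [(x, 10) for x in (9, 8, 7)] +         # top edge up to the finger
           [(7.0, 11), (7.0, 12), (7.0, 13)] +    # finger, going up (tip last)
           [(6.8, 12), (6.8, 11), (6.8, 10)] +    # finger, coming down
           [(x, 10) for x in range(6, -1, -1)] +  # rest of top edge
           [(0, y) for y in range(9, 0, -1)])     # left edge

candidates = fingertip_candidates(contour)
```

In the real system each surviving candidate would then be refined by the ellipse fit before being used as a correspondence.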
As long as the five fingertips of the fixed hand posture are tracked successfully, the pose estimation method has enough point correspondences to estimate the extrinsic camera parameters, yielding a 6-DOF camera pose relative to the hand. To inspect an AR object on top of the hand from different viewing angles, the user may simply rotate or move the hand arbitrarily.

Interaction
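Interaction depends on the 6-DOF pose recovered in the previous section. As an illustration of one way such a pose can be obtained from five correspondences (not necessarily the paper's exact method), the sketch below makes the simplifying assumption that the fingertip model is planar (all model points at Z = 0) and that the camera intrinsics K are known, recovering R and t by decomposing a DLT homography. All demo values are synthetic:

```python
import numpy as np

def homography_dlt(obj_xy, img_xy):
    """Direct linear transform: 3x3 homography mapping planar (X, Y)
    object coordinates to (u, v) pixel coordinates."""
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_xy):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)

def pose_from_homography(H, K):
    """Recover R, t from H ~ K [r1 r2 t] for points on the Z = 0 plane."""
    M = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(M[:, 0])
    if s * M[2, 2] < 0:          # keep the hand in front of the camera
        s = -s
    r1, r2, t = s * M[:, 0], s * M[:, 1], s * M[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)  # re-orthonormalize the rotation
    return U @ Vt, t

# Synthetic demo: five planar "fingertip" model points, a known pose,
# and the pose recovered from their exact projections.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
model = [(0.0, 0.0), (2.0, 8.0), (5.0, 9.5), (8.0, 8.0), (10.0, 3.0)]
th = 0.2                         # rotation about the camera y axis
R_true = np.array([[np.cos(th), 0.0, np.sin(th)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(th), 0.0, np.cos(th)]])
t_true = np.array([1.0, -2.0, 30.0])
img = []
for X, Y in model:
    p = K @ (R_true @ np.array([X, Y, 0.0]) + t_true)
    img.append((p[0] / p[2], p[1] / p[2]))
R_est, t_est = pose_from_homography(homography_dlt(model, img), K)
```

A general (non-planar) fingertip model would call for a full PnP solver instead, but the planar case already shows how five tracked points pin down all six degrees of freedom.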
Handy AR can be used to interact with AR objects, such as world-stabilized objects registered with a marker-based AR library like ARTag. The user's hand provides an effective way to select an AR object and then inspect it. While the world coordinate system is defined by the markers' transformation matrices, the local coordinate system of the camera pose estimated from the fingertips is used for inspecting the virtual object on top of the hand.
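The hand-over between the two coordinate systems amounts to composing rigid transforms. In the sketch below, `T_cam_world` stands for the marker-derived world-to-camera pose and `T_cam_hand` for the fingertip-derived hand-to-camera pose; both matrices and the test point are illustrative values, not the system's data:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 rigid transform from a rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(a):
    """Rotation by angle a (radians) about the z axis."""
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

# Illustrative poses: world -> camera (from markers) and
# hand -> camera (from fingertips).
T_cam_world = make_T(rot_z(0.3), [0.0, 0.0, 50.0])
T_cam_hand = make_T(rot_z(-0.1), [5.0, 2.0, 30.0])

# The hand's pose expressed in world coordinates: hand -> camera -> world.
T_world_hand = np.linalg.inv(T_cam_world) @ T_cam_hand

# A point riding on the hand lands at the same camera coordinates whether
# routed through the world frame or taken directly from the hand frame.
p_hand = np.array([1.0, 2.0, 0.0, 1.0])
via_world = T_cam_world @ (T_world_hand @ p_hand)
direct = T_cam_hand @ p_hand
```

Selecting an object then means re-parenting it from `T_world_hand`'s world side to the hand frame, after which the fingertip pose alone keeps it registered during inspection.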