3D Point-of-Intention Determination Using a Multimodal Fusion of Hand Pointing and Eye Gaze for a 3D Display

Journal article


Publication Details

Author list: Yeamkuan S., Chamnongthai K.

Publisher: MDPI

Publication year: 2021

Volume number: 21

Issue number: 4

Start page: 1

End page: 31

Number of pages: 31

ISSN: 1424-8220

eISSN: 1424-8220

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85100516851&doi=10.3390%2fs21041155&partnerID=40&md5=85b3e58079dcc84285ca42ef36bb5936

Languages: English-Great Britain (EN-GB)




Abstract

This paper proposes a three-dimensional (3D) point-of-intention (POI) determination method using multimodal fusion between hand pointing and eye gaze for a 3D virtual display. In the method, the finger-joint forms of the pointing hand, sensed by a Leap Motion sensor, are first detected as pointing-intention candidates. Subsequently, differences between neighboring frames, which should occur during the hand-pointing period, are checked by AND logic against the hand-pointing intention candidates. A crossing point between the eye-gaze and hand-pointing lines is finally determined using the closest-distance concept. To evaluate the performance of the proposed method, experiments were performed with ten participants, each of whom looked at and pointed at nine test points for approximately five seconds each. The experimental results show that the proposed method measures 3D POIs at distances of 75 cm, 85 cm, and 95 cm with average distance errors of 4.67%, 5.38%, and 5.71%, respectively. © 2021 by the authors. Licensee MDPI, Basel, Switzerland.
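The final step described in the abstract, estimating the 3D POI from the eye-gaze and hand-pointing lines via the closest-distance concept, amounts to finding the shortest segment between two 3D lines and taking its midpoint. The following is a minimal sketch of that geometric step only, assuming each line is supplied as an origin point and a direction vector; the function name and example coordinates are illustrative and not taken from the paper.

```python
import numpy as np

def closest_point_between_lines(p1, d1, p2, d2):
    """Estimate the 'crossing point' of two 3D lines (e.g., an eye-gaze
    line and a hand-pointing line) as the midpoint of the shortest
    segment connecting them.

    p1, p2 : origin points of the two lines (3-vectors)
    d1, d2 : direction vectors of the two lines (3-vectors)
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)

    # Minimize |(p1 + t1*d1) - (p2 + t2*d2)| over t1 and t2.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-9:                # lines are (nearly) parallel
        t1, t2 = 0.0, (w @ d2) / c
    else:
        t1 = (b * (w @ d2) - c * (w @ d1)) / denom
        t2 = (a * (w @ d2) - b * (w @ d1)) / denom

    q1 = p1 + t1 * d1                    # closest point on the gaze line
    q2 = p2 + t2 * d2                    # closest point on the pointing line
    return (q1 + q2) / 2.0               # estimated 3D point of intention

# Example: a gaze line from the eye and a pointing line from the fingertip
poi = closest_point_between_lines([0.0, 0.0, 0.0], [0.05, 0.0, 1.0],
                                  [0.20, -0.30, 0.0], [-0.05, 0.10, 1.0])
print(poi)
```

In practice the pointing line would be derived from the fingertip and finger-joint positions reported by the Leap Motion sensor and the gaze line from an eye tracker; only the line-intersection approximation is shown here.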


Keywords

Hand recognition, Hand tracking, Multimodal systems


Last updated on 2024-04-10 at 00:00