Obscured 3D point-of-gaze estimation by multipoint cloud data

Conference proceedings article




Publication Details

Author list: Pichitwong W., Chamnongthai K.

Publisher: Hindawi

Publication year: 2019

Start page: 947

End page: 950

Number of pages: 4

ISBN: 9781728133614

ISSN: 0146-9428

eISSN: 1745-4557

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85078850350&doi=10.1109%2fECTI-CON47248.2019.8955244&partnerID=40&md5=c968b90b3d53e2893877c20a16c4bf82

Languages: English-Great Britain (EN-GB)




Abstract

A point cloud sensor captures point cloud data, i.e., 3D positions on the surface of a target object, and this information can be used to improve 3D point-of-gaze (3D POG) estimation. At present, however, point cloud data cannot be created for a target object when an obstacle in front of the sensor casts a shadow projection onto it. This paper proposes a multipoint cloud data method that creates point cloud data on the surface of the target object, including the obscured point cloud data within the shadow projection. An eye tracker sensor provides 3D eye position data and 2D POG-on-screen data, whose origins are the center of the eye tracker and the center of the screen, respectively. These data are integrated by model fitting to draw a straight line originating from the midpoint between the left and right pupils, passing through the 2D POG on a virtual screen, and ending where the line meets the closest point on the target object. In the performance evaluation of the proposed method, the obscured point cloud data were first successfully defined. Second, in an experiment in which 4 participants freely viewed 9 testing objects for 2 seconds each, the method estimated the 3D POG with an average distance error of 1.09 cm. © 2019 IEEE.
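
The gaze-ray step described above can be illustrated with a minimal Python/NumPy sketch. It assumes the eye-tracker and screen coordinates have already been merged into one 3D frame (the model-fitting/calibration step is omitted), and the function name estimate_3d_pog and its parameters are illustrative, not taken from the paper.

```python
import numpy as np

def estimate_3d_pog(left_pupil, right_pupil, pog_on_virtual_screen, point_cloud):
    """Return the point-cloud point closest to the gaze ray.

    The ray starts at the midpoint between the two pupils and passes
    through the 2D POG mapped onto the virtual screen plane. All inputs
    are assumed to share one 3D coordinate frame.
    """
    origin = (np.asarray(left_pupil, float) + np.asarray(right_pupil, float)) / 2.0
    direction = np.asarray(pog_on_virtual_screen, float) - origin
    direction /= np.linalg.norm(direction)

    cloud = np.asarray(point_cloud, float)
    rel = cloud - origin                    # vectors from eye midpoint to each cloud point
    t = rel @ direction                     # signed distance of each point along the gaze ray
    t = np.maximum(t, 0.0)                  # ignore points behind the eyes
    foot = origin + t[:, None] * direction  # nearest point on the ray for each cloud point
    dist = np.linalg.norm(cloud - foot, axis=1)

    # The estimated 3D POG is the cloud point with the smallest distance to the ray.
    return cloud[int(np.argmin(dist))]

# Example usage with synthetic data (coordinates in meters, all values hypothetical).
cloud = np.random.rand(1000, 3) + np.array([0.0, 0.0, 0.5])
pog_3d = estimate_3d_pog([-0.03, 0.0, 0.0], [0.03, 0.0, 0.0], [0.0, 0.05, 0.4], cloud)
print(pog_3d)
```

In the proposed method the cloud passed to such a routine would be the merged multipoint cloud, so that points lying inside a single sensor's shadow projection are still available to terminate the gaze ray.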


Keywords

Multipoint cloud data

