Asian Journal of Physics, Vol 31, No 8 (2022) 871-878
A gaze tracking system based on DLT calibration technique to control mobile robots
Hugo A Moreno, Hugo A Mendez, Diana C Hernandez, Omar F Loa, Cesar P Carrillo, and Víctor H Flores
Departamento de Ingeniería Robótica, Universidad Politécnica del Bicentenario, C.P. 36283, Silao, Gto., México.
This article is dedicated to Professor Cesar Sciammarella
In this paper, we present a gaze tracking system based on the Direct Linear Transformation (DLT) calibration technique. The focus of our study is to allow a user to remotely interact with and inspect an environment via a mobile robot controlled by the movement of the user's pupil. To this end, we implemented a differential mobile robot with a wireless camera mounted on it. The image captured by the camera is displayed on a screen, allowing the user to observe the environment around the robot. A pupil detection system, consisting of a camera mounted on the frame of a pair of glasses, is combined with the DLT algorithm to calculate the point of interest that the user observes on the screen image; this estimate is then fed back to the system to control the trajectory of the mobile robot. The present contribution is based on a camera calibration technique that uses 6 control points, applied here to calibrate the eye movement of the user against the points observed on the screen. © Anita Publications. All rights reserved.
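The article itself does not include code; the following is a minimal sketch of a planar (2D-to-2D) DLT calibration of the kind the abstract describes, assuming pupil-image coordinates are mapped to screen coordinates with an 8-parameter model estimated from the 6 control points by least squares. The function names and the exact parameterization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_dlt(pupil_pts, screen_pts):
    """Estimate the 8 planar DLT parameters L1..L8 mapping pupil-image
    coordinates (x, y) to screen coordinates (u, v):

        u = (L1*x + L2*y + L3) / (L7*x + L8*y + 1)
        v = (L4*x + L5*y + L6) / (L7*x + L8*y + 1)

    With 6 control points the linear system (12 equations, 8 unknowns)
    is overdetermined and solved in the least-squares sense.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(pupil_pts, screen_pts):
        # Rearranged so the equations are linear in L1..L8.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    L, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return L

def gaze_to_screen(L, x, y):
    """Map a detected pupil centre (x, y) to the point of interest (u, v) on the screen."""
    den = L[6] * x + L[7] * y + 1.0
    u = (L[0] * x + L[1] * y + L[2]) / den
    v = (L[3] * x + L[4] * y + L[5]) / den
    return u, v
```

In use, the 6 control points would come from a calibration routine in which the user fixates known targets on the screen while the head-mounted camera records the corresponding pupil centres; at run time, the recovered screen point could then be translated into velocity commands for the differential robot.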
Keywords: Direct linear transformation, Mobile robot, Gaze tracking.
Peer Review Information
Method: Single-anonymous; Screened for plagiarism: Yes