3D gaze estimation using eyeball optical center calibration and distance correction
2019, Vol. 24, No. 8, pp. 1369-1380
Received: 2018-12-10; Revised: 2019-01-18; Published in print: 2019-08-16
DOI: 10.11834/jig.180643
Objective
In 3D point-of-regard (POG) estimation based on the intersection of the two eyes' lines of sight, manual measurement of the 3D coordinates of the eyeball optical centers introduces considerable error, and the estimated 3D POG deviates substantially in the depth direction. To address these problems, an eyeball optical center calibration scheme and a distance correction scheme are proposed to improve the 3D POG estimation model.
Method
First, an image processing algorithm extracts the pupil center corneal reflection (PCCR) vectors of the left and right eyes, and second-order polynomial mapping functions give the 2D POG of each eye on the plane. Second, the 3D coordinates of the eyeball optical centers are obtained by the proposed calibration method, which avoids the error introduced by manual measurement. The line of sight of each eye is then determined from its optical center and planar POG, and the intersection of the two lines of sight gives a preliminary 3D POG. Finally, because the result jitters strongly in the depth direction, a depth-direction data filtering and Z-plane intercepting correction method is applied to the 3D POG.
Result
Two test spaces of different sizes were used. The experimental results show that within a working distance of 30–50 cm, the angular error is 0.7° and the distance error is 17.8 mm; within a working distance of 50–130 cm, the angular error is 1.0° and the distance error is 117.4 mm. Compared with other 3D POG estimation methods under the same test space conditions, both the angular error and the distance error are significantly reduced.
Conclusion
The proposed eyeball optical center calibration method conveniently and accurately obtains the 3D coordinates of the eyeball optical centers, avoids the error of manual measurement, and markedly reduces the angular error. The proposed depth-direction data filtering and Z-plane intercepting correction method effectively suppresses jitter in the results and markedly reduces the distance error.
Objective
Gaze estimation can be divided into 2D and 3D gaze estimation. 2D gaze estimation based on polynomial mapping uses only single-eye pupil center corneal reflection (PCCR) vector information to calculate the 2D (x, y) point of regard (POG) in a plane. 3D gaze estimation based on the intersection of binocular lines of sight uses the PCCR vector information of both eyes and the 3D coordinates of the left and right eyeball optical centers (the points from which the lines of sight are emitted) to calculate the 3D (x, y, z) POG in space. In this process, measurement error arises from manual measurement of the 3D coordinates of the eyeball optical centers, and the 3D gaze estimation results deviate strongly in the depth direction. On the basis of the traditional binocular line-of-sight intersection method for 3D gaze estimation, we propose two primary improvements. First, we use a calibration method, rather than manual measurement, to obtain the 3D coordinates of the eyeball optical centers. Second, we use a depth-direction data filtering and Z-plane intercepting correction method to correct the 3D gaze estimation results.
Method
First, the subject gazes at nine marked points on a calibration plane located at a first distance from the eyes, and an infrared camera in front of the subject captures eye images. An image processing algorithm extracts the PCCR vectors of both eyes, and the mapping functions of both eyes on the first plane are solved according to the second-order polynomial mapping between the PCCR vectors and the marked points.
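To make this step concrete, here is a minimal Python sketch of fitting a second-order polynomial mapping from PCCR vectors to plane coordinates by least squares. The six-term feature layout and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def poly2_features(v):
    """Second-order polynomial features of a PCCR vector v = (vx, vy)."""
    vx, vy = v
    return np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])

def fit_mapping(pccr_vectors, marked_points):
    """Fit mapping coefficients from PCCR vectors to 2D plane points.

    pccr_vectors: (N, 2) array, one PCCR vector per calibration point.
    marked_points: (N, 2) array of the known (x, y) marked points.
    Returns a (6, 2) coefficient matrix, one column per output coordinate.
    """
    A = np.vstack([poly2_features(v) for v in pccr_vectors])  # (N, 6) design matrix
    coeffs, *_ = np.linalg.lstsq(A, marked_points, rcond=None)
    return coeffs

def map_to_pog(v, coeffs):
    """Map one PCCR vector to a 2D point of regard on the plane."""
    return poly2_features(v) @ coeffs
```

With nine calibration points and six coefficients per coordinate, the system is over-determined, so the least-squares fit also averages out some measurement noise.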
Second, the calibration plane is moved to a second distance, and the subject gazes at the nine marked points again. Using the mapping functions, the 2D POG of both eyes on the first calibration plane are calculated, and each of the nine marked points at the second distance is connected to the corresponding left and right 2D POG at the first distance. The resulting lines intersect at two points, one per eye, and computing these two equivalent intersection points yields the calibrated 3D coordinates of the eyeball optical centers.
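Measurement noise means the nine connecting lines for one eye will not meet exactly, so the "intersection point" has to be computed in a least-squares sense. The sketch below uses the standard least-squares intersection of several 3D lines; the paper does not specify its exact solver, so this is an assumed implementation.

```python
import numpy as np

def intersect_lines_3d(points, directions):
    """Least-squares intersection of several 3D lines.

    points: (N, 3) array, one point on each line (e.g., the marked points
        on the second-distance plane, with their known Z coordinate).
    directions: (N, 3) array of line directions (toward the first-plane POG).
    Returns the 3D point minimizing the sum of squared distances to all lines.
    """
    S = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, directions):
        d = d / np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector orthogonal to the line
        S += M
        b += M @ p
    return np.linalg.solve(S, b)  # calibrated optical center for this eye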
Third, 3D gaze estimation can be performed. With the left and right 2D POG combined with the 3D coordinates of the eyeball optical centers, and with an appropriate space coordinate system established (the calibration plane as the X-Y plane and the depth direction as the Z axis), the lines of sight of both eyes can be calculated. According to the principle of human binocular vision, the two lines of sight intersect at one point in space, and computing this intersection yields a rough 3D POG. In practice, the two lines are generally skew because of calculation and measurement errors, so the midpoint of their common perpendicular is taken as the intersection, as sketched below.
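The midpoint of the common perpendicular of two skew lines has a closed-form solution; the following sketch implements that standard geometry (variable names are assumptions):

```python
import numpy as np

def skew_line_midpoint(p1, d1, p2, d2):
    """Midpoint of the common perpendicular between two (skew) 3D lines.

    Each line is given by a point p and a direction d, e.g. p = eyeball
    optical center, d = direction toward that eye's 2D POG on the plane.
    """
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b            # ~0 only if the lines are parallel
    t = (b * e - c * d) / denom      # parameter of closest point on line 1
    s = (a * e - b * d) / denom      # parameter of closest point on line 2
    q1 = p1 + t * d1
    q2 = p2 + s * d2
    return (q1 + q2) / 2.0           # rough 3D point of regard
```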
Finally, to address the large jitter of the result in the depth direction, the proposed depth-direction data filtering and Z-plane intercepting correction method is applied to the rough result. The sequence of depth values (Z coordinates) is first filtered, and the filtered depth defines a plane perpendicular to the Z axis. This plane then intercepts the left and right lines of sight at two points, and the midpoint of these two points is taken as the corrected result in the other two directions (X and Y). After this filtering and correction process, a more accurate 3D POG is obtained.
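The paper states that the depth sequence is filtered but does not name the filter, so the sketch below assumes a simple moving average; the Z-plane interception step follows directly from the line parameterization.

```python
import numpy as np

def filter_depth(z_history, window=5):
    """Smooth the Z (depth) sequence; a moving average is assumed here,
    since the paper only specifies that the depth data are filtered."""
    recent = z_history[-window:]
    return sum(recent) / len(recent)

def z_plane_intercept(p, d, z_plane):
    """Point where the sight line p + t*d crosses the plane Z = z_plane."""
    t = (z_plane - p[2]) / d[2]
    return p + t * d

def correct_pog(p_left, d_left, p_right, d_right, z_history):
    """Corrected 3D POG: filtered Z, with X and Y taken from the midpoint
    of the two Z-plane interceptions of the left and right sight lines."""
    z = filter_depth(z_history)
    q_left = z_plane_intercept(p_left, d_left, z)
    q_right = z_plane_intercept(p_right, d_right, z)
    return (q_left + q_right) / 2.0
```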
Result
We use two workspaces of different sizes to test the proposed method. The experimental results show that in the small workspace (24×18×20 cm³), with a working distance of 30–50 cm in the depth direction, the average angular error is 0.7° and the average Euclidean distance error is 17.8 mm. In the large workspace (60×36×80 cm³), with a working distance of 50–130 cm in the depth direction, the average angular error is 1.0° and the average Euclidean distance error is 117.4 mm. Compared with other traditional 3D gaze estimation methods under the same distance testing conditions, the proposed method considerably reduces both the angular and the distance deviation.
Conclusion
The proposed calibration method for the eyeball optical center obtains the 3D coordinates of the eyeball optical centers conveniently and accurately. It avoids the measurement error introduced by manual measurement and significantly reduces the angular deviation of the 3D POG. The proposed depth-direction data filtering and Z-plane intercepting correction method reduces the jitter of the 3D POG in the depth direction and significantly reduces its distance deviation. The method is of practical significance for applications of 3D gaze estimation.