Research on Gaze Direction Estimation Using a Monocular Camera
Abstract
Gaze direction estimation is one of the active research topics in computer vision and pattern recognition, with significant theoretical and practical value. Systematic study of gaze estimation techniques can advance these fields, and gaze estimation has important application prospects in human-computer interaction, psychology research, and related areas. In recent years, intrusive gaze estimation has made great progress, but non-intrusive gaze estimation remains immature. Building a truly robust and practical non-intrusive automatic gaze estimation and tracking system still requires solving many key problems, in particular designing effective eye-description features and gaze estimation algorithms that allow free head movement.
     This thesis studies problems related to non-intrusive gaze direction estimation using a single camera as the means of information acquisition, including monocular data acquisition and automatic labeling of ground truth, representation of eye appearance features, and gaze estimation under free head movement. In summary, the main research work of this thesis includes:
     1. A method for synchronously capturing gaze direction, head pose data, and face images was designed, and a corresponding device was implemented. In statistical-learning-based algorithms, system performance depends on large amounts of labeled training data, so a labeled dataset is the foundation and prerequisite of gaze estimation research. The data acquisition method designed here can synchronously capture images, pose, gaze direction, and the spatial relationships among the various targets in a complex environment. The collected data supports the training and testing of the subsequent experiments.
     2. A gaze estimation method based on the Directional Binary Pattern feature is proposed. As the gaze direction changes, the relative position of the sclera and the iris within the eye socket changes accordingly. These changes can be regarded as horizontal and vertical movements of the iris, which cause corresponding changes in the texture of the eye image. To capture the characteristics of these vertical and horizontal iris movements, the Directional Binary Pattern (DBP) representation is proposed. By computing differential information along four directions, the DBP feature contains not only local texture information but also binary differential information along specific directions. DBP is therefore well suited to describing the texture changes of the eye image caused by the relative movement of the iris. At the same time, DBP is robust to illumination changes and can reduce the computational errors they cause.
     3. A gaze estimation method based on hybrid features is proposed. The hybrid feature consists of a model feature and an appearance feature. The model feature extracts geometric vectors between feature points; the appearance feature is the Gabor Directional Binary Pattern (GDBP) extracted from the eye image. The two features are fused by a Support Vector Regression (SVR) algorithm to obtain the gaze direction under a given head pose. The DBP operator is used to encode the Gabor magnitude of the image as the appearance representation, achieving good performance. The hybrid-feature method has the following characteristics: (1) the eye image is binarized along different computation directions; (2) the DBP operator is successfully combined with the Gabor magnitude feature, and a spatial histogram is extracted as the final discriminative feature; (3) the method exploits both the good statistical properties of appearance features and the robustness of model features to illumination changes.
     4. A gaze estimation method that allows free head movement is proposed. Image-feature-based gaze estimation involves two important quantities: head pose and eye gaze direction. Most existing methods handle free head movement by first determining the head pose and then estimating the gaze direction. This thesis proposes a distributed algorithm that estimates head pose and eye gaze direction separately, and on that basis a gaze estimation method based on hierarchical fusion of face and eye features. Experiments verify the effectiveness of the method.
     Through the above work, this thesis studies several problems involved in monocular-camera gaze estimation. The results show that the model features and appearance features of the eye image describe gaze direction information from different perspectives, and that fusing the two efficiently yields more stable estimates. In addition, prototype systems based on the proposed methods were implemented, and the experimental results show that the proposed methods have potential application value.
Gaze estimation is one of the hot research topics in computer vision and pattern recognition, with significant theoretical and practical value. Progress in gaze estimation can push these fields forward, and gaze estimation also has applications in Human-Computer Interaction (HCI) and psychology research. Although intrusive gaze estimation has made great progress in recent years, non-intrusive gaze estimation is still at a preliminary stage for practical application. Achieving a robust non-intrusive gaze tracking system still requires overcoming several key problems; in particular, it requires effective features and estimation methods for head-free gaze estimation.
     This thesis focuses on problems related to non-intrusive gaze estimation from a monocular camera, including data collection and automatic labeling of the ground truth of the collected data, eye appearance feature representation, and head-free gaze estimation. The main contributions of the thesis are as follows:
     1. A data collection method that captures gaze direction, head pose, and face images simultaneously, together with a capture studio implemented on the basis of this method. The performance of a statistical learning algorithm relies on large amounts of labeled data, so labeled data is the foundation of gaze estimation research. The proposed method can synchronously collect images, head pose, gaze, and the spatial positions of subjects in a complex environment. The collected data provides a guarantee for the training and testing in the subsequent experiments.
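     To illustrate the synchronization problem such a capture rig must solve, the following is a minimal Python sketch, not the thesis's actual device software: it pairs each camera frame with the nearest head-pose and gaze samples by timestamp. All names (`frame_ts`, `pose_log`, the tolerance `tol`) are hypothetical, and the logs are assumed to be sorted by time.

```python
import numpy as np

def nearest_sample(timestamps, t):
    """Index of the sample whose timestamp is closest to t (sorted input)."""
    i = np.searchsorted(timestamps, t)
    i = int(np.clip(i, 1, len(timestamps) - 1))
    # choose between the two neighbours on either side of t
    return i if abs(timestamps[i] - t) < abs(timestamps[i - 1] - t) else i - 1

def align_streams(frame_ts, pose_ts, pose_vals, gaze_ts, gaze_vals, tol=0.02):
    """Pair each camera frame with the nearest pose and gaze sample.

    Frames with no sample within `tol` seconds are dropped, so every
    retained frame carries a consistent (image, pose, gaze) label.
    """
    records = []
    for k, t in enumerate(frame_ts):
        ip = nearest_sample(pose_ts, t)
        ig = nearest_sample(gaze_ts, t)
        if abs(pose_ts[ip] - t) <= tol and abs(gaze_ts[ig] - t) <= tol:
            records.append((k, pose_vals[ip], gaze_vals[ig]))
    return records
```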
     2. A novel feature named Directional Binary Pattern (DBP) for gaze estimation. As the gaze direction changes, the sclera and the iris change their relative position within the eye socket. This change can be regarded as horizontal and vertical movement of the iris, which changes the texture of the eye image. To characterize the vertical and horizontal movement of the iris, the Directional Binary Pattern is proposed. By computing differences along four directions, DBP contains not only local texture information but also binary differential information along specific directions. DBP is therefore well suited to describing the texture changes of the eye image caused by iris movement. Meanwhile, DBP is robust to illumination variation and can reduce the computational error caused by light reflection.
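     The abstract does not give the exact DBP formula; one plausible reading is an LBP-style operator that thresholds grey-level differences along four fixed directions (0, 45, 90, 135 degrees) and pools the resulting 4-bit codes into a spatial histogram. The sketch below implements that reading; the neighbourhood distance `d`, the bit layout, and the block grid are assumptions, not the thesis's definition.

```python
import numpy as np

# Offsets for the four assumed computation directions: 0, 45, 90, 135 degrees.
DIRECTIONS = [(0, 1), (-1, 1), (-1, 0), (-1, -1)]

def dbp_codes(img, d=1):
    """4-bit directional binary codes (one assumed DBP variant).

    Bit i is set where a pixel is at least as bright as its neighbour at
    distance d along direction i, i.e. the sign of the directional difference.
    """
    img = img.astype(np.float32)
    h, w = img.shape
    codes = np.zeros((h - 2 * d, w - 2 * d), dtype=np.uint8)
    center = img[d:h - d, d:w - d]
    for bit, (dy, dx) in enumerate(DIRECTIONS):
        neigh = img[d + dy * d:h - d + dy * d, d + dx * d:w - d + dx * d]
        codes |= (center >= neigh).astype(np.uint8) << bit
    return codes

def dbp_histogram(img, grid=(4, 8)):
    """Spatial histogram of DBP codes over a grid of eye-image blocks."""
    codes = dbp_codes(img)
    blocks = []
    for rows in np.array_split(codes, grid[0], axis=0):
        for blk in np.array_split(rows, grid[1], axis=1):
            hist = np.bincount(blk.ravel(), minlength=16).astype(np.float32)
            blocks.append(hist / max(blk.size, 1))
    return np.concatenate(blocks)  # final descriptor for regression
```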
     3. A hybrid-feature-based method for gaze estimation. The hybrid feature combines a model-based feature and an appearance-based feature. The model-based feature consists of geometric vectors among the facial feature points; the appearance-based feature is the Gabor Directional Binary Pattern (GDBP) extracted from the eye image. The two features are fused by a Support Vector Regression (SVR) algorithm, and each hybrid feature corresponds to a gaze direction under a fixed head pose. For the appearance-based feature, the DBP operator is combined with the Gabor magnitude, which yields good performance. The hybrid-feature approach has the following characteristics: (1) the eye image is binarized along different computation directions; (2) the DBP operator is successfully combined with the Gabor magnitude, and the final discriminative feature is a spatial histogram extracted from the hybrid features; (3) the method exploits the statistical properties of appearance features while also benefiting from the robustness of model features to illumination variation.
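     A hedged sketch of how the GDBP appearance feature and the geometric model feature might be fused with SVR, following the description above. The Gabor parameters, the convolution via `scipy.ndimage`, and the SVR settings are illustrative assumptions; `dbp_histogram` is reused from the previous sketch.

```python
import numpy as np
from scipy import ndimage
from sklearn.svm import SVR

def gabor_kernel(theta, lam=8.0, sigma=4.0, size=15):
    """Even/odd Gabor pair at orientation theta (radians); assumed parameters."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    env = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return env * np.cos(2 * np.pi * xr / lam), env * np.sin(2 * np.pi * xr / lam)

def gdbp_feature(eye_img, n_orient=4):
    """DBP histograms computed on Gabor magnitude maps (assumed GDBP reading)."""
    feats = []
    for k in range(n_orient):
        even, odd = gabor_kernel(np.pi * k / n_orient)
        re = ndimage.convolve(eye_img.astype(np.float32), even)
        im = ndimage.convolve(eye_img.astype(np.float32), odd)
        mag = np.hypot(re, im)            # Gabor magnitude map
        feats.append(dbp_histogram(mag))  # encode it with the DBP operator
    return np.concatenate(feats)

def fuse_and_train(eye_imgs, model_feats, gaze_angles):
    """Concatenate appearance and model features, then regress gaze with SVR.

    gaze_angles is one gaze component (e.g. yaw); train a second SVR for pitch.
    """
    X = np.stack([np.concatenate([gdbp_feature(img), m])
                  for img, m in zip(eye_imgs, model_feats)])
    return SVR(kernel='rbf', C=10.0, epsilon=0.5).fit(X, gaze_angles)
```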
     4. A gaze estimation method that is independent of head pose. Video-based gaze estimation involves two important components: head pose and gaze direction. At present, most algorithms achieve gaze tracking under free head motion by computing the head pose and the gaze direction in sequence. This thesis presents a distributed framework that estimates head pose and gaze direction separately, which enables gaze tracking under free head movement. On this basis, the thesis proposes a gaze tracking algorithm based on the combination of head and eye features. Experimental results show that the method is effective.
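     The abstract describes separate head-pose and eye-gaze estimators whose outputs are combined; below is a minimal sketch of one standard way to do this, rotating the eye-in-head gaze vector by the estimated head pose. The Euler-angle convention and function names are assumptions, not the thesis's exact fusion scheme.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Head rotation from (yaw, pitch, roll) in radians, one common Z-Y-X convention."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def world_gaze(head_pose, eye_gaze_in_head):
    """Combine the two independent estimates into a camera-frame gaze ray.

    head_pose: (yaw, pitch, roll) from the head-pose estimator.
    eye_gaze_in_head: unit 3-vector from the eye estimator, expressed
    in the head coordinate frame.
    """
    R = rotation_matrix(*head_pose)
    g = R @ np.asarray(eye_gaze_in_head, dtype=float)
    return g / np.linalg.norm(g)
```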
     In conclusion, through the above work, this dissertation studies in depth the problems of gaze estimation from a monocular camera. The experimental results show that the appearance-based feature and the model-based feature carry complementary discriminative information about the same gaze direction, and that excellent system performance can be achieved by effectively combining the two. Moreover, in this dissertation the gaze direction can be estimated with only one camera under free head motion. The proposed methods have been applied in a gaze estimation system, and the experimental results show that they have practical value.