A FLANN-LM Registration Algorithm for Markerless Images
  • English title: Research on FLANN-LM registration algorithm based on unidentified image
  • Authors: 张乃千 (ZHANG Naiqian); 王占刚 (WANG Zhangang)
  • Affiliation: School of Information and Communication Engineering, Beijing Information Science & Technology University
  • Keywords: augmented reality; virtual-real fusion; FLANN-LM (Levenberg-Marquardt); image registration; algorithm combination
  • Journal: Journal of Beijing Information Science & Technology University (Natural Science Edition)
  • Journal code: BJGY
  • Publication date: 2019-04-15
  • Year: 2019
  • Volume/Issue: v.34; No.128, Issue 02
  • Pages: 66-71 (6 pages)
  • Article ID: BJGY201902013
  • CN: 11-5866/N
  • Funding: Beijing Municipal Science and Technology Innovation Service Capacity Building - Fundamental Research Funds (Research) (71E1810969); 2018 Promoting the Connotative Development of Universities - Discipline Construction Project of Beijing Information Science & Technology University
  • Language: Chinese
Abstract
To improve the efficiency of augmented reality systems and strengthen their anti-interference capability and practicality, an improved image registration method, FLANN-LM, is proposed. The SURF operator is used to extract image feature points; the FREAK operator describes the feature points; the extracted feature points are coarsely matched with the binary fast high-dimensional nearest-neighbor FLANN algorithm; and the nonlinear least-squares Levenberg-Marquardt (LM) method is used to compute the homography matrix, optimizing feature-point matching accuracy and achieving image augmentation. Image registration with the FLANN-LM method improves the registration accuracy of markerless images under rotation and occlusion in augmented reality systems and improves the system's real-time performance. Compared with the traditional RANSAC image registration algorithm, the accuracy and speed of image matching are significantly improved, ensuring accurate virtual-real fusion in the augmented reality system.
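The pipeline summarized above (SURF detection, FREAK description, FLANN coarse matching, LM homography refinement) can be sketched in a few lines of Python with OpenCV and SciPy. The sketch below is illustrative only: the function names (detect_and_describe, flann_coarse_match, lm_refine_homography, register), the SURF Hessian threshold, the LSH index parameters, the 0.7 ratio-test threshold, and the use of cv2.findHomography with method 0 to seed the Levenberg-Marquardt refinement are assumptions of this sketch, not details taken from the paper.

# Minimal sketch of a FLANN-LM style registration pipeline (assumed parameters).
import cv2
import numpy as np
from scipy.optimize import least_squares


def detect_and_describe(gray):
    """SURF keypoints described with FREAK (needs an opencv-contrib 'nonfree' build)."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)   # threshold is an assumption
    keypoints = surf.detect(gray, None)
    freak = cv2.xfeatures2d.FREAK_create()
    return freak.compute(gray, keypoints)                      # (keypoints, binary descriptors)


def flann_coarse_match(desc1, desc2, ratio=0.7):
    """Coarse matching of binary descriptors via FLANN's LSH index plus a ratio test."""
    index_params = dict(algorithm=6,                            # FLANN_INDEX_LSH for binary data
                        table_number=6, key_size=12, multi_probe_level=1)
    matcher = cv2.FlannBasedMatcher(index_params, dict(checks=50))
    good = []
    for pair in matcher.knnMatch(desc1, desc2, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return good


def lm_refine_homography(src_pts, dst_pts, H0):
    """Refine an initial homography by minimizing reprojection error with Levenberg-Marquardt."""
    def residuals(h):
        H = np.append(h, 1.0).reshape(3, 3)
        p = (H @ np.c_[src_pts, np.ones(len(src_pts))].T).T
        return (p[:, :2] / p[:, 2:3] - dst_pts).ravel()
    h0 = (H0 / H0[2, 2]).ravel()[:8]                            # fix H[2,2] = 1
    result = least_squares(residuals, h0, method="lm")
    return np.append(result.x, 1.0).reshape(3, 3)


def register(img1, img2):
    """Estimate the homography mapping img1 points into img2 (both grayscale)."""
    kp1, d1 = detect_and_describe(img1)
    kp2, d2 = detect_and_describe(img2)
    matches = flann_coarse_match(d1, d2)
    src = np.float64([kp1[m.queryIdx].pt for m in matches])
    dst = np.float64([kp2[m.trainIdx].pt for m in matches])
    # Plain least-squares fit over all coarse matches as the initial estimate,
    # then explicit LM refinement (instead of RANSAC-based estimation).
    H0, _ = cv2.findHomography(src, dst, 0)
    return lm_refine_homography(src, dst, H0)

For example, H = register(cv2.imread("reference.png", 0), cv2.imread("scene.png", 0)) would return a 3x3 homography mapping reference-image points into the scene. Note that OpenCV's findHomography with method 0 already applies an internal Levenberg-Marquardt refinement; the explicit SciPy step is kept here only to make the LM stage of the FLANN-LM pipeline visible and to contrast it with the RANSAC-based estimation mentioned as the baseline in the abstract.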
