Research on the Vision System of a Cucumber Harvesting Robot
Abstract
China is a large agricultural country, and its agricultural technology and equipment bear directly on the competitiveness of the national economy and on the sustainable development of the agricultural sector. To further raise the level of agricultural mechanization and automation and to reduce production costs, research on agricultural harvesting robots has attracted increasing attention and has become an important research direction for agricultural machinery and equipment.
Cucumber is a vegetable grown worldwide, with a long history of cultivation and a large planting area. In China, cucumber harvesting and grading are still done mainly by hand, which is inefficient and labor-intensive, and human factors lead to inconsistent grading standards, so manual operation can hardly meet the needs of modern agriculture. Agricultural mechanization has entered the fast lane, and using advanced equipment to make every stage of agricultural production intelligent and automated has become the mainstream, in line with the demand of modern agriculture for precision, efficiency, and environmental friendliness. It is therefore worthwhile to explore and develop a cucumber harvesting robot: on the one hand it can improve working conditions and reduce labor intensity, and on the other it can raise picking efficiency and help protect the natural environment.
Building on machine vision, a current research focus in agricultural engineering, this thesis presents a fairly systematic study of the key technologies of the cucumber harvesting robot vision system, namely target recognition and navigation path extraction. The main research contents are as follows:
(1) The spectral characteristics of cucumber fruits and of their stems and leaves in the near-infrared band were studied. Experiments identified the wavelength bands in which the reflectance spectra of the fruits and of the stems and leaves differ most. After abnormal samples were rejected using principal component scores combined with the Mahalanobis distance, cross-validation of the calibration samples yielded seven optimal principal components, on which a PLS model was built. Predictions on a pre-defined validation set showed a correct recognition rate of 100%.
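As a rough, self-contained illustration of this kind of spectral workflow (not the thesis code), the Python sketch below rejects outliers with PCA scores and the Mahalanobis distance and then builds a PLS model with scikit-learn. The placeholder random spectra, the 3-sigma cut, and the 0.5 decision threshold are assumptions for the sketch only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: NIR reflectance spectra (n_samples x n_wavelengths), y: 1 = fruit, 0 = stem/leaf.
# Both arrays are placeholders; the thesis used its own measured spectra.
rng = np.random.default_rng(0)
X = rng.random((120, 256))
y = rng.integers(0, 2, 120).astype(float)

# Outlier screening: Mahalanobis distance computed on the leading PCA scores.
scores = PCA(n_components=5).fit_transform(X)
cov_inv = np.linalg.inv(np.cov(scores, rowvar=False))
center = scores.mean(axis=0)
d = np.sqrt(np.einsum("ij,jk,ik->i", scores - center, cov_inv, scores - center))
keep = d < d.mean() + 3 * d.std()          # simple 3-sigma cut on the distance
X_cal, y_cal = X[keep], y[keep]

# PLS model with 7 latent variables, assessed by cross-validation.
pls = PLSRegression(n_components=7)
y_cv = cross_val_predict(pls, X_cal, y_cal, cv=10).ravel()
acc = np.mean((y_cv > 0.5) == (y_cal > 0.5))
print(f"cross-validated recognition rate: {acc:.2%}")
pls.fit(X_cal, y_cal)                      # final model for the validation set
```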
(2) Taking greenhouse cucumbers as the object, color space conversion and gray-level image enhancement of cucumber images were studied on the basis of their color characteristics. A statistical analysis of color differences in three color spaces (nine color channels) showed that the RGB color space is a suitable basis for further image processing. For enhancement, a pulse-coupled neural network (PCNN) firing-time-matrix method was adopted; it incorporates the visual characteristics of the human eye and makes full use of Weber's law, and it gives good results. Experiments showed that the enhancement strengthened image contrast while preserving image detail.
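The following is a much-simplified sketch of the idea, assuming an 8-bit gray image held in a NumPy array: a decaying threshold stands in for the full PCNN to produce a firing-time matrix, and a logarithmic (Weber-like) remapping stands in for the thesis's Weber-law mapping. Parameter values and the exact mapping are assumptions, not the thesis formulation.

```python
import numpy as np

def firing_time_matrix(img, steps=64, decay=0.95):
    """Simplified PCNN-style time matrix: record the iteration at which each
    pixel first 'fires' as an exponentially decaying threshold sweeps down."""
    f = img.astype(float) / img.max()          # normalized stimulus
    theta = np.ones_like(f)                    # dynamic threshold, starts at 1
    t = np.zeros_like(f, dtype=int)            # firing-time matrix
    fired = np.zeros_like(f, dtype=bool)
    for n in range(1, steps + 1):
        theta *= decay                         # threshold decays each step
        new = (~fired) & (f >= theta)          # pixels firing at this step
        t[new] = n
        fired |= new
    t[~fired] = steps                          # very dark pixels fire last
    return t

def weber_enhance(t):
    """Map firing times back to gray levels with a logarithmic (Weber-like)
    curve so that equal relative stimulus changes get equal output steps."""
    s = t.max() - t                            # early firing = bright pixel
    return (np.log1p(s) / max(np.log1p(s.max()), 1e-6) * 255).astype(np.uint8)

# Usage: enhanced = weber_enhance(firing_time_matrix(gray_image))
```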
(3) Segmentation of greenhouse cucumber images was investigated. Because the cucumbers are close in color to their stems and leaves, an improved pulse-coupled neural network was used for segmentation. The method couples the gray-level and spatial information of the image into the linking coefficient, adjusts the network parameters adaptively, segments the image with a dynamic threshold, and terminates the PCNN iteration when the two-dimensional Tsallis entropy of the image reaches its maximum. The results were compared with those of improved PCNNs terminated by the maximum Shannon entropy and by the minimum cross entropy, and the segmentations were evaluated statistically with an empirical goodness method. The statistics showed that the PCNN with the maximum two-dimensional Tsallis entropy termination criterion achieves good segmentation.
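A minimal sketch of how a two-dimensional Tsallis entropy criterion can be evaluated is given below, assuming an 8-bit gray image; the PCNN itself is omitted, and the entropy is simply maximized over a (gray value, 3x3 local mean) threshold pair. The entropic index q, the bin count, and the neighborhood size are assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def tsallis_2d_threshold(gray, q=0.8, bins=64):
    """Select the (t, s) pair maximizing the 2-D Tsallis entropy of the joint
    (gray value, 3x3 local mean) histogram; the off-diagonal quadrants are
    ignored, as is customary in 2-D entropy thresholding."""
    local = uniform_filter(gray.astype(float), size=3)
    h, _, _ = np.histogram2d(gray.ravel(), local.ravel(),
                             bins=bins, range=[[0, 255], [0, 255]])
    p = h / h.sum()
    P = p.cumsum(0).cumsum(1)            # cumulative probability mass
    Pq = (p ** q).cumsum(0).cumsum(1)    # cumulative mass of p^q
    best, best_ts = -np.inf, (1, 1)
    for t in range(1, bins - 1):
        for s in range(1, bins - 1):
            Pb = P[t, s]                                 # background: i<=t, j<=s
            Po = 1 - P[t, -1] - P[-1, s] + P[t, s]       # object: i>t, j>s
            if Pb < 1e-6 or Po < 1e-6:
                continue
            Sb = (1 - Pq[t, s] / Pb ** q) / (q - 1)
            So = (1 - (Pq[-1, -1] - Pq[t, -1] - Pq[-1, s] + Pq[t, s]) / Po ** q) / (q - 1)
            score = Sb + So + (1 - q) * Sb * So          # Tsallis pseudo-additivity
            if score > best:
                best, best_ts = score, (t, s)
    return best_ts
```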
(4) Recognition of cucumber targets in the greenhouse environment was investigated. The binary image produced by PCNN segmentation was further processed with mathematical morphology; four geometric features of each connected region and three texture features based on the gray-level co-occurrence matrix were extracted as the input feature vector of a least squares support vector machine (LS-SVM), and the trained LS-SVM classifier was used to identify the cucumber targets in the image. On the 70 cucumber images used in the tests, the correct recognition rate reached 82.9%, indicating that the combination of a PCNN with an LS-SVM is suitable for recognizing greenhouse cucumbers against a complex background.
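A hedged sketch of the feature-and-classifier stage is shown below using scikit-image and scikit-learn. Because scikit-learn offers no LS-SVM, a standard RBF support vector machine stands in for it, and the particular four shape features and three GLCM features are illustrative choices rather than the thesis's exact set.

```python
import numpy as np
from skimage.measure import label, regionprops
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def region_features(binary, gray):
    """Four shape features + three GLCM texture features per connected region.
    `binary` is the segmented mask, `gray` the 8-bit gray image."""
    feats, regions = [], regionprops(label(binary))
    for r in regions:
        minr, minc, maxr, maxc = r.bbox
        patch = (gray[minr:maxr, minc:maxc] // 32).astype(np.uint8)   # 8 gray levels
        glcm = graycomatrix(patch, distances=[1], angles=[0], levels=8,
                            symmetric=True, normed=True)
        feats.append([
            r.area, r.eccentricity, r.solidity,
            r.major_axis_length / (r.minor_axis_length + 1e-6),       # elongation
            graycoprops(glcm, "contrast")[0, 0],
            graycoprops(glcm, "energy")[0, 0],
            graycoprops(glcm, "homogeneity")[0, 0],
        ])
    return np.array(feats), regions

# Hypothetical usage with labeled training regions (0 = background, 1 = cucumber):
# clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)
# preds = clf.predict(region_features(binary_img, gray_img)[0])
```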
(5) A method based on a generalized fuzzy Hough transform was proposed for recognizing partially occluded cucumbers. A Bezier curve is fitted to the central axis of the cucumber, and the cucumber is described with the idea of a generalized cylinder. The invariant angles of edge point pairs extracted from the binary cucumber image are used as the index, and the ratio of the distance between a point pair to the distance of the corresponding pair in the template determines the image scale. Fuzzy membership is introduced into the voting mechanism to reduce false votes; the generalized Hough transform computes candidate positions of the cucumber target, the "area difference" between the cucumber to be recognized and the template at each rotation angle is compared to select the target pose, and the cucumber shape is then recovered. Experiments showed that the generalized fuzzy Hough transform can recognize partially occluded cucumbers in different poses.
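The sketch below shows the core of a generalized Hough transform with soft voting, as one way to realize this kind of matching; Gaussian smoothing of the accumulator stands in for the fuzzy votes, and scale handling, rotation search, the Bezier axis fit, and the area-difference pose check are omitted. Function names and parameters are assumptions.

```python
import numpy as np
from scipy.ndimage import sobel, gaussian_filter

def edge_points(binary):
    """Edge pixels of a binary mask together with their gradient orientation."""
    gy, gx = sobel(binary.astype(float), 0), sobel(binary.astype(float), 1)
    mag = np.hypot(gx, gy)
    ys, xs = np.nonzero(mag > 0.5)
    return ys, xs, np.arctan2(gy[ys, xs], gx[ys, xs])

def build_r_table(template, n_bins=36):
    """R-table of the generalized Hough transform: gradient-angle bin -> list of
    offsets from the edge point to the template reference point (its centroid)."""
    ys, xs, ang = edge_points(template)
    ref = np.array([ys.mean(), xs.mean()])
    table = [[] for _ in range(n_bins)]
    for y, x, a in zip(ys, xs, ang):
        b = int((a + np.pi) / (2 * np.pi) * n_bins) % n_bins
        table[b].append(ref - (y, x))
    return table

def fuzzy_ght(scene, table, n_bins=36, sigma=2.0):
    """Vote for candidate reference points; Gaussian smoothing of the accumulator
    plays the role of the fuzzy (soft) voting described in the text."""
    acc = np.zeros(scene.shape)
    ys, xs, ang = edge_points(scene)
    for y, x, a in zip(ys, xs, ang):
        b = int((a + np.pi) / (2 * np.pi) * n_bins) % n_bins
        for dy, dx in table[b]:
            ry, rx = int(round(y + dy)), int(round(x + dx))
            if 0 <= ry < acc.shape[0] and 0 <= rx < acc.shape[1]:
                acc[ry, rx] += 1
    acc = gaussian_filter(acc, sigma)
    return np.unravel_index(acc.argmax(), acc.shape), acc
```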
(6) A navigation path extraction algorithm for the cucumber harvesting robot in the unstructured greenhouse environment was studied. The image captured by the CCD camera is scanned by columns, and the accumulated gray value of each column indicates the candidate region of the navigation path. Because the cucumber plants and the ground between rows differ clearly in color, the image is preprocessed with arithmetic combinations of the three RGB components (ExG and ExR) to obtain a gray image with a bimodal histogram, which is segmented with the Otsu method into a binary image. Rows are then scanned to the left and right of the candidate path and the position of every gray-level transition pixel is recorded; in each row, the recorded pixels closest to the two sides of the candidate navigation line are summed and averaged to give a discrete navigation point, the mean of every five consecutive points is taken as a new point, and the navigation line is finally fitted by least squares.
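A simplified sketch of the segmentation-and-fitting part is given below, assuming an RGB image in a NumPy array: it keeps the ExG-ExR index, Otsu thresholding, the five-point averaging, and the least-squares line fit, but replaces the column-scan candidate-region search and the transition-pixel bookkeeping with a simple row-wise path centroid.

```python
import numpy as np
from skimage.filters import threshold_otsu

def navigation_line(rgb):
    """ExG-ExR greenness index, Otsu segmentation, and a least-squares fit of
    the row-wise centerline of the non-plant (path) pixels."""
    r, g, b = [rgb[..., i].astype(float) / 255.0 for i in range(3)]
    exg, exr = 2 * g - r - b, 1.4 * r - g          # excess green / excess red
    index = exg - exr
    plants = index > threshold_otsu(index)         # True where vegetation
    path = ~plants
    rows, cols = [], []
    for y in range(path.shape[0]):
        xs = np.nonzero(path[y])[0]
        if xs.size:                                # center of path pixels in this row
            rows.append(y)
            cols.append(xs.mean())
    # Smooth with the mean of 5 consecutive points, then fit column = a*row + b.
    cols = np.convolve(cols, np.ones(5) / 5, mode="valid")
    rows = np.asarray(rows)[2:-2]
    a, b = np.polyfit(rows, cols, 1)
    return a, b
```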
