Research on the Application of Machine Vision Technology in Orchard Automation
Abstract
To address the current problems of backward orchard planting and management techniques, high labor intensity, poor fruit quality, and excessive pesticide spraying, machine vision technology was applied to the study of fruit external-quality grading, targeted spraying of suckers, and fruit-picking robotics. The methods developed raise the level of orchard automation: they reduce labor intensity, keep field work on schedule, cut losses, and promote high-quality, high-yield fruit production. Building on a review of related research at home and abroad, this thesis focuses on pre-harvest detection of cherry external quality, recognition and location of grapevine suckers, and spatial recognition and location of apples on the tree:
     1. A pattern-recognition method for grading cherry surface color in the natural environment was studied. A mathematical model of how uniform cherry surface color varies under a given illumination was established for the first time, and a pattern-recognition classifier for cherry surface color was built on this model, solving the problem that the apparent color is affected by changes in ambient color temperature. Grading experiments under three lighting conditions, with color temperatures of 5700 K (orchard), 5400 K and 6500 K (indoors), showed a grading accuracy of 87% in the orchard and 86.7% indoors.
     2. A method for grading cherry surface color with an active light source was studied. The active light source effectively suppresses the influence of natural illumination and specular reflection on the cherry surface, and an image processing algorithm for seven-level grading of cherry surface color was proposed. Under three lighting conditions (direct sunlight at 4800 K, bright shade at 5700 K, dark shade at 7700 K), the tests showed a color grading accuracy of 93%.
     3. Methods for measuring the outer diameter of cherries were studied. The accuracy of three methods, ellipse fitting, circle fitting, and rotational search, was compared; the tests showed that ellipse fitting was the most effective, with a standard deviation of 0.64 mm.
     4. A method for real-time detection and location of grapevine suckers combining a machine vision system with a laser scanning range finder was studied, and a targeted sucker-spraying system was developed. The camera's mathematical model and self-calibration were studied, a combined camera and laser-scanner method for spatial measurement and calibration was proposed for the first time, and real-time sucker detection and location based on color, texture, and position information was investigated. Field tests at three travel speeds (1.6, 2.4, and 3.2 km/h) showed that the real-time recognition and location system locates grapevine suckers quickly and accurately, with a grapevine trunk recognition rate above 98% and an average sucker-size detection accuracy of about 80%.
     5. An intelligent vision system for an apple-picking robot was built, enabling accurate hand-eye transformation of spatial information. A combined monocular and binocular intelligent stereo vision system was proposed for the first time, consisting of an infrared and color binocular stereo vision system and a close-range monocular vision system. Three-dimensional reconstruction based on the infrared and color stereo system, apple recognition in space from color, depth, and position information, and joint calibration of the multiple vision sensors were studied. Tests showed that at a target distance of 2.4 m the apple recognition rate was 83.3% with a spatial-location standard deviation of 4.9 cm, and at 1.5 m the recognition rate was 100% with a standard deviation of 3.3 cm.
Orchard planting and management currently face a number of problems, including high labor intensity, excessive pesticide spraying, and low fruit quality. Applying machine vision to fruit quality grading, target spraying, and fruit-picking robotics is significant for improving orchard automation: it reduces labor intensity, keeps field operations on schedule, cuts losses, and promotes high fruit quality. Building on a review of research at home and abroad, this thesis studies pre-harvest cherry quality detection, real-time identification and location of suckers, and detection and spatial location of apples:
     1. Color rating of cherry fruits under natural lighting conditions. A mathematical model of how cherry skin color changes under a given illumination was established for the first time; a color classifier built on this model addresses the problem that the apparent skin color is affected by the ambient color temperature. Validation tests were conducted under different lighting conditions (5700 K in the orchard; 5400 K and 6500 K in the laboratory). The results showed that the method automatically graded cherry samples into seven pre-defined color levels under varying illumination; benchmarked against ratings made by an experienced horticulturist on the same samples, the computer rating achieved an acceptable accuracy of 87% agreement.
     2. Accurate color rating of sweet cherries in an outdoor environment using an active light source. A camera flash was used to reduce the two main obstacles to outdoor color rating: (1) inconsistent ambient light and (2) specular reflections on the cherry skin. An image processing algorithm was developed to classify cherry color into seven levels. Tests under three outdoor lighting conditions (4800 K, 5700 K, and 7700 K) showed an overall rating accuracy of 93%, validating the feasibility of accurate outdoor color rating of sweet cherries by computer vision (an illustrative grading sketch follows the abstract).
     3. Measurement of cherry fruit width, comparing three methods: ellipse fitting, circle fitting, and rotational search. Benchmarked against ruler measurements of the same samples, ellipse fitting performed best, with a standard deviation of 0.48 mm and a coefficient of determination (R²) of 0.822 (an ellipse-fitting sketch follows the abstract).
     4. A sucker recognition and location system based on a laser range finder and a color CCD camera. The camera's mathematical model and self-calibration methods were studied, and a combined spatial-measurement and calibration method for the laser scanner and camera was developed for the first time. A real-time image processing algorithm detects and locates suckers from their color, texture, and position (a projection sketch follows the abstract). Field trials at travel speeds of 1.6, 2.4, and 3.2 km/h showed that the system locates and recognizes suckers effectively: trunk identification accuracy reached 98%, and the measurement accuracy of sucker size was about 80%.
     5. An intelligent vision system designed and implemented for automatic apple picking. A combined monocular and binocular stereo vision system was developed for the first time; it consists of a binocular stereo system with infrared and color sensors and a monocular system for measuring close objects. The two vision systems were jointly calibrated so that spatial information can be transferred between them accurately. Using three-dimensional reconstruction, an image processing algorithm was developed to detect apples from color, depth, shape, and position (a triangulation sketch follows the abstract). Validation tests showed that at a sensor-to-object distance of about 2.4 m the identification accuracy was 83.3% with a depth mean square error (MSE) of 0.049 m, and at about 1.5 m the accuracy was 100% with an MSE of 2.3 cm. The results validate the feasibility of accurate detection and location with the intelligent vision system.
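The seven-level color grading in items 1 and 2 can be illustrated with a minimal sketch. This is not the thesis's classifier: a gray-world white balance stands in for the illumination/color-temperature model, the CIELAB a* channel stands in for the thesis's color features, and the cut points below are hypothetical placeholders rather than calibrated values. A binary mask of the fruit is assumed to be available from a prior segmentation step.

    import cv2
    import numpy as np

    LEVEL_EDGES = np.linspace(135.0, 175.0, 6)   # hypothetical a*-channel cut points separating levels 1..7

    def grade_color(bgr: np.ndarray, mask: np.ndarray) -> int:
        """Return a color level 1..7 from the mean CIELAB a* value over the fruit mask."""
        img = bgr.astype(np.float32)
        img *= img.mean() / (img.reshape(-1, 3).mean(axis=0) + 1e-6)   # gray-world balance against color temperature
        lab = cv2.cvtColor(np.clip(img, 0, 255).astype(np.uint8), cv2.COLOR_BGR2LAB)
        a_mean = lab[..., 1][mask > 0].mean()                          # redder (riper) skin gives a larger a*
        return int(np.searchsorted(LEVEL_EDGES, a_mean)) + 1           # bin the mean a* into one of seven levels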
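Item 3's ellipse-fitting measurement can be sketched as follows, again only as an illustration: it assumes a binary mask of a single fruit and a pixel-to-millimeter scale from calibration, and it takes the shorter ellipse axis as the fruit width, which is this sketch's convention rather than the thesis's stated definition.

    import cv2
    import numpy as np

    def cherry_width_mm(mask: np.ndarray, mm_per_px: float) -> float:
        """Fit an ellipse to the largest contour in a binary mask and return its minor axis in millimeters."""
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        if not contours:
            raise ValueError("no fruit contour found")
        fruit = max(contours, key=cv2.contourArea)           # keep the largest blob as the fruit
        (cx, cy), (d1, d2), angle = cv2.fitEllipse(fruit)    # fitted center, axis lengths (px), orientation
        return min(d1, d2) * mm_per_px

    # Synthetic check: a 100-pixel-diameter circle at 0.2 mm/px should measure about 20 mm.
    mask = np.zeros((200, 200), np.uint8)
    cv2.circle(mask, (100, 100), 50, 255, -1)
    print(round(cherry_width_mm(mask, 0.2), 1))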
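The camera-plus-laser fusion in item 4 depends on projecting each range reading into the image so that the color and texture under it can be evaluated. The sketch below assumes illustrative intrinsics K and extrinsics R, t (placeholders, not the calibration obtained in the thesis) and a planar scan expressed in a camera-like frame with the z axis along the optical axis.

    import numpy as np

    K = np.array([[800.0,   0.0, 640.0],
                  [  0.0, 800.0, 360.0],
                  [  0.0,   0.0,   1.0]])        # pinhole intrinsics (placeholder values)
    R = np.eye(3)                                # scanner-to-camera rotation (placeholder)
    t = np.array([0.0, 0.10, 0.0])               # scanner-to-camera translation in meters (placeholder)

    def scan_to_pixels(ranges_m: np.ndarray, angles_rad: np.ndarray) -> np.ndarray:
        """Project one planar laser scan into image pixel coordinates (u, v)."""
        pts_scanner = np.stack([ranges_m * np.sin(angles_rad),   # x: to the right
                                np.zeros_like(ranges_m),         # scan plane at y = 0
                                ranges_m * np.cos(angles_rad)],  # z: forward, along the optical axis
                               axis=1)
        pts_cam = pts_scanner @ R.T + t                          # rigid transform into the camera frame
        uvw = pts_cam @ K.T                                      # pinhole projection
        return uvw[:, :2] / uvw[:, 2:3]

    # Example: three readings at 1.5 m, taken straight ahead and 10 degrees to either side.
    print(scan_to_pixels(np.array([1.5, 1.5, 1.5]), np.radians([-10.0, 0.0, 10.0])))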
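Item 5's spatial location of an apple from the binocular pair reduces, for a rectified image pair, to plain triangulation; the focal length, baseline, and principal point below are made-up numbers for illustration, not the parameters of the thesis's rig.

    import numpy as np

    def locate_apple(u_left: float, v_left: float, u_right: float,
                     f_px: float, baseline_m: float, cx: float, cy: float) -> np.ndarray:
        """Return the (X, Y, Z) position in meters, in the left-camera frame, of a matched apple center."""
        disparity = u_left - u_right                  # rectified pair: same row, shifted column
        if disparity <= 0:
            raise ValueError("invalid match: disparity must be positive")
        z = f_px * baseline_m / disparity             # depth by triangulation
        x = (u_left - cx) * z / f_px                  # back-projection through the pinhole model
        y = (v_left - cy) * z / f_px
        return np.array([x, y, z])

    # Example: f = 800 px, baseline = 0.12 m, matched centers (700, 420) and (660, 420) -> a depth of about 2.4 m.
    print(locate_apple(700, 420, 660, f_px=800, baseline_m=0.12, cx=640, cy=360))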
