CN102800083A - Crop spraying positioning method based on binocular vision gridding partition matching algorithm


Info

Publication number
CN102800083A
CN102800083A
Authority
CN
China
Prior art keywords
grid
target crop
point
matching
camera
Prior art date
Legal status
Granted
Application number
CN2012102037124A
Other languages
Chinese (zh)
Other versions
CN102800083B (en)
Inventor
张宾
刘涛
郑承云
Current Assignee
China Agricultural University
Original Assignee
China Agricultural University
Priority date
Filing date
Publication date
Application filed by China Agricultural University
Priority to CN201210203712.4A
Publication of CN102800083A
Application granted
Publication of CN102800083B
Status: Expired - Fee Related


Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a crop spray positioning method, in the technical field of crop spraying, based on a binocular vision grid-division matching algorithm. The method includes: calibrating the target crop with dual cameras and acquiring images of it; obtaining binary images of the target crop from the left camera and the right camera; dividing each binary image into grids; for each point in the left grid region, performing a matching search in the right grid region to obtain mutually matching points, forming matched point pairs; computing the left-right image disparity from each matched pair and deriving the three-dimensional coordinates of the corresponding points on the target crop; deleting erroneous points and fitting the remaining points to obtain a fitted curve or surface; and planning the nozzle path from that curve or surface. The invention achieves three-dimensional contour extraction and positioning of the target crop.


Description

Crop spray positioning method based on a binocular vision grid-division matching algorithm

Technical Field

The invention belongs to the technical field of crop spraying and in particular relates to a crop spray positioning method based on a binocular vision grid-division matching algorithm.

Background Art

In agricultural production, pesticides must often be sprayed many times to control pests and diseases. Pesticide application endangers both the environment and the health of operators, especially in greenhouses, where the space is relatively enclosed and spraying is frequent, so the harm is more pronounced. An automated precision spraying system can deliver the chemical directly onto the crop surface, avoiding waste, raising application efficiency, reducing environmental pollution, protecting workers' health, and lowering labour intensity. Developing such a system therefore has important practical significance and social value.

The main problems in current automated pesticide application systems are:

1. Spray positioning is not accurate enough, and a large amount of chemical is wasted. According to the 2010 International Conference on Plant Protection Machinery and Pesticide Application Technology, the average pesticide utilisation rate in China is extremely low, only about 20%. Most pesticide is not used effectively: on one hand, application methods are insufficiently scientific; on the other, spraying is typically coarse and indiscriminate, lacking the technology and conditions for precision application.

2. Spraying is not uniform, and chemical residues on crop surfaces exceed standards, especially for greenhouse crops. The atomisation quality and the spraying procedure strongly affect uniformity. Published data show that electrostatic spraying forms tiny, well-adhering droplets, which reduces double-spraying and missed areas and improves uniformity. Anti-drift and similar technologies can also improve the spray result to some extent, but fundamentally the accuracy of spray positioning directly determines spray quality.

3. Spraying machinery has limited adaptability. For example, orchard sprayers used abroad rely on ultrasonic spray positioning, which requires the fruit trees to be planted at specific spacings and layouts: any object within the ultrasonic detection range triggers spraying, and the machine works the same way regardless of crop form. It is therefore difficult for such equipment to work effectively when the environment or the crops change.

4. Robots for automated precision spraying detect and localise targets imperfectly and with poor real-time performance. For example, robots that use visual detection to spray specific pest- or disease-affected areas typically rely on complex positioning algorithms that require considerable computation time, and detection of the target crops also carries a certain error rate.

5. Crop spraying systems are essentially two-dimensional positioning systems. In operation they first detect the spray target's two-dimensional information with a specific sensor or camera, then move the nozzle to a given position or switch multiple nozzles on and off, while the nozzle-to-crop distance is set in advance and never adjusted during operation. Differences in crop shape and size therefore produce inconsistent spray results.

6. Cost-effectiveness also restricts the wide adoption of automated spraying systems. However, with the continued growth of protected agriculture in scale and technology, and the rising labour costs that accompany an ageing society, automated spraying has broad application prospects.

In summary, the most important current problem is spray-target recognition and positioning: developing a spray positioning system with good adaptability, accurate positioning, good real-time performance, and high cost-effectiveness.

At present, the main methods for acquiring the three-dimensional position of objects are laser, ultrasonic, radar, infrared, and binocular vision. The first four usually compute distance from the time of flight or phase difference of a reflected wave, while binocular vision uses the triangulation principle, obtaining position information by matching left and right images. The advantage of binocular vision positioning is its wide applicability: with suitable algorithms it can directly recognise and locate the target. Its drawbacks are that the recognition and positioning algorithms tend to be complex, with poor real-time performance and robustness, particularly for irregularly shaped objects, cluttered environments, and poor lighting. In addition, multi-sensor fusion, combining vision with laser, infrared, or ultrasonic data, can improve positioning accuracy and reliability to some extent.

The spray positioning method proposed by the invention belongs to the category of visual positioning: a binocular vision system detects the position of the target crop. In current binocular vision positioning methods, the main problem urgently needing a solution is how to recognise the target and determine its position and contour quickly, stably, accurately, and reliably.

Summary of the Invention

The object of the present invention is to provide a crop spray positioning method based on a binocular vision grid-division matching algorithm, which quickly and accurately computes the position of the target crop and, from that position, plans the nozzle path so that the nozzle sprays at a suitable distance and angle.

To achieve this object, the technical solution provided by the invention is a crop spray positioning method based on a binocular vision grid-division matching algorithm, characterised in that the method comprises:

Step 1: Calibrate the target crop with dual cameras and acquire images of it; the two cameras are denoted the left camera and the right camera.

Step 2: Separate the target crop from the background in the left and right camera images, obtaining a binary image of the target crop for each camera.

Step 3: Divide both binary images into grids, obtaining a left grid region and a right grid region.

Step 4: For each point in the left grid region, perform a matching search in the right grid region to find mutually matching points, forming matched point pairs.

Step 5: From each matched pair, compute the left-right image disparity and derive the three-dimensional coordinates of the corresponding point on the target crop.

Step 6: Analyse the three-dimensional coordinates of the target-crop points and delete erroneous points.

Step 7: Fit the remaining points to obtain a fitted curve or surface.

Step 8: Plan the nozzle path from the fitted curve or surface.
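The eight steps above can be sketched as a single driver function. This is a minimal illustrative skeleton, not the patent's implementation: each processing stage is injected as a callable, since the patent defines them separately, and all names are hypothetical.

```python
def locate_and_plan(left_img, right_img, *, segment, gridify, match, to_3d,
                    is_valid, fit, plan):
    """Skeleton of steps 2-8 of the claimed pipeline (step 1, camera
    calibration and image capture, is assumed to have produced the two
    input images). All stage functions are caller-supplied."""
    # Step 2: separate the target crop from the background (binary images).
    left_bin, right_bin = segment(left_img), segment(right_img)
    # Step 3: grid division of both binary images.
    left_grid, right_grid = gridify(left_bin), gridify(right_bin)
    # Step 4: matching search -> matched point pairs.
    pairs = match(left_grid, right_grid)
    # Step 5: disparity -> 3D coordinates, one point per pair.
    points = [to_3d(pair) for pair in pairs]
    # Step 6: delete erroneous points.
    points = [p for p in points if is_valid(p)]
    # Step 7: fit a curve/surface; step 8: plan the nozzle path from it.
    return plan(fit(points))
```

With trivial stand-ins for each stage, the function simply threads the data through in the claimed order.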

Separating the target crop from the background in the left/right camera image to obtain its binary image comprises:

Step 101: In the HSI colour space, obtain a preliminary segmented image of the target crop by fixed-threshold segmentation.

Step 102: Obtain a grayscale image of the target crop with the excess-green algorithm.

Step 103: Compute the histogram of the grayscale image of the target crop.

Step 104: Smooth the histogram with the near-neighbour multi-point averaging method.

Step 105: Search the smoothed histogram for its peak and locate the valleys on either side of the peak, thereby obtaining the binary image of the target crop.

Said step 4 comprises:

Step 201: Initialise parameters: j=1, wj=1, Min=10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of the successfully matched point in the right grid region, and Min records the minimum of the corrected sum of absolute differences between the left and right grids.

Step 202: Select a point (u,i) in the left grid region, denote the left grid containing it as p, and compute the grey-level mean Mi and variance Ei(u,i) of grid p.

Step 203: Select a point (u,j) in the right grid region with the same abscissa as the point (u,i) selected in the left grid region, denote the right grid containing it as q, and compute the grey-level mean Nj and variance Fj(u,j) of grid q.

Step 204: Test whether |Mi−Nj|<ε holds. If it does, go to step 205; otherwise set j=j+1 and return to step 203, where ε is a preset value.

Step 205: Compute the sum of absolute differences between the left grid p and the right grid q, denoted SAD, and the absolute difference between the variance Ei(u,i) of grid p and the variance Fj(u,j) of grid q, denoted FCC.

Step 206: Compute the corrected sum of absolute differences of grids p and q by SF=SAD×a+FCC×b, where a and b are proportion parameters.

Step 207: If the corrected sum SF of grids p and q is less than Min, go to step 208; otherwise set j=j+1 and return to step 203.

Step 208: Set Min=SF and wj=j. If j has passed through all epipolar points, go to step 209; otherwise set j=j+1 and return to step 203.

Step 209: The point (u,wj) in the right grid region is the matching point of the point (u,i) in the left grid region.

The ratio of the proportion parameters a and b is 1:1.

Said step 5 comprises:

Step 301: Compute the left-right image disparity D = XL − XR, where XL is the abscissa of one point of the matched pair and XR is the abscissa of the other.

Step 302: Compute the three-dimensional coordinates of the corresponding point on the target crop as

    xc = B·XL / D,    yc = B·Y / D,    zc = B·f / D,

where xc, yc and zc are the three-dimensional coordinates of the point, B is the distance between the optical axes of the two cameras (the baseline), XL is the abscissa of the left-image point of the pair, Y is the (common) ordinate of the pair, and f is the focal length of the left or right camera.

The invention adopts a refined adaptive-threshold segmentation method, improving the robustness and accuracy of adaptive segmentation; grid division reduces the overall amount of computation; and in the image-matching stage of binocular positioning, an improved SAD algorithm greatly reduces the mismatches that a plain SAD algorithm is prone to, while being computationally faster than the maximum-correlation-coefficient method.

Brief Description of the Drawings

Figure 1 is the flow chart of the crop spray positioning method based on the binocular vision grid-division matching algorithm;

Figure 2 is a schematic diagram of calibrating the target crop with dual cameras, where (a) uses the left camera and (b) uses the right camera;

Figure 3 is the preliminary segmented image of the target crop obtained by fixed-threshold segmentation;

Figure 4 is a schematic diagram of the grid division of the target-crop binary image acquired by the left camera;

Figure 5 is the flow chart of forming matched point pairs;

Figure 6 is the schematic diagram of the parallel-optical-axis dual-camera geometry.

Detailed Description of the Embodiments

The preferred embodiments are described in detail below with reference to the accompanying drawings. It should be emphasised that the following description is only exemplary and is not intended to limit the scope of the invention or its application.

Figure 1 is the flow chart of the crop spray positioning method based on the binocular vision grid-division matching algorithm. As shown in Figure 1, the method provided by the invention comprises:

Step 1: Calibrate the target crop with dual cameras and acquire images of it; the two cameras are denoted the left camera and the right camera.

When installing the left and right cameras, their optical axes must be kept parallel to each other and in the same horizontal plane. Figure 2 is a schematic diagram of calibrating the target crop with dual cameras.

Step 2: Separate the target crop from the background in the left and right camera images, obtaining a binary image of the target crop for each camera.

The left camera is taken as an example below to describe how its binary image of the target crop is obtained.

Step 101: In the HSI colour space, obtain a preliminary segmented image of the target crop by fixed-threshold segmentation.

In this step, the target-crop image acquired by the left camera (an RGB image) is first converted to an HSI image and normalised.

Next, value ranges are set for H (hue), S (saturation) and I (intensity); pixels outside these ranges are set to black, and the rest keep their original values.

The retained pixels are then converted back from HSI to RGB and de-normalised, yielding the preliminary segmented image of the target crop. Figure 3 shows a preliminary segmented image obtained by fixed-threshold segmentation.
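Step 101 can be sketched per pixel as follows. The RGB→HSI conversion uses the standard geometric HSI formulas; the threshold ranges shown are illustrative placeholders, since the patent does not disclose its fixed values:

```python
import math

def rgb_to_hsi(r, g, b):
    """Normalised RGB in [0, 1] -> (H in degrees, S, I), standard HSI model."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    h = 0.0 if den == 0 else math.degrees(
        math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:                      # hue lies in the lower half-circle
        h = 360.0 - h
    return h, s, i

def keep_pixel(rgb, h_range=(60, 180), s_min=0.1, i_range=(0.05, 0.95)):
    """Fixed-threshold test of step 101: True if the pixel's H/S/I values
    fall inside the set ranges (illustrative values, not the patent's);
    pixels failing the test are set to black in the preliminary image."""
    h, s, i = rgb_to_hsi(*rgb)
    return (h_range[0] <= h <= h_range[1]
            and s >= s_min
            and i_range[0] <= i <= i_range[1])
```

A saturated green pixel passes the default green-hue window, while a pure red pixel is rejected.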

Step 102: Obtain a grayscale image of the target crop with the excess-green algorithm.

The excess-green algorithm increases the contrast between green vegetation and a non-green background by boosting the weight of the green channel. It extracts the information of green crops well and is widely used in green-crop image processing. Each pixel is mapped to 2×G−R−B, where G, R and B are the pixel's green, red and blue channel values. This processing yields the grayscale image of the target crop.
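The 2×G−R−B mapping of step 102 is a one-liner per pixel; the clamping to an 8-bit range is a common convention added here for illustration:

```python
def excess_green(r, g, b):
    """Excess-green (ExG) index 2G - R - B used in step 102."""
    return 2 * g - r - b

def exg_image(rgb_rows):
    """Apply ExG per pixel and clamp to [0, 255] for an 8-bit grayscale
    image (the clamp is an assumption; the patent only states 2*G-R-B)."""
    return [[max(0, min(255, excess_green(*px))) for px in row]
            for row in rgb_rows]
```

Green-dominant pixels map to large positive values, non-green pixels to small or negative ones, which is what makes the subsequent histogram valley search workable.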

Step 103: Compute the histogram of the grayscale image of the target crop.

Step 104: Smooth the histogram with the near-neighbour multi-point averaging method.

Near-neighbour multi-point averaging handles the case where the histogram shows a local dip while the overall trend has not yet reached a true valley. The excess-green image, with the background zeroed by the preliminary segmentation, is histogrammed with the grey level 0 (background) removed. When the pixel count at a single grey level is merely a local minimum, that level is not necessarily a valley position, so using the average over several adjacent grey levels as the valley test effectively avoids the influence of local minima on the segmentation result. Concretely, the neighbourhood of a candidate minimum is divided into three equal intervals (left neighbourhood, centre, right neighbourhood), the mean histogram count of each interval is computed, and the numerical pattern decides whether the position is a genuine valley. For example, when searching for the left valley, if the three interval means increase from left to right (left interval smallest, right interval largest), the search should continue moving left for a new valley, avoiding the local minimum.

Step 105: Search the smoothed histogram for its peak and locate the valleys on either side of the peak, thereby obtaining the binary image of the target crop.

Valleys are searched on both sides of the peak; once found, the grey levels between the two valleys define the binary image of the target crop.
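Steps 104-105 can be sketched on a plain histogram (a list of bin counts). The smoothing below is a simple moving average standing in for the near-neighbour multi-point method, and the valley search takes the lowest bin on each side of the global peak; both are simplifications of the three-interval test described above:

```python
def smooth(hist, k=2):
    """Moving average over up to 2k+1 neighbouring bins (a stand-in for
    the near-neighbour multi-point averaging of step 104)."""
    n = len(hist)
    out = []
    for i in range(n):
        window = hist[max(0, i - k):min(n, i + k + 1)]
        out.append(sum(window) / len(window))
    return out

def valleys_around_peak(hist):
    """Step 105 sketch: find the global peak, then the lowest bin on each
    side as the left/right valley thresholds; returns their indices."""
    p = hist.index(max(hist))
    left = hist.index(min(hist[:p + 1]))
    right = p + hist[p:].index(min(hist[p:]))
    return left, right
```

Grey levels strictly between the two returned indices would then be kept as target-crop pixels in the binarisation.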

Step 3: Divide the binary images acquired by the left and right cameras into grids, obtaining a left grid region and a right grid region.

Figure 4 is a schematic diagram of the grid division of the binary image acquired by the left camera, again taking the left camera as the example. The search for target pixels starts at the upper-left corner of the image; when a target-crop pixel is found, the first grid is placed. Grids then continue to be laid around it: if target-crop pixels exist around an existing grid, another grid is placed. For target pixels at the image edge, where a full grid no longer fits, the grid is added anchored from the opposite direction. This guarantees that every target pixel falls into a grid of the same size.
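A simplified sketch of this grid division: instead of growing grids outward from the first hit, the version below tiles the whole binary image and keeps only cells containing target pixels, re-anchoring ragged edge cells from the far border so every kept cell has the full size, as the text requires. It assumes the image is at least one cell wide and tall:

```python
def grid_cells(binary, s):
    """Return the 0-based (row, col) top-left corner of every s*s cell of
    the binary image that contains at least one target (nonzero) pixel.
    Edge cells that would spill past the border are re-anchored backwards
    from the opposite side, so all kept cells are exactly s*s."""
    h, w = len(binary), len(binary[0])
    cells = []
    for r0 in range(0, h, s):
        for c0 in range(0, w, s):
            r, c = min(r0, h - s), min(c0, w - s)   # re-anchor at the edge
            block = [binary[r + dr][c + dc]
                     for dr in range(s) for dc in range(s)]
            if any(block) and (r, c) not in cells:
                cells.append((r, c))
    return cells
```

The later matching search then only visits these occupied cells, which is the computational saving the patent attributes to grid division.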

Step 4: For each point in the left grid region, perform a matching search in the right grid region to find mutually matching points, forming matched point pairs.

Figure 5 is the flow chart of forming matched point pairs. As shown in Figure 5, the process is as follows:

Step 201: Initialise parameters: j=1, wj=1, Min=10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of the successfully matched point in the right grid region, and Min records the minimum of the corrected sum of absolute differences between the left and right grids.

Step 202: Select a point (u,i) in the left grid region, denote the left grid containing it as p, and compute the grey-level mean Mi and variance Ei(u,i) of grid p.

The grey-level mean Mi of the left grid p is

    Mi = ḡ(u,i) = (1/mn) Σ_{x=1..m} Σ_{y=1..n} g(x+u−1, y+i−1)    (1)

and the variance Ei(u,i) of the left grid p is

    Ei(u,i) = (1/mn) Σ_{x=1..m} Σ_{y=1..n} [g(x+u−1, y+i−1) − ḡ(u,i)]²    (2)

In equations (1) and (2), u and i are the abscissa and ordinate of the point (u,i), m and n are the numbers of horizontal and vertical pixels of the left grid p, and g(x,y) gives the pixel values of the left grid p.
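Equations (1) and (2) are the ordinary mean and (population) variance over the grid's pixels. A minimal sketch, rewritten with 0-based row/column indices in place of the patent's 1-based (u,i) offsets:

```python
def grid_stats(img, r0, c0, m, n):
    """Grey-level mean and variance of the m-by-n grid whose top-left pixel
    is (r0, c0), i.e. equations (1)-(2) with 0-based indexing; img[r][c]
    plays the role of the pixel function g."""
    vals = [img[r0 + x][c0 + y] for x in range(m) for y in range(n)]
    mean = sum(vals) / (m * n)
    var = sum((v - mean) ** 2 for v in vals) / (m * n)
    return mean, var
```

The same routine computes Nj and Fj(u,j) for the right grid in step 203, with the right image passed as `img`.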

Step 203: Select a point (u,j) in the right grid region with the same abscissa as the point (u,i) selected in the left grid region, denote the right grid containing it as q, and compute the grey-level mean Nj and variance Fj(u,j) of grid q.

The grey-level mean Nj of the right grid q is

    Nj = f̄(u,j) = (1/mn) Σ_{x=1..m} Σ_{y=1..n} f(x+u−1, y+j−1)    (3)

and the variance Fj(u,j) of the right grid q is

    Fj(u,j) = (1/mn) Σ_{x=1..m} Σ_{y=1..n} [f(x+u−1, y+j−1) − f̄(u,j)]²    (4)

In equations (3) and (4), u and j are the abscissa and ordinate of the point (u,j), m and n are the numbers of horizontal and vertical pixels of the right grid q, and f(x,y) gives the pixel values of the right grid q.

Step 204: Test whether |Mi−Nj|<ε holds. If it does, the left grid p and the right grid q are approximately equal, and step 205 can be executed; otherwise set j=j+1 and return to step 203, where ε is a preset value.

Step 205: Compute the sum of absolute differences between the left grid p and the right grid q, denoted SAD, and the absolute difference between the variance Ei(u,i) of grid p and the variance Fj(u,j) of grid q, denoted FCC.

The sum of absolute differences between the left grid p and the right grid q is computed as

SAD = Σ(c=1..m) Σ(r=1..n) |fp(c, r) − fq(c, r)|   (5)

In formula (5), c and r are the abscissa and ordinate of a pixel relative to the point (u, i) or the point (u, j), and fp(c, r) and fq(c, r) are the gray values of the pixels in the left grid p and the right grid q, respectively.

The absolute difference between the variance Ei(u, i) of the left grid p and the variance Fj(u, j) of the right grid q is computed as

FCC = |Ei(u, i) − Fj(u, j)|   (6)

Step 206: Compute the corrected sum of absolute differences of the left grid p and the right grid q according to the formula SF = SAD × a + FCC × b, where a and b are weighting parameters. The parameters a and b determine the relative influence of the sum-of-absolute-differences term and the variance term on the final score. The ratio of a to b is generally set to 1:1.

Step 207: If the corrected sum of absolute differences SF of the left grid p and the right grid q is less than Min, execute step 208; otherwise, set j = j + 1 and return to step 203.

Step 208: Set Min = SF and wj = j. Determine whether j has traversed all epipolar points; if so, execute step 209; otherwise, set j = j + 1 and return to step 203.

For a parallel-axis binocular system, if the plane defined by the optical axes of the two cameras is kept horizontal relative to the world coordinate system, the epipolar constraint reduces to the same image row in the left and right images: for a given point in space, its corresponding image points in the left and right cameras have equal vertical coordinates and differ only in the horizontal pixel coordinate. For this reason, the search for matching points in the present invention is performed along the epipolar line. Since the matching point must lie on the epipolar line, both the efficiency and the accuracy of the matching algorithm are improved.

Step 209: The point (u, wj) in the right grid region is the matching point of the point (u, i) in the left grid region.
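Steps 201 to 209 amount to a one-dimensional search along the epipolar row. The following Python sketch illustrates the flow under stated assumptions: 0-based NumPy arrays, an 8×8 grid, an illustrative ε, and the default ratio a = b = 1. It mirrors the pre-test of step 204 as printed in the patent (|Mi − Fj| < ε) and is not the patent's actual code.

```python
import numpy as np

def match_along_epipolar(left, right, u, i, m=8, n=8, eps=50.0, a=1.0, b=1.0):
    """Find the right-image ordinate wj matching left-grid point (u, i).
    Implements steps 201-209: a search along the epipolar row with the
    pre-test of step 204 and the corrected score SF = SAD*a + FCC*b."""
    p = left[u:u + m, i:i + n].astype(float)       # left grid p
    Mi, Ei = p.mean(), p.var()                     # step 202
    best_sf, wj = np.inf, None                     # Min initialization
    for j in range(right.shape[1] - n + 1):        # all epipolar points
        q = right[u:u + m, j:j + n].astype(float)  # right grid q, same rows
        Fj = q.var()                               # step 203
        if abs(Mi - Fj) >= eps:                    # step 204 pre-test, as printed
            continue
        sad = np.abs(p - q).sum()                  # (5) sum of absolute differences
        fcc = abs(Ei - Fj)                         # (6) variance difference
        sf = sad * a + fcc * b                     # step 206 corrected score
        if sf < best_sf:                           # steps 207-208
            best_sf, wj = sf, j
    return wj                                      # step 209: match is (u, wj)
```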

Step 5: Compute the left-right image disparity from each matching point pair and obtain the three-dimensional coordinates of the target-crop point corresponding to the pair.

Figure 6 is a schematic diagram of a dual-camera setup with parallel optical axes. In Figure 6, the optical axes of the left and right cameras are parallel, and the distance between them is the baseline length B. Let the optical center OL of the left camera be the origin of the world coordinate system, i.e., the world coordinate system coincides with the left-camera coordinate system. The optical axis OLZL is perpendicular to the imaging plane of the left camera at the point O1, and the optical axis ORZR is perpendicular to the imaging plane of the right camera at the point O2. The two cameras view the same spatial point P simultaneously, yielding the image points PL and PR on the left and right cameras with coordinates PL(XL, YL) and PR(XR, YR); the disparity is then D = XL − XR.

Using the formulas

xc = B · XL / D
yc = B · Y / D        (7)
zc = B · f / D

the three-dimensional coordinates of the target-crop point corresponding to a matching pair can be computed. Here xc, yc, and zc are the three-dimensional coordinates of the target-crop point, B is the distance between the optical axes of the two cameras, XL is the abscissa of the point acquired by the left camera in the matching pair, and Y is the ordinate of either point of the pair. Because the two cameras form a parallel-optical-axis system, the Y-axis coordinates of the left and right cameras are numerically identical, i.e., Y = YL = YR. f is the focal length of the left (or right) camera.
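Formula (7) is standard parallel-axis triangulation. A minimal sketch follows; the baseline and focal-length values used in the usage example are hypothetical, not from the patent.

```python
def triangulate(xl, xr, y, baseline, focal):
    """Recover the 3-D point from a matched pair per formula (7).
    Coordinates are expressed in the left-camera frame."""
    d = xl - xr                      # disparity D = XL - XR
    if d == 0:
        raise ValueError("zero disparity: point at infinity")
    xc = baseline * xl / d           # xc = B * XL / D
    yc = baseline * y / d            # yc = B * Y  / D
    zc = baseline * focal / d        # zc = B * f  / D
    return xc, yc, zc
```

For example, with an assumed baseline of 0.1 m and focal length of 500 pixels, a pair with XL = 20, XR = 10, Y = 5 gives a depth of zc = 0.1 × 500 / 10 = 5 m.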

Step 6: Analyze the three-dimensional coordinates of the target-crop points and delete erroneous points.

Erroneous points are deleted mainly according to constraints: the order of corresponding matching points in the binary images acquired by the left and right cameras should be consistent, and points violating this ordering are treated as mismatches. In addition, any point whose computed coordinates fall outside a given range is treated as a mismatch and removed.
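The two constraints just described, ordering consistency and a coordinate range check, can be sketched as a simple filter. This is an illustrative Python sketch; the depth range is an assumed example value, and the patent does not specify the data layout used here.

```python
def filter_matches(pairs, z_range=(0.2, 5.0)):
    """Remove erroneous matches.

    `pairs` is a list of (xl, xr, point3d) tuples sorted by xl.
    Ordering constraint: xr must increase with xl, so the left and
    right images list corresponding points in the same order.
    Range constraint: the computed depth must lie inside z_range.
    """
    kept, last_xr = [], float("-inf")
    for xl, xr, (xc, yc, zc) in pairs:
        if xr <= last_xr:                            # ordering violated
            continue
        if not (z_range[0] <= zc <= z_range[1]):     # coordinate out of range
            continue
        kept.append((xl, xr, (xc, yc, zc)))
        last_xr = xr
    return kept
```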

Step 7: Fit the target-crop points to obtain a fitted curve or surface.

The spatial discrete points obtained in step 6 are fitted with a specific spatial curve (or surface) model. For example, if the basic shape of the target crop is a spherical cap, a circular-arc (spherical) equation is used as the crop model, and the parameters of the best-fit curve (surface) equation are obtained by the least-squares method.
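For a spherical-cap canopy, the least-squares sphere fit mentioned above can be linearized: expanding (x − a)² + (y − b)² + (z − c)² = r² gives a system that is linear in a, b, c and k = r² − a² − b² − c². The sketch below uses that standard formulation, which is an assumption; the patent does not specify its solver.

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit. `points` is an (N, 3) array of 3-D
    crop points; returns the center (a, b, c) and radius r."""
    pts = np.asarray(points, dtype=float)
    # Linear system: 2a*x + 2b*y + 2c*z + k = x^2 + y^2 + z^2,
    # with k = r^2 - a^2 - b^2 - c^2.
    A = np.column_stack([2 * pts, np.ones(len(pts))])
    rhs = (pts ** 2).sum(axis=1)
    (a, b, c, k), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(k + a * a + b * b + c * c)
    return np.array([a, b, c]), r
```

The recovered center and radius then define the fitted surface along which the nozzle path of step 8 can be planned.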

Step 8: Plan the nozzle path according to the fitted curve or surface.

The nozzle path is planned according to the fitted curve or surface obtained in step 7, so that the nozzle sprays at the ideal spraying distance and angle.

The invention achieves extraction and positioning of the three-dimensional contour of the target crop. The method is accurate in positioning and moderate in computational cost, meeting the real-time requirements of spraying, and offers high flexibility and adaptability during operation. It provides an effective means of realizing precise automated spraying, reducing environmental pollution, and improving pesticide utilization.

The above is only a preferred embodiment of the present invention, and the scope of protection of the present invention is not limited thereto. Any change or replacement that a person skilled in the art could readily conceive within the technical scope disclosed herein shall be covered by the scope of protection of the present invention. Therefore, the scope of protection of the present invention shall be determined by the scope of the claims.

Claims (5)

1. A crop spraying positioning method based on a binocular-vision grid-partition matching algorithm, characterized in that the method comprises:
Step 1: calibrating the target crop with two cameras, denoted the left camera and the right camera respectively, and acquiring images of the target crop;
Step 2: separating the image of the target crop acquired by the left camera and the image of the target crop acquired by the right camera from the background, obtaining a binary image of the target crop for the left camera and a binary image of the target crop for the right camera;
Step 3: partitioning the binary images of the target crop acquired by the left and right cameras into grids, obtaining a left grid region and a right grid region;
Step 4: for each point in the left grid region, performing a matching search in the right grid region to obtain mutually matching points, forming matching point pairs;
Step 5: computing the left-right image disparity from each matching point pair and obtaining the three-dimensional coordinates of the target-crop point corresponding to the pair;
Step 6: analyzing the three-dimensional coordinates of the target-crop points and deleting erroneous points;
Step 7: fitting the target-crop points to obtain a fitted curve or surface;
Step 8: planning the nozzle path according to the fitted curve or surface.

2. The method according to claim 1, characterized in that separating the image of the target crop acquired by the left/right camera from the background to obtain the binary image of the target crop comprises:
Step 101: in the HSI color space, obtaining a preliminary segmented image of the target crop by fixed-threshold segmentation;
Step 102: obtaining a grayscale image of the target crop using the excess-green algorithm;
Step 103: computing the histogram of the grayscale image of the target crop;
Step 104: smoothing the histogram of the target crop using nearest-neighbor multi-point averaging;
Step 105: searching for the peak of the smoothed histogram of the target crop and computing the valley positions on both sides of the peak, thereby obtaining the binary image of the target crop.

3. The method according to claim 1, characterized in that step 4 comprises:
Step 201: initializing parameters, setting j = 1, wj = 1, Min = 10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of a successfully matched point in the right grid region, and Min records the minimum corrected sum of absolute differences between a left grid and a right grid;
Step 202: selecting a point (u, i) in the left grid region, denoting the left grid containing it as p, and computing the gray mean Mi and variance Ei(u, i) of the left grid p;
Step 203: selecting a point (u, j) in the right grid region whose abscissa equals that of the selected point (u, i), denoting the right grid containing it as q, and computing the gray mean Nj and variance Fj(u, j) of the right grid q;
Step 204: determining whether |Mi − Fj| < ε holds; if so, executing step 205; otherwise, setting j = j + 1 and returning to step 203, where ε is a preset value;
Step 205: computing the sum of absolute differences between the left grid p and the right grid q, denoted SAD, and the absolute difference between the variance Ei(u, i) of the left grid p and the variance Fj(u, j) of the right grid q, denoted FCC;
Step 206: computing the corrected sum of absolute differences of the left grid p and the right grid q according to the formula SF = SAD × a + FCC × b, where a and b are weighting parameters;
Step 207: if the corrected sum of absolute differences of the left grid p and the right grid q is less than Min, executing step 208; otherwise, setting j = j + 1 and returning to step 203;
Step 208: setting Min = SF and wj = j; determining whether j has traversed all epipolar points; if so, executing step 209; otherwise, setting j = j + 1 and returning to step 203;
Step 209: taking the point (u, wj) in the right grid region as the matching point of the point (u, i) in the left grid region.

4. The method according to claim 3, characterized in that the ratio of the weighting parameters a and b is 1:1.

5. The method according to claim 1, characterized in that step 5 comprises:
Step 301: computing the left-right image disparity using the formula D = XL − XR, where XL is the abscissa of one point of a matching pair and XR is the abscissa of the other point of the pair;
Step 302: computing the three-dimensional coordinates of the target-crop point corresponding to the matching pair using the formulas xc = B · XL / D, yc = B · Y / D, zc = B · f / D, where xc, yc, and zc are the three-dimensional coordinates of the target-crop point, B is the distance between the optical axes of the two cameras, XL is the abscissa of one point of the matching pair, Y is the ordinate of either point of the pair, and f is the focal length of the left or right camera.
CN201210203712.4A 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm Expired - Fee Related CN102800083B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210203712.4A CN102800083B (en) 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm


Publications (2)

Publication Number Publication Date
CN102800083A true CN102800083A (en) 2012-11-28
CN102800083B CN102800083B (en) 2014-12-10

Family

ID=47199181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210203712.4A Expired - Fee Related CN102800083B (en) 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm

Country Status (1)

Country Link
CN (1) CN102800083B (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008082870A (en) * 2006-09-27 2008-04-10 Setsunan Univ Image processing program, and road surface state measuring system using this
CN101312593A (en) * 2007-05-25 2008-11-26 中兴通讯股份有限公司 Access control method of private base station
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
TANGFEI TAO ET AL: "A Fast Block Matching Algorithm for Stereo Correspondence", 《CYBERNETICS AND INTELLIGENT SYSTEMS, 2008 IEEE CONFERENCE ON》 *
QI LIYONG: "Research on Key Vision Technologies and System of a Cucumber Harvesting Robot", 《CHINA MASTER'S THESES FULL-TEXT DATABASE, INFORMATION SCIENCE AND TECHNOLOGY》 *
JIANG HUANYU ET AL: "Application of Binocular Stereo Vision Technology in Fruit and Vegetable Harvesting Robots", 《JOURNAL OF JIANGSU UNIVERSITY (NATURAL SCIENCE EDITION)》 *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530643A (en) * 2013-10-11 2014-01-22 中国科学院合肥物质科学研究院 Pesticide positioned spraying method and system on basis of crop interline automatic identification technology
CN103988824A (en) * 2014-04-18 2014-08-20 浙江大学 Automatic targeting and spraying system based on binocular vision technology
CN103971367B (en) * 2014-04-28 2017-01-11 河海大学 Hydrologic data image segmenting method
CN104615150A (en) * 2014-12-17 2015-05-13 中国科学院合肥物质科学研究院 Machine vision based adaptive precise mist spray device and method
CN104615150B (en) * 2014-12-17 2017-10-03 中国科学院合肥物质科学研究院 A kind of adaptive accurate spraying apparatus and method based on machine vision
CN104604833A (en) * 2015-02-09 2015-05-13 聊城大学 Fall webworm larva net curtain pesticide spraying robot mechanical system
CN106530281B (en) * 2016-10-18 2019-04-09 国网山东省电力公司电力科学研究院 Fuzzy judgment method and system for unmanned aerial vehicle images based on edge features
CN106530281A (en) * 2016-10-18 2017-03-22 国网山东省电力公司电力科学研究院 Edge feature-based unmanned aerial vehicle image blur judgment method and system
WO2018076776A1 (en) * 2016-10-25 2018-05-03 深圳光启合众科技有限公司 Robot, robotic arm and control method and device thereof
CN107255446A (en) * 2017-08-01 2017-10-17 南京农业大学 A kind of Cold region apple fruit tree canopy three-dimensional map constructing system and method
CN107255446B (en) * 2017-08-01 2020-01-07 南京农业大学 A system and method for constructing a three-dimensional map of a dwarf densely planted fruit tree canopy
CN107593200A (en) * 2017-10-31 2018-01-19 河北工业大学 A kind of trees plant protection system and method based on visible ray infrared technique
CN107593200B (en) * 2017-10-31 2022-05-27 河北工业大学 A tree plant protection system and method based on visible light-infrared technology
CN107992868A (en) * 2017-11-15 2018-05-04 辽宁警察学院 A kind of High Precision Stereo footprint Quick Acquisition method
CN109059869A (en) * 2018-07-27 2018-12-21 仲恺农业工程学院 Method for detecting spraying effect of plant protection unmanned aerial vehicle on fruit trees
CN109059869B (en) * 2018-07-27 2020-07-21 仲恺农业工程学院 A method for detecting the spraying effect of plant protection drones on fruit trees
CN111699361B (en) * 2018-09-07 2022-05-27 深圳配天智能技术研究院有限公司 Method and device for measuring distance
CN111699361A (en) * 2018-09-07 2020-09-22 深圳配天智能技术研究院有限公司 Method and device for measuring distance
WO2020047863A1 (en) * 2018-09-07 2020-03-12 深圳配天智能技术研究院有限公司 Distance measurement method and apparatus
CN109699623A (en) * 2019-02-27 2019-05-03 西安交通大学 A kind of Multifunctional plant protection machine people's system
CN110191330A (en) * 2019-06-13 2019-08-30 内蒙古大学 FPGA implementation method and system for depth map based on binocular vision green crop video stream
CN112167212A (en) * 2019-07-02 2021-01-05 上海临石信息科技有限公司 Unmanned aerial vehicle pesticide spraying control system and method
CN110584962A (en) * 2019-08-28 2019-12-20 西安工业大学 Combined obstacle-detection intelligent blind-guiding system
CN111762086A (en) * 2019-12-19 2020-10-13 广州极飞科技有限公司 Spraying control method, device and system and carrier
CN111109240A (en) * 2020-01-03 2020-05-08 东北农业大学 A multi-information fusion variable spraying method and device
CN111109240B (en) * 2020-01-03 2023-09-29 东北农业大学 Multi-information fusion variable spraying device
US11651507B2 (en) 2020-02-14 2023-05-16 Inspur Suzhou Intelligent Technology Co., Ltd. Content-adaptive binocular matching method and apparatus
WO2021159717A1 (en) * 2020-02-14 2021-08-19 苏州浪潮智能科技有限公司 Content self-adaptive binocular matching method and device
CN111953933B (en) * 2020-07-03 2022-07-05 北京中安安博文化科技有限公司 Method, device, medium and electronic equipment for determining fire area
CN111953933A (en) * 2020-07-03 2020-11-17 北京中安安博文化科技有限公司 Method, device, medium and electronic equipment for determining fire area
CN111990378A (en) * 2020-08-25 2020-11-27 淮阴工学院 Spraying control method for spraying robot
CN112889786A (en) * 2021-01-15 2021-06-04 吉林农业大学 Pesticide spraying system capable of tracking field crop seedling areas in real time and control method
CN119235497A (en) * 2024-12-05 2025-01-03 四川德仁源农业科技有限公司 Monitoring system and method based on medicinal stiff silkworm

Also Published As

Publication number Publication date
CN102800083B (en) 2014-12-10

Similar Documents

Publication Publication Date Title
CN102800083B (en) Crop spraying positioning method based on binocular vision gridding partition matching algorithm
Chen et al. Three-dimensional perception of orchard banana central stock enhanced by adaptive multi-vision technology
CN102688823B (en) Atomizing positioning device and method based on hand-eye atomizing mechanical arm
Tang et al. Weed detection using image processing under different illumination for site-specific areas spraying
CN103891697A (en) Drug spraying robot capable of moving indoors autonomously and variable drug spraying method thereof
CN104400265B (en) A kind of extracting method of the welding robot corner connection characteristics of weld seam of laser vision guiding
CN103394430A (en) Inter-sheet dead area optimization process based uniform-spraying manufacturing method for complex curved surface
CN107688779A (en) A kind of robot gesture interaction method and apparatus based on RGBD camera depth images
CN103488991B (en) A kind of leading line extraction method for crop field weed control equipment
KR101109337B1 (en) Automatic pest recognition and control system and method
CN104615150B (en) A kind of adaptive accurate spraying apparatus and method based on machine vision
CN104299246B (en) Production line article part motion detection and tracking based on video
US12172175B2 (en) Variable-rate spray control system based on annular application structure and tree canopy volume calculation method thereof
Li et al. Identification of the operating position and orientation of a robotic kiwifruit pollinator
CN117021059B (en) Picking robot and fruit positioning method, device, electronic equipment and medium
CN110433467A (en) Picking up table tennis ball robot operation method and equipment based on binocular vision and ant group algorithm
Jin et al. Far-near combined positioning of picking-point based on depth data features for horizontal-trellis cultivated grape
CN111744735A (en) A control method based on simulation of surface spraying of handicrafts
Mao et al. Agricultural robot navigation path recognition based on k-means algorithm for large-scale image segmentation
CN113932712B (en) Melon and fruit vegetable size measurement method based on depth camera and key points
Chu et al. High-precision fruit localization using active laser-camera scanning: Robust laser line extraction for 2D-3D transformation
Tian et al. Research on the application of machine vision in tea autonomous picking
Yi et al. View planning for grape harvesting based on active vision strategy under occlusion
CN114485667A (en) Light and intelligent orchard ground navigation method
Gong et al. Navigation line extraction based on root and stalk composite locating points

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141210

Termination date: 20150619

EXPY Termination of patent right or utility model