CN102800083A - Crop spraying positioning method based on binocular vision gridding partition matching algorithm - Google Patents

Crop spraying positioning method based on binocular vision gridding partition matching algorithm

Info

Publication number
CN102800083A
CN102800083A CN201210203712.4A
Authority
CN
China
Prior art keywords
point
target crop
camera
image
crop
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102037124A
Other languages
Chinese (zh)
Other versions
CN102800083B (en)
Inventor
张宾
刘涛
郑承云
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Agricultural University
Original Assignee
China Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Agricultural University filed Critical China Agricultural University
Priority to CN201210203712.4A priority Critical patent/CN102800083B/en
Publication of CN102800083A publication Critical patent/CN102800083A/en
Application granted granted Critical
Publication of CN102800083B publication Critical patent/CN102800083B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a crop spraying positioning method based on a binocular vision gridding partition matching algorithm, belonging to the technical field of crop pesticide application. The method comprises the following steps: using two cameras to target the crop and acquire images of the target crop; obtaining a binary image of the target crop from the left camera and from the right camera respectively; partitioning the left-camera binary image and the right-camera binary image of the target crop into grids; for each point of the left grid region, performing a matching search in the right grid region to obtain mutually matched points and form matched point pairs; calculating the left-right image disparity from each matched point pair and solving the three-dimensional coordinates of the corresponding point on the target crop; deleting erroneous points and fitting the remaining points of the target crop to obtain a fitted curve or surface; and planning the sprayer nozzle path according to the fitted curve or surface. The disclosed method achieves extraction and positioning of the three-dimensional contour of the target crop.

Description

Crop spraying positioning method based on binocular vision gridding partition matching algorithm
Technical field
The invention belongs to the technical field of crop pesticide application, and in particular relates to a crop spraying positioning method based on a binocular vision grid-partition matching algorithm.
Background art
In agricultural production, pesticides are sprayed repeatedly to prevent and control diseases and pests. The spraying process harms the environment and the operators' health, and the harm is more pronounced in greenhouses, where the space is relatively enclosed and spraying is frequent. Developing an automated precision spraying system that applies the chemical directly onto the crop surface can avoid waste, improve the utilization efficiency of the chemical, reduce environmental pollution, protect operators' health and lower labor intensity. Developing such automated application systems therefore has important practical significance and social value.
Current automated pesticide application systems mainly suffer from the following problems:
1. Spray positioning is not accurate enough, and chemical waste is severe. According to data from the 2010 international academic conference on plant protection equipment and pesticide application technology, the average pesticide utilization rate in China is extremely low, only about 20%. Most of the pesticide is not used effectively. One reason is that application methods and means are not sufficiently scientific and reasonable; in addition, extensive blanket spraying is still widely practiced, and the technology and conditions for precision application are lacking.
2. Spraying is not uniform enough, and chemical residue on the crop surface exceeds standards, which is especially evident for greenhouse crops. The atomization effect and the spraying mode strongly affect spray uniformity. Data show that electrostatic spraying can form fine droplets with good adhesion, which helps reduce repeated and missed spraying and improves uniformity. Anti-drift techniques can also improve the spraying effect to some extent, but the accuracy of spray positioning fundamentally affects atomization quality.
3. The adaptability of spraying machinery is limited. For example, orchard sprayers used abroad adopt ultrasonic spray positioning, which requires the fruit trees to be planted at a specific spacing and in a specific arrangement; any object within the ultrasonic detection range is sprayed, and the machine works in the same way regardless of the crop's shape. It is therefore difficult for such systems to work effectively once the environment or the crop changes.
4. The detection and positioning performance of robots used for automated precision spraying is not ideal, and their real-time performance is poor. For example, robots that use vision detection to spray specific diseased or pest-infested areas often rely on rather complex detection and positioning algorithms that require considerable computation time. Meanwhile, the detection of target crops that need spraying still has a certain error rate.
5. Existing crop spraying systems are basically two-dimensional positioning systems. During operation, two-dimensional information about the spray target is generally first obtained by a specific sensor or camera, and then the nozzle is moved to a designated position or the switching of multiple nozzles is controlled. The distance between the nozzle and the target crop is usually set in advance and is not adjusted during operation. Therefore, when the crop's shape or size varies, the spraying effect varies as well.
6. The cost-effectiveness of automated pesticide spraying systems also restricts their wide application. However, with the continuous development of industrialized agriculture in scale and technology, and with labor costs rising as society ages, automated spraying operations will have broad application prospects.
In summary, the key issue at present is to solve the identification and positioning of spray targets and to develop a spray positioning system that is adaptable, accurate, real-time and cost-effective.
At present, the main methods for acquiring three-dimensional spatial information about an object are laser, ultrasonic, radar, infrared and binocular vision. The first four usually compute distance from the time or phase difference of reflected waves, while binocular vision obtains positioning information mainly through the triangulation principle by matching the left and right images. The advantages of binocular vision positioning are its wide applicability and the ability to identify and locate targets directly with suitable algorithms; its disadvantages are that the identification and positioning algorithms are often complex, with relatively poor real-time performance and robustness, and detection is difficult when the object shape is irregular, the environment is complex, or illumination conditions are poor. In addition, multi-sensor fusion, which combines vision with laser, infrared or ultrasonic information, can improve positioning accuracy and reliability to some extent.
The spray positioning method proposed by the invention belongs to vision-based positioning and uses a binocular vision system to detect the position of the target crop. In current binocular vision positioning techniques, the main problems that urgently need to be solved are how to identify the target and determine its position and contour quickly, stably, accurately and reliably.
Summary of the invention
The objective of the invention is to provide a crop spraying positioning method based on a binocular vision grid-partition matching algorithm, which rapidly and accurately calculates the position information of the target crop and, based on this position information, plans the nozzle operating path so that the nozzle sprays at a reasonable spraying distance and angle.
To achieve the above objective, the technical solution provided by the invention is a crop spraying positioning method based on a binocular vision grid-partition matching algorithm, characterized in that the method comprises:
Step 1: use two cameras to target the crop and acquire images of the target crop; the two cameras are denoted as the left camera and the right camera;
Step 2: separate the target-crop image acquired by the left camera and the target-crop image acquired by the right camera from the background, obtaining a binary image of the target crop for the left camera and a binary image for the right camera;
Step 3: partition the left-camera binary image and the right-camera binary image of the target crop into grids, obtaining the left grid region and the right grid region;
Step 4: for each point in the left grid region, perform a matching search in the right grid region to obtain mutually matched points and form matched point pairs;
Step 5: for each matched point pair, calculate the left-right image disparity and solve the three-dimensional coordinates of the corresponding point on the target crop;
Step 6: evaluate the three-dimensional coordinates of the points on the target crop and delete erroneous points;
Step 7: fit the points of the target crop to obtain a fitted curve or surface;
Step 8: plan the nozzle path according to the fitted curve or surface.
Separating the image of the target crop acquired by the left/right camera from the background and obtaining the binary image of the left/right camera target crop comprises:
Step 101: in the HSI color space, using fixed-threshold segmentation to obtain a preliminary segmented image of the target crop;
Step 102: using the excess-green algorithm to obtain a grayscale image of the target crop;
Step 103: computing the histogram of the grayscale image of the target crop;
Step 104: smoothing the histogram of the target crop with a neighborhood multi-point averaging method;
Step 105: searching for the peak of the smoothed histogram and computing the trough positions on both sides of the peak, thereby obtaining the binary image of the target crop.
Said step 4 comprises:
Step 201: initialize parameters, setting j = 1, wj = 1 and Min = 10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of the successfully matched point in the right grid region, and Min records the minimum of the corrected sum of absolute differences between the left grid and the right grid;
Step 202: choose a point (u, i) in the left grid region, denote the left grid in which it lies as p, and calculate the gray mean Mi and variance Ei(u, i) of the left grid p;
Step 203: choose a point (u, j) in the right grid region whose abscissa equals the abscissa of the chosen point (u, i) in the left grid region, denote the right grid in which it lies as q, and calculate the gray mean Nj and variance Fj(u, j) of the right grid q;
Step 204: judge whether |Mi − Fj| < ε holds; if it holds, execute step 205; otherwise set j = j + 1 and return to step 203; ε is a preset value;
Step 205: calculate the sum of absolute differences of the left grid p and the right grid q, denoted SAD; calculate the absolute value of the difference between the variance Ei(u, i) of the left grid p and the variance Fj(u, j) of the right grid q, denoted FCC;
Step 206: calculate the corrected sum of absolute differences of the left grid p and the right grid q according to the formula SF = SAD × a + FCC × b, where a and b are scale parameters;
Step 207: if the corrected sum of absolute differences of the left grid p and the right grid q is less than Min, execute step 208; otherwise set j = j + 1 and return to step 203;
Step 208: set Min = SF and wj = j; judge whether j has traversed all epipolar points; if so, execute step 209; otherwise set j = j + 1 and return to step 203;
Step 209: the point (u, wj) in the right grid region is the matching point of the point (u, i) in the left grid region.
The ratio of the scale parameters a and b is 1:1.
Said step 5 comprises:
Step 301: calculate the left-right image disparity using the formula D = X_L − X_R, where X_L is the abscissa of one point of the matched point pair and X_R is the abscissa of the other point;
Step 302: calculate the three-dimensional coordinates of the point on the target crop corresponding to the matched point pair using the formulas x_c = B·X_L/D, y_c = B·Y/D, z_c = B·f/D, where x_c, y_c and z_c are the three-dimensional coordinates of the point on the target crop, B is the distance between the optical axes of the two cameras, X_L is the abscissa of one point of the matched point pair, Y is the ordinate of either point of the matched point pair, and f is the focal length of the left (or right) camera.
The invention adopts a refined adaptive threshold segmentation method, which improves the robustness and accuracy of the adaptive segmentation algorithm; grid partitioning reduces the overall computational load; and in the image matching stage of binocular vision positioning, an improved SAD algorithm is adopted, which greatly reduces the mismatches that easily occur with a plain SAD algorithm and is computationally faster than the maximum correlation coefficient method.
Description of drawings
Fig. 1 is a flow chart of the crop spraying positioning method based on the binocular vision grid-partition matching algorithm;
Fig. 2 is a schematic diagram of targeting the crop with two cameras, where (a) is the schematic for the left camera and (b) is the schematic for the right camera;
Fig. 3 is the preliminary segmented image of the target crop obtained by fixed-threshold segmentation;
Fig. 4 is a schematic diagram of grid partitioning of the binary image of the target crop acquired by the left camera;
Fig. 5 is a flow chart of forming matched point pairs;
Fig. 6 is a schematic diagram of the parallel-optical-axis dual-camera geometry.
Embodiment
The preferred embodiments are described in detail below with reference to the accompanying drawings. It should be emphasized that the following description is only exemplary and is not intended to limit the scope of the invention or its applications.
Fig. 1 is a flow chart of the crop spraying positioning method based on the binocular vision grid-partition matching algorithm. As shown in Fig. 1, the method provided by the invention comprises:
Step 1: use two cameras to target the crop and acquire images of the target crop; the two cameras are denoted as the left camera and the right camera.
When mounting the left and right cameras, their optical axes should be parallel to each other and lie on the same horizontal plane. Fig. 2 is a schematic diagram of targeting the crop with two cameras.
Step 2: separate the target-crop image acquired by the left camera and the target-crop image acquired by the right camera from the background, obtaining a binary image of the target crop for each camera.
Taking the left camera as an example, the process of obtaining the binary image of the left-camera target crop is described below.
Step 101: in the HSI color space, use fixed-threshold segmentation to obtain a preliminary segmented image of the target crop.
In this step, the image of the target crop acquired by the left camera (an RGB image) is first converted into an HSI image and normalized.
Next, value ranges are set for H (hue), S (saturation) and I (intensity); pixels outside these ranges are set to black, while the remaining pixels keep their original values.
The pixels that keep their original values are then converted from the HSI image back to an RGB image and de-normalized, finally giving the preliminary segmented image of the target crop. Fig. 3 shows the preliminary segmented image of the target crop obtained by fixed-threshold segmentation.
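For illustration, a minimal Python sketch of this fixed-threshold HSI segmentation, assuming an OpenCV-style BGR input image; the H/S/I ranges shown are placeholder values, not thresholds from the patent:

```python
import cv2
import numpy as np

def hsi_fixed_threshold(bgr, h_range=(60, 180), s_min=0.15, i_range=(0.1, 0.9)):
    """Preliminary segmentation in HSI space with fixed thresholds.
    Pixels outside the H/S/I ranges are set to black; others keep their values."""
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB).astype(np.float64) / 255.0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

    i = (r + g + b) / 3.0                                   # intensity
    s = 1.0 - np.minimum(np.minimum(r, g), b) / (i + 1e-9)  # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-9
    theta = np.degrees(np.arccos(np.clip(num / den, -1, 1)))
    h = np.where(b <= g, theta, 360.0 - theta)              # hue in degrees

    mask = ((h >= h_range[0]) & (h <= h_range[1]) &
            (s >= s_min) &
            (i >= i_range[0]) & (i <= i_range[1]))
    out = bgr.copy()
    out[~mask] = 0                                          # non-crop pixels -> black
    return out, mask
```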
Step 102: use the excess-green algorithm to obtain a grayscale image of the target crop.
The excess-green algorithm increases the weight of the green channel to raise the contrast between green crops and the non-green background. It extracts the information of green crops well and is frequently used in green-crop image processing. Each pixel is processed with the formula 2 × G − R − B, where G, R and B are the values of the green, red and blue channels of the pixel. After this processing, a grayscale image of the target crop is obtained.
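A minimal sketch of this excess-green computation, assuming an OpenCV-style BGR array; negative values are simply clipped to zero here:

```python
import numpy as np

def excess_green(bgr):
    """Excess-green grayscale 2*G - R - B (BGR channel order); negatives clipped to 0."""
    img = bgr.astype(np.int16)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    return np.clip(2 * g - r - b, 0, 255).astype(np.uint8)
```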
Step 103: compute the histogram of the grayscale image of the target crop.
Step 104: smooth the histogram of the target crop using the neighborhood multi-point averaging method.
The neighborhood multi-point averaging method is an algorithmic solution for the case where the histogram shows local abrupt changes but the overall trend has not yet reached a trough. The excess-green computation is applied to the preliminarily segmented image whose background has been set to zero, and the background portion with gray value 0 is removed from its histogram. When only a single gray level has a locally small pixel count, that point is not necessarily a trough; detecting troughs with the average of several adjacent points therefore effectively avoids the influence of local minima on the segmentation result. Specifically, the neighborhood of a found local-minimum position is divided into three equal segments, a left neighborhood, a central region and a right neighborhood; the mean of the histogram data in each segment is computed, and the numerical results decide whether the position is a reasonable trough. For example, when searching for the left trough, if the regional means show a "low on the left, high on the right" pattern, with the left segment having the smallest mean and the right segment the largest, the search should continue moving to the left to find a new trough, avoiding the influence of the local minimum.
Step 105: search for the peak of the smoothed histogram and compute the trough positions on both sides of the peak, thereby obtaining the binary image of the target crop.
Troughs are searched on both sides of the peak; once found, the pixels whose gray values lie between the two troughs form the binary image of the target crop.
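A sketch of the histogram smoothing and trough search of steps 104-105, under simplifying assumptions: the multi-point average is implemented as a plain moving average, and each trough is found by walking downhill from the peak rather than by the patent's regional-mean test for skipping local minima:

```python
import numpy as np

def smooth_histogram(hist, k=3):
    """Neighborhood multi-point averaging: replace each bin by the mean of its
    2k+1 neighbors (a simple moving-average reading of the method)."""
    kernel = np.ones(2 * k + 1) / (2 * k + 1)
    return np.convolve(hist, kernel, mode="same")

def binarize_by_peak_troughs(gray, k=3):
    """Keep pixels whose excess-green gray value lies between the troughs found
    on both sides of the main histogram peak (gray value 0 = background is dropped)."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    hist[0] = 0.0                       # drop the zeroed background
    smoothed = smooth_histogram(hist, k)
    peak = int(np.argmax(smoothed))
    left = peak
    while left > 1 and smoothed[left - 1] <= smoothed[left]:
        left -= 1                       # walk downhill to the left trough
    right = peak
    while right < 254 and smoothed[right + 1] <= smoothed[right]:
        right += 1                      # walk downhill to the right trough
    return ((gray >= left) & (gray <= right)).astype(np.uint8) * 255
```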
Step 3: partition the binary image of the target crop acquired by the left camera and that acquired by the right camera into grids, obtaining the left grid region and the right grid region.
Fig. 4 is a schematic diagram of grid partitioning of the binary image of the target crop acquired by the left camera. Taking the left camera as an example again, the grid partitioning of the acquired binary image proceeds as follows. Starting from the upper-left corner of the image, target pixels are searched; when a pixel of the target crop is found, the first grid is placed there. Grids then continue to be placed around it: if a neighboring grid contains target-crop pixels, another grid is placed. For target pixels at the edge whose surrounding region is too small to hold a whole grid, a grid is interpolated from the opposite direction. In this way, all target pixels are covered by grids of the same size.
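A simplified sketch of the grid partitioning: instead of growing cells outward from the first target pixel as described above, it tiles the target's bounding box with fixed-size cells and clamps edge cells back inside the image, keeping only the cells that contain target pixels; the cell size is an illustrative choice:

```python
import numpy as np

def grid_cells(binary, cell=8):
    """Partition the target region of a binary image into fixed-size cells.
    Returns the (top, left) corner of every cell containing target pixels."""
    ys, xs = np.nonzero(binary)
    if len(ys) == 0:
        return []
    y0, y1, x0, x1 = ys.min(), ys.max(), xs.min(), xs.max()
    cells = []
    for top in range(y0, y1 + 1, cell):
        for left in range(x0, x1 + 1, cell):
            # clamp edge cells back inside the image so every cell is full size
            t = min(top, binary.shape[0] - cell)
            l = min(left, binary.shape[1] - cell)
            if binary[t:t + cell, l:l + cell].any():
                cells.append((t, l))
    return cells
```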
Step 4: for each point in the left grid region, perform a matching search in the right grid region to obtain mutually matched points and form matched point pairs.
Fig. 5 is a flow chart of forming matched point pairs. As shown in Fig. 5, the process comprises:
Step 201: initialize parameters, setting j = 1, wj = 1 and Min = 10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of the successfully matched point in the right grid region, and Min records the minimum of the corrected sum of absolute differences between the left grid and the right grid.
Step 202: choose a point (u, i) in the left grid region, denote the left grid in which it lies as p, and calculate the gray mean Mi and variance Ei(u, i) of the left grid p.
The gray mean Mi of the left grid p is computed as

Mi = (1/(mn)) Σ_{x=1}^{m} Σ_{y=1}^{n} g(x+u−1, y+i−1)    (1)

and the variance Ei(u, i) of the left grid p as

Ei(u, i) = (1/(mn)) Σ_{x=1}^{m} Σ_{y=1}^{n} [g(x+u−1, y+i−1) − Mi]²    (2)

In formulas (1) and (2), u and i are the abscissa and ordinate of the point (u, i), m and n are the horizontal and vertical pixel counts of the left grid p, and g(x, y) is the pixel function of the left grid p.
Step 203: choose a point (u, j) in the right grid region whose abscissa equals the abscissa of the chosen point (u, i) in the left grid region, denote the right grid in which it lies as q, and calculate the gray mean Nj and variance Fj(u, j) of the right grid q.
The gray mean Nj of the right grid q is computed as

Nj = (1/(mn)) Σ_{x=1}^{m} Σ_{y=1}^{n} f(x+u−1, y+j−1)    (3)

and the variance Fj(u, j) of the right grid q as

Fj(u, j) = (1/(mn)) Σ_{x=1}^{m} Σ_{y=1}^{n} [f(x+u−1, y+j−1) − Nj]²    (4)

In formulas (3) and (4), u and j are the abscissa and ordinate of the point (u, j), m and n are the horizontal and vertical pixel counts of the right grid q, and f(x, y) is the pixel function of the right grid q.
Step 204: judge whether |Mi − Fj| < ε holds; if it holds, the left grid p and the right grid q are approximately equal and step 205 can be executed; otherwise set j = j + 1 and return to step 203; ε is a preset value.
Step 205: calculate the sum of absolute differences of the left grid p and the right grid q, denoted SAD; calculate the absolute value of the difference between the variance Ei(u, i) of the left grid p and the variance Fj(u, j) of the right grid q, denoted FCC.
The sum of absolute differences of the left grid p and the right grid q is computed as

SAD = Σ_{x=1}^{m} Σ_{y=1}^{n} |g(x+u−1, y+i−1) − f(x+u−1, y+j−1)|    (5)

where g(·,·) and f(·,·) are the pixel functions of the left grid p and the right grid q, as in formulas (1)–(4).
The absolute difference of the variances of the left grid p and the right grid q is

FCC = |Ei(u, i) − Fj(u, j)|    (6)
Step 206: calculate the corrected sum of absolute differences of the left grid p and the right grid q according to the formula SF = SAD × a + FCC × b, where a and b are scale parameters. The scale parameters a and b determine the relative weights of the variance factor and the sum-of-absolute-differences factor in the final result. The ratio of a to b is generally set to 1:1.
Step 207: if the corrected sum of absolute differences of the left grid p and the right grid q is less than Min, execute step 208; otherwise set j = j + 1 and return to step 203.
Step 208: set Min = SF and wj = j; judge whether j has traversed all epipolar points; if so, execute step 209; otherwise set j = j + 1 and return to step 203.
For a parallel-axis binocular system, if the plane determined by the two camera optical axes remains horizontal relative to the world coordinate system, the epipolar constraint can be regarded as limiting the target position to image rows of the same height in the left and right images; that is, for the same spatial point, the vertical image coordinates of the corresponding points acquired by the left and right cameras are equal, and only the horizontal pixel coordinates differ. For this reason, the invention searches for matching points along the epipolar line. Since a matching point must lie on the epipolar line, this improves both the efficiency and the accuracy of the matching algorithm.
Step 209: the point (u, wj) in the right grid region is the matching point of the point (u, i) in the left grid region.
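A condensed sketch of the matching loop of steps 201-209, assuming rectified images so that candidates are searched along the same image row; the quick-rejection test here compares the two grid means (a simplification of step 204), and the cell size, threshold eps, weights a and b, and disparity range are illustrative placeholders:

```python
import numpy as np

def match_point(left, right, row, col_l, cell=8, eps=10.0, a=1.0, b=1.0, max_disp=64):
    """Find the right-image column matching left pixel (row, col_l) by minimizing
    the corrected score SF = SAD*a + |var_l - var_r|*b along the epipolar row."""
    h, w = left.shape
    r0 = min(row, h - cell)
    c0 = min(col_l, w - cell)
    patch_l = left[r0:r0 + cell, c0:c0 + cell].astype(np.float64)
    mean_l, var_l = patch_l.mean(), patch_l.var()

    best_col, best_sf = None, np.inf
    for d in range(0, max_disp + 1):            # candidate disparity along the row
        c1 = c0 - d
        if c1 < 0:
            break
        patch_r = right[r0:r0 + cell, c1:c1 + cell].astype(np.float64)
        mean_r, var_r = patch_r.mean(), patch_r.var()
        if abs(mean_l - mean_r) >= eps:         # quick rejection on the mean difference
            continue
        sad = np.abs(patch_l - patch_r).sum()   # formula (5)
        fcc = abs(var_l - var_r)                # formula (6)
        sf = sad * a + fcc * b                  # corrected SAD (step 206)
        if sf < best_sf:
            best_sf, best_col = sf, c1
    return best_col                             # None if no candidate passed the test
```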
Step 5: for each matched point pair, calculate the left-right image disparity and solve the three-dimensional coordinates of the corresponding point on the target crop.
Fig. 6 is a schematic diagram of the parallel-optical-axis dual-camera geometry. In Fig. 6, since the optical axes of the left and right cameras are parallel, the distance between them is the baseline length B. Let the optical center O_L of the left camera be the origin of the world coordinate system, i.e. the world coordinate system coincides with the left camera coordinate system. The optical axis O_L Z_L is perpendicular to the left camera imaging plane with intersection point O_1, and the optical axis O_R Z_R is perpendicular to the right camera imaging plane with intersection point O_2. The two cameras observe the same spatial point P simultaneously and obtain the corresponding image points P_L and P_R on the left and right cameras, with coordinates P_L(X_L, Y_L) and P_R(X_R, Y_R) respectively; the disparity is then D = X_L − X_R.
The three-dimensional coordinates of the point on the target crop corresponding to a matched point pair can be computed from the formulas

x_c = B·X_L / D
y_c = B·Y / D          (7)
z_c = B·f / D

where x_c, y_c and z_c are the three-dimensional coordinates of the point on the target crop, B is the distance between the optical axes of the two cameras, X_L is the abscissa of the point obtained by the left camera of the matched pair, and Y is the ordinate of either point of the pair. Since the two cameras have parallel optical axes, the Y-axis coordinates of the left and right cameras are numerically the same, i.e. Y = Y_L = Y_R; f is the focal length of the left (or right) camera.
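A direct transcription of formula (7), with the added assumption that pixel indices are first shifted to image-center coordinates via cx and cy (the patent writes the formula in image coordinates):

```python
def triangulate(col_l, col_r, row, B, f, cx, cy):
    """Recover (x_c, y_c, z_c) of a matched pair via formula (7)."""
    X_L, X_R, Y = col_l - cx, col_r - cx, row - cy
    D = X_L - X_R                       # horizontal disparity
    if D == 0:
        return None                     # point at infinity or bad match
    return (B * X_L / D, B * Y / D, B * f / D)
```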
Step 6: evaluate the three-dimensional coordinates of the points on the target crop and delete erroneous points.
Erroneous points are deleted mainly according to constraint conditions, including that the ordering of corresponding matched points in the binary images acquired by the left and right cameras should be consistent; if it is not, the points are regarded as mismatches. In addition, if the computed coordinates of a point fall outside a given range, the point is regarded as a mismatch and is removed.
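A minimal sketch of the range-based rejection mentioned above; the depth limits are illustrative placeholders, and the ordering-consistency check is omitted:

```python
def filter_points(points, z_range=(0.2, 3.0)):
    """Drop reconstructed points whose depth lies outside a plausible working range."""
    return [p for p in points if p is not None and z_range[0] <= p[2] <= z_range[1]]
```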
Step 7: fit the points of the target crop to obtain a fitted curve or surface.
The discrete spatial points obtained in step 6 are fitted with a specific space curve (or surface) model as the crop model. For example, if the growth form of the target crop is roughly spherical, a circular arc (sphere) equation is adopted, and the optimal parameters of the fitted curve (surface) equation are obtained by the least squares method.
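For the spherical case mentioned above, a least-squares sphere fit can be written in the linear form x² + y² + z² = 2ax + 2by + 2cz + d; a short sketch:

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit: x^2+y^2+z^2 = 2ax + 2by + 2cz + d.
    Returns the center (a, b, c) and the radius of the best-fit sphere."""
    P = np.asarray(points, dtype=np.float64)
    A = np.column_stack([2 * P, np.ones(len(P))])
    rhs = (P ** 2).sum(axis=1)
    (a, b, c, d), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    radius = np.sqrt(d + a * a + b * b + c * c)
    return (a, b, c), radius
```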
Step 8: plan the nozzle path according to the fitted curve or surface.
Based on the fitted curve or surface obtained in step 7, the nozzle path is planned so that the nozzle sprays at the desired spraying distance and angle.
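One possible reading of this step, sketched under strong assumptions: waypoints are sampled on a horizontal circle offset from the fitted sphere by a chosen spraying distance, with the nozzle aimed at the sphere center; the spraying distance and the circular path are illustrative choices, not specified by the patent:

```python
import numpy as np

def nozzle_waypoints(center, radius, spray_dist=0.3, n=12):
    """Sample (position, spray direction) pairs on a circle offset from the
    fitted sphere by the spraying distance, aiming at the sphere center."""
    cx, cy, cz = center
    waypoints = []
    for t in np.linspace(0.0, 2 * np.pi, n, endpoint=False):
        px = cx + (radius + spray_dist) * np.cos(t)
        pz = cz + (radius + spray_dist) * np.sin(t)
        aim = np.array([cx - px, 0.0, cz - pz])      # spray direction toward center
        waypoints.append(((px, cy, pz), aim / np.linalg.norm(aim)))
    return waypoints
```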
The invention achieves extraction and positioning of the three-dimensional contour of the target crop. The method is accurate, its computational load is moderate, it can meet the real-time requirements of spraying, and it remains flexible and adaptive during operation. It provides an effective way to realize accurate automated spraying, reduce environmental pollution and improve pesticide utilization.
The above is only a preferred embodiment of the invention, but the protection scope of the invention is not limited thereto. Any variation or replacement that can readily be conceived by a person skilled in the art within the technical scope disclosed by the invention shall fall within the protection scope of the invention. Therefore, the protection scope of the invention shall be defined by the claims.

Claims (5)

1. A crop spraying positioning method based on a binocular vision grid-partition matching algorithm, characterized in that the method comprises:
Step 1: using two cameras to target the crop and acquire images of the target crop, the two cameras being denoted as a left camera and a right camera;
Step 2: separating the target-crop image acquired by the left camera and the target-crop image acquired by the right camera from the background, obtaining a binary image of the target crop for the left camera and a binary image for the right camera;
Step 3: partitioning the left-camera binary image and the right-camera binary image of the target crop into grids, obtaining a left grid region and a right grid region;
Step 4: for each point in the left grid region, performing a matching search in the right grid region to obtain mutually matched points and form matched point pairs;
Step 5: for each matched point pair, calculating the left-right image disparity and solving the three-dimensional coordinates of the corresponding point on the target crop;
Step 6: evaluating the three-dimensional coordinates of the points on the target crop and deleting erroneous points;
Step 7: fitting the points of the target crop to obtain a fitted curve or surface;
Step 8: planning the nozzle path according to the fitted curve or surface.
2. The method according to claim 1, characterized in that separating the image of the target crop acquired by the left/right camera from the background and obtaining the binary image of the left/right camera target crop comprises:
Step 101: in the HSI color space, using fixed-threshold segmentation to obtain a preliminary segmented image of the target crop;
Step 102: using the excess-green algorithm to obtain a grayscale image of the target crop;
Step 103: computing the histogram of the grayscale image of the target crop;
Step 104: smoothing the histogram of the target crop with a neighborhood multi-point averaging method;
Step 105: searching for the peak of the smoothed histogram and computing the trough positions on both sides of the peak, thereby obtaining the binary image of the target crop.
3. The method according to claim 1, characterized in that said step 4 comprises:
Step 201: initializing parameters, setting j = 1, wj = 1 and Min = 10000, where j is the ordinate of a point in the right grid region, wj records the ordinate of the successfully matched point in the right grid region, and Min records the minimum of the corrected sum of absolute differences between the left grid and the right grid;
Step 202: choosing a point (u, i) in the left grid region, denoting the left grid in which it lies as p, and calculating the gray mean Mi and variance Ei(u, i) of the left grid p;
Step 203: choosing a point (u, j) in the right grid region whose abscissa equals the abscissa of the chosen point (u, i) in the left grid region, denoting the right grid in which it lies as q, and calculating the gray mean Nj and variance Fj(u, j) of the right grid q;
Step 204: judging whether |Mi − Fj| < ε holds; if it holds, executing step 205; otherwise setting j = j + 1 and returning to step 203; ε is a preset value;
Step 205: calculating the sum of absolute differences of the left grid p and the right grid q, denoted SAD; calculating the absolute value of the difference between the variance Ei(u, i) of the left grid p and the variance Fj(u, j) of the right grid q, denoted FCC;
Step 206: calculating the corrected sum of absolute differences of the left grid p and the right grid q according to the formula SF = SAD × a + FCC × b, where a and b are scale parameters;
Step 207: if the corrected sum of absolute differences of the left grid p and the right grid q is less than Min, executing step 208; otherwise setting j = j + 1 and returning to step 203;
Step 208: setting Min = SF and wj = j; judging whether j has traversed all epipolar points; if so, executing step 209; otherwise setting j = j + 1 and returning to step 203;
Step 209: the point (u, wj) in the right grid region is the matching point of the point (u, i) in the left grid region.
4. The method according to claim 3, characterized in that the ratio of the scale parameters a and b is 1:1.
5. The method according to claim 1, characterized in that said step 5 comprises:
Step 301: calculating the left-right image disparity using the formula D = X_L − X_R, where X_L is the abscissa of one point of the matched point pair and X_R is the abscissa of the other point;
Step 302: calculating the three-dimensional coordinates of the point on the target crop corresponding to the matched point pair using the formulas x_c = B·X_L/D, y_c = B·Y/D, z_c = B·f/D, where x_c, y_c and z_c are the three-dimensional coordinates of the point on the target crop, B is the distance between the optical axes of the two cameras, X_L is the abscissa of one point of the matched point pair, Y is the ordinate of either point of the matched point pair, and f is the focal length of the left (or right) camera.
CN201210203712.4A 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm Expired - Fee Related CN102800083B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210203712.4A CN102800083B (en) 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210203712.4A CN102800083B (en) 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm

Publications (2)

Publication Number Publication Date
CN102800083A true CN102800083A (en) 2012-11-28
CN102800083B CN102800083B (en) 2014-12-10

Family

ID=47199181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210203712.4A Expired - Fee Related CN102800083B (en) 2012-06-19 2012-06-19 Crop spraying positioning method based on binocular vision gridding partition matching algorithm

Country Status (1)

Country Link
CN (1) CN102800083B (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530643A (en) * 2013-10-11 2014-01-22 中国科学院合肥物质科学研究院 Pesticide positioned spraying method and system on basis of crop interline automatic identification technology
CN103988824A (en) * 2014-04-18 2014-08-20 浙江大学 Automatic targeting and spraying system based on binocular vision technology
CN104604833A (en) * 2015-02-09 2015-05-13 聊城大学 Fall webworm larva net curtain pesticide spraying robot mechanical system
CN104615150A (en) * 2014-12-17 2015-05-13 中国科学院合肥物质科学研究院 Machine vision based adaptive precise mist spray device and method
CN103971367B (en) * 2014-04-28 2017-01-11 河海大学 Hydrologic data image segmenting method
CN106530281A (en) * 2016-10-18 2017-03-22 国网山东省电力公司电力科学研究院 Edge feature-based unmanned aerial vehicle image blur judgment method and system
CN107255446A (en) * 2017-08-01 2017-10-17 南京农业大学 A kind of Cold region apple fruit tree canopy three-dimensional map constructing system and method
CN107593200A (en) * 2017-10-31 2018-01-19 河北工业大学 A kind of trees plant protection system and method based on visible ray infrared technique
WO2018076776A1 (en) * 2016-10-25 2018-05-03 深圳光启合众科技有限公司 Robot, robotic arm and control method and device thereof
CN107992868A (en) * 2017-11-15 2018-05-04 辽宁警察学院 A kind of High Precision Stereo footprint Quick Acquisition method
CN109059869A (en) * 2018-07-27 2018-12-21 仲恺农业工程学院 Method for detecting spraying effect of plant protection unmanned aerial vehicle on fruit trees
CN109699623A (en) * 2019-02-27 2019-05-03 西安交通大学 A kind of Multifunctional plant protection machine people's system
CN110191330A (en) * 2019-06-13 2019-08-30 内蒙古大学 Depth map FPGA implementation method and system based on binocular vision green crop video flowing
CN110584962A (en) * 2019-08-28 2019-12-20 西安工业大学 Combined obstacle-detection intelligent blind-guiding system
WO2020047863A1 (en) * 2018-09-07 2020-03-12 深圳配天智能技术研究院有限公司 Distance measurement method and apparatus
CN111109240A (en) * 2020-01-03 2020-05-08 东北农业大学 Multi-information fusion variable pesticide spraying method and device
CN111762086A (en) * 2019-12-19 2020-10-13 广州极飞科技有限公司 Spraying control method, device and system and carrier
CN111953933A (en) * 2020-07-03 2020-11-17 北京中安安博文化科技有限公司 Method, device, medium and electronic equipment for determining fire area
CN111990378A (en) * 2020-08-25 2020-11-27 淮阴工学院 Spraying control method for spraying robot
CN112167212A (en) * 2019-07-02 2021-01-05 上海临石信息科技有限公司 Unmanned aerial vehicle pesticide spraying control system and method
CN112889786A (en) * 2021-01-15 2021-06-04 吉林农业大学 Pesticide spraying system capable of tracking field crop seedling areas in real time and control method
WO2021159717A1 (en) * 2020-02-14 2021-08-19 苏州浪潮智能科技有限公司 Content self-adaptive binocular matching method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008082870A (en) * 2006-09-27 2008-04-10 Setsunan Univ Image processing program, and road surface state measuring system using this
CN101312593A (en) * 2007-05-25 2008-11-26 中兴通讯股份有限公司 Access control method of private base station
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008082870A (en) * 2006-09-27 2008-04-10 Setsunan Univ Image processing program, and road surface state measuring system using this
CN101312593A (en) * 2007-05-25 2008-11-26 中兴通讯股份有限公司 Access control method of private base station
CN101996399A (en) * 2009-08-18 2011-03-30 三星电子株式会社 Device and method for estimating parallax between left image and right image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
TANGFEI TAO ET AL: "A Fast Block Matching Algorithm for Stereo Correspondence", Cybernetics and Intelligent Systems, 2008 IEEE Conference on *
戚利勇 (Qi Liyong): "Research on Key Vision Technologies and System of a Cucumber Harvesting Robot", China Master's Theses Full-text Database, Information Science and Technology *
蒋焕煜 (Jiang Huanyu) et al.: "Application of Binocular Stereo Vision Technology in Fruit and Vegetable Harvesting Robots", Journal of Jiangsu University (Natural Science Edition) *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103530643A (en) * 2013-10-11 2014-01-22 中国科学院合肥物质科学研究院 Pesticide positioned spraying method and system on basis of crop interline automatic identification technology
CN103988824A (en) * 2014-04-18 2014-08-20 浙江大学 Automatic targeting and spraying system based on binocular vision technology
CN103971367B (en) * 2014-04-28 2017-01-11 河海大学 Hydrologic data image segmenting method
CN104615150A (en) * 2014-12-17 2015-05-13 中国科学院合肥物质科学研究院 Machine vision based adaptive precise mist spray device and method
CN104615150B (en) * 2014-12-17 2017-10-03 中国科学院合肥物质科学研究院 A kind of adaptive accurate spraying apparatus and method based on machine vision
CN104604833A (en) * 2015-02-09 2015-05-13 聊城大学 Fall webworm larva net curtain pesticide spraying robot mechanical system
CN106530281B (en) * 2016-10-18 2019-04-09 国网山东省电力公司电力科学研究院 Unmanned plane image fuzzy Judgment method and system based on edge feature
CN106530281A (en) * 2016-10-18 2017-03-22 国网山东省电力公司电力科学研究院 Edge feature-based unmanned aerial vehicle image blur judgment method and system
WO2018076776A1 (en) * 2016-10-25 2018-05-03 深圳光启合众科技有限公司 Robot, robotic arm and control method and device thereof
CN107255446A (en) * 2017-08-01 2017-10-17 南京农业大学 A kind of Cold region apple fruit tree canopy three-dimensional map constructing system and method
CN107255446B (en) * 2017-08-01 2020-01-07 南京农业大学 Dwarfing close-planting fruit tree canopy three-dimensional map construction system and method
CN107593200A (en) * 2017-10-31 2018-01-19 河北工业大学 A kind of trees plant protection system and method based on visible ray infrared technique
CN107593200B (en) * 2017-10-31 2022-05-27 河北工业大学 Tree plant protection system and method based on visible light-infrared technology
CN107992868A (en) * 2017-11-15 2018-05-04 辽宁警察学院 A kind of High Precision Stereo footprint Quick Acquisition method
CN109059869A (en) * 2018-07-27 2018-12-21 仲恺农业工程学院 Method for detecting spraying effect of plant protection unmanned aerial vehicle on fruit trees
CN109059869B (en) * 2018-07-27 2020-07-21 仲恺农业工程学院 Method for detecting spraying effect of plant protection unmanned aerial vehicle on fruit trees
CN111699361B (en) * 2018-09-07 2022-05-27 深圳配天智能技术研究院有限公司 Method and device for measuring distance
WO2020047863A1 (en) * 2018-09-07 2020-03-12 深圳配天智能技术研究院有限公司 Distance measurement method and apparatus
CN111699361A (en) * 2018-09-07 2020-09-22 深圳配天智能技术研究院有限公司 Method and device for measuring distance
CN109699623A (en) * 2019-02-27 2019-05-03 西安交通大学 A kind of Multifunctional plant protection machine people's system
CN110191330A (en) * 2019-06-13 2019-08-30 内蒙古大学 Depth map FPGA implementation method and system based on binocular vision green crop video flowing
CN112167212A (en) * 2019-07-02 2021-01-05 上海临石信息科技有限公司 Unmanned aerial vehicle pesticide spraying control system and method
CN110584962A (en) * 2019-08-28 2019-12-20 西安工业大学 Combined obstacle-detection intelligent blind-guiding system
CN111762086A (en) * 2019-12-19 2020-10-13 广州极飞科技有限公司 Spraying control method, device and system and carrier
CN111109240A (en) * 2020-01-03 2020-05-08 东北农业大学 Multi-information fusion variable pesticide spraying method and device
CN111109240B (en) * 2020-01-03 2023-09-29 东北农业大学 Multi-information fusion variable spraying device
WO2021159717A1 (en) * 2020-02-14 2021-08-19 苏州浪潮智能科技有限公司 Content self-adaptive binocular matching method and device
US11651507B2 (en) 2020-02-14 2023-05-16 Inspur Suzhou Intelligent Technology Co., Ltd. Content-adaptive binocular matching method and apparatus
CN111953933B (en) * 2020-07-03 2022-07-05 北京中安安博文化科技有限公司 Method, device, medium and electronic equipment for determining fire area
CN111953933A (en) * 2020-07-03 2020-11-17 北京中安安博文化科技有限公司 Method, device, medium and electronic equipment for determining fire area
CN111990378A (en) * 2020-08-25 2020-11-27 淮阴工学院 Spraying control method for spraying robot
CN112889786A (en) * 2021-01-15 2021-06-04 吉林农业大学 Pesticide spraying system capable of tracking field crop seedling areas in real time and control method

Also Published As

Publication number Publication date
CN102800083B (en) 2014-12-10

Similar Documents

Publication Publication Date Title
CN102800083B (en) Crop spraying positioning method based on binocular vision gridding partition matching algorithm
Abbas et al. Different sensor based intelligent spraying systems in Agriculture
Berk et al. Development of alternative plant protection product application techniques in orchards, based on measurement sensing systems: A review
CN103049912B (en) Random trihedron-based radar-camera system external parameter calibration method
CN102688823B (en) Atomizing positioning device and method based on hand-eye atomizing mechanical arm
Qingchun et al. Design of structured-light vision system for tomato harvesting robot
CN106347919A (en) Automatic warehousing system
CN107831777A (en) A kind of aircraft automatic obstacle avoiding system, method and aircraft
Zhang et al. Review of variable-rate sprayer applications based on real-time sensor technologies
Gao et al. A spraying path planning algorithm based on colour-depth fusion segmentation in peach orchards
CN105557672A (en) Fruit tree target detection system
KR101109337B1 (en) Automatic pest recognition and control system and method
CN109032174B (en) Unmanned aerial vehicle operation route planning method and operation execution method
CN110806585B (en) Robot positioning method and system based on trunk clustering tracking
CN102422832B (en) Visual spraying location system and location method
Ma et al. Rice row tracking control of crawler tractor based on the satellite and visual integrated navigation
WO2023082482A1 (en) Variable spray control system based on annular pesticide application structure and plant canopy volume calculation method
CN205390106U (en) Fruit tree target detection system
Moreno et al. Proximal sensing for geometric characterization of vines: A review of the latest advances
Song et al. Navigation algorithm based on semantic segmentation in wheat fields using an RGB-D camera
Binbin et al. Research progress on autonomous navigation technology of agricultural robot
Li et al. Identification of the operating position and orientation of a robotic kiwifruit pollinator
Liu et al. Extracting visual navigation line between pineapple field rows based on an enhanced YOLOv5
Paturkar et al. Overview of image-based 3D vision systems for agricultural applications
Li et al. Design of multifunctional seedbed planting robot based on MobileNetV2-SSD

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141210

Termination date: 20150619

EXPY Termination of patent right or utility model