CN102183250A - Automatic navigation and positioning device and method for field road of agricultural machinery - Google Patents


Info

Publication number
CN102183250A
CN102183250A (application CN201110007784A); granted as CN102183250B
Authority
CN
China
Prior art keywords
coloured
vision sensor
omnibearing vision
sign
coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN 201110007784
Other languages
Chinese (zh)
Other versions
CN102183250B (en)
Inventor
李明 (Li Ming)
戴思慧 (Dai Sihui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Agricultural University
Original Assignee
Hunan Agricultural University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Agricultural University filed Critical Hunan Agricultural University
Priority to CN201110007784A priority Critical patent/CN102183250B/en
Publication of CN102183250A publication Critical patent/CN102183250A/en
Application granted granted Critical
Publication of CN102183250B publication Critical patent/CN102183250B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention discloses an automatic navigation and positioning device and method for the field roads of agricultural machinery. Two coloured marks are set on the same side of the road on which an operating vehicle travels; an omnidirectional vision sensor for capturing an omnidirectional image is mounted on the vehicle to be positioned; the vehicle also carries a microprocessor comprising 1) an image-processing unit and 2) a positioning-coordinate calculation unit. The image-processing unit obtains, from the omnidirectional image, the distances r1 and r2 from the projection centre to the two coloured marks and the azimuth angle θ1 that the two marks form with the sensor; the positioning-coordinate calculation unit then solves for the positioning coordinates according to the automatic field-road navigation and positioning method. The device and method can substitute for or complement GPS positioning, and have the advantages of fast running speed, wide range of application, strong practicality and low cost.

Description

Automatic field-road navigation and positioning device and method for agricultural machinery
Technical field
The invention belongs to the field of computer-vision image-processing applications, and relates to an automatic field-road navigation and positioning device and method for agricultural machinery.
Background technology
At present, ever higher degrees of automation are demanded of machinery, and automated operation and robotics are the direction of future mechanization. Positioning is essential for autonomous working vehicles and robots: only when a vehicle or robot knows its own position during automatic operation can it decide how to carry out its next task as required. Image-based positioning extracts features from images and, by working range, divides into local and global positioning. Local positioning applies within a limited area; computer image processing generally extracts a distinctive feature object as the positioning reference, and such objects may be natural or artificially placed. Global positioning is exemplified by the currently popular GPS. By the nature of the result, positioning divides into geometric and topological: geometric positioning yields the robot's exact location, while topological positioning only places the robot on a path within some admissible range, without a detailed geometric position. Robot positioning has been studied extensively; outdoor robots and vehicles today rely mainly on GPS, but its error is relatively large, and in areas shaded by woods or houses the microwave signal cannot penetrate and GPS cannot be used. RTK-GPS, a recent higher-precision development of the GPS technique, is very expensive. For ordinary vehicles and working machines, and especially agricultural machinery and farm transport vehicles, the operating environment is harsh, particularly for greenhouse operation, and GPS may not achieve automatic navigation and positioning at all.
The omnidirectional vision sensor is a novel vision sensor. It provides rich information over a 360-degree range, its image does not change as the sensor rotates, it helps reduce some of the negative effects of wheel slip and vibration, and it is inexpensive, so it is widely used in robotics. It has also been applied to robot localization, but the usual approach is to place artificial feature objects in the operating environment, or use natural ones, extract feature values by image processing, and then find feature points or feature regions by image matching for topological localization; such methods generally involve complex computation and long running time, and their practicality is limited. The invention patent "An automatic navigation and positioning device and method" (application No. 200910044412.4) developed, for working vehicles and robots in relatively fixed environments such as fields, greenhouses and workshops, and in areas where GPS is unusable, an automatic navigation and positioning device and method based on an omnidirectional vision sensor and computer-vision image processing; it can replace GPS on some special occasions, and it runs fast, has a wide range of use, is practical and is cheap. That invention, however, uses four marks together with the omnidirectional vision sensor to complete positioning. Since field roads are narrow, arranging four marks is time-consuming and obstructs the travel of agricultural machinery, while GPS remains expensive and subject to restrictions on use.
Summary of the invention
The technical problem to be solved by this invention is to propose an automatic field-road navigation and positioning device and method for agricultural machinery that can replace GPS for positioning, with fast running speed, a wide range of use, strong practicality and low price.
The technical solution of the present invention is as follows:
An automatic field-road navigation and positioning method for agricultural machinery comprises the following steps:
Step 1: preparation: set two coloured marks L1 and L2 on the same side of the road, and mount an omnidirectional vision sensor on the operating vehicle to be positioned; the sensor captures the current omnidirectional image containing the images of the two coloured marks.
Step 2: measure the imaging distances from the projection centre of the omnidirectional image to the images of the two coloured marks, and from them calculate, by the formula below, the distances r1 and r2 from the operating vehicle to the two coloured marks in the actual environment; also obtain from the omnidirectional image the azimuth angle θ1 formed by the two coloured marks with the omnidirectional vision sensor.
The formula relating the imaging distance to the spatial distance is:

((-2c·xi·H)/(x0·f + xi·H) + c)²/b² − ((2c·x0·xi)/(x0·f + xi·H))²/a² = 1,  x0, xi > 0;

where a, b and c are the structural parameters of the hyperboloid mirror surface of the omnidirectional vision sensor, f is the focal length, and H is the height of the omnidirectional vision system; a, b, c, f and H are fixed values;
xi is the imaging distance, i.e. the distance from the image point, in the omnidirectional image, of the spatial point P at a given coloured mark to the projection centre of the omnidirectional vision sensor;
x0 is the spatial distance, i.e. the distance from the omnidirectional vision sensor to that coloured mark.
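As an illustration of step 2, the distance relation above can be inverted numerically. The sketch below is a minimal Python example; all parameter values (a, b, H, f) and the root-search bracket are illustrative assumptions, not values taken from the patent. A forward model of the Fig. 6 geometry generates a synthetic measurement, and bisection then recovers the spatial distance x0 from the imaging distance xi:

```python
import math

# Illustrative (assumed) sensor parameters, in metres.
A_MIRROR = 0.03   # hyperboloid parameter a
B_MIRROR = 0.04   # hyperboloid parameter b
H_SYS    = 1.00   # height H of the vision system
FOCAL    = 0.008  # camera focal length f
C_MIRROR = math.hypot(A_MIRROR, B_MIRROR)  # c = sqrt(a^2 + b^2)

def imaging_residual(x0: float, xi: float) -> float:
    """Left-hand side minus 1 of the step-2 relation between x0 and xi."""
    denom = x0 * FOCAL + xi * H_SYS
    z_term = -2.0 * C_MIRROR * xi * H_SYS / denom + C_MIRROR
    x_term = 2.0 * C_MIRROR * x0 * xi / denom
    return (z_term / B_MIRROR) ** 2 - (x_term / A_MIRROR) ** 2 - 1.0

def spatial_distance(xi: float, lo: float = 0.5, hi: float = 50.0) -> float:
    """Recover x0 from xi by bisection (assumes one root in [lo, hi])."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if imaging_residual(lo, xi) * imaging_residual(mid, xi) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def forward_image_distance(x0: float) -> float:
    """Forward model: ground point at distance x0 -> imaging distance xi.

    The incident ray from P(x0, 0) through the focus O_M(0, H) meets the
    hyperboloid mirror at M, which is then projected through the second
    focus O_c(0, H - 2c) with focal length f (the Fig. 6 geometry).
    """
    qa = (H_SYS / B_MIRROR) ** 2 - (x0 / A_MIRROR) ** 2
    qb = -2.0 * C_MIRROR * H_SYS / B_MIRROR ** 2
    qc = (A_MIRROR / B_MIRROR) ** 2
    disc = math.sqrt(qb * qb - 4.0 * qa * qc)
    # Keep the intersection that lies between P and the focus (0 < t < 1).
    t = next(r for r in ((-qb - disc) / (2 * qa), (-qb + disc) / (2 * qa))
             if 0.0 < r < 1.0)
    xm, zm = x0 * t, H_SYS * (1.0 - t)
    return FOCAL * xm / (zm - H_SYS + 2.0 * C_MIRROR)

xi = forward_image_distance(5.0)   # synthetic measurement for x0 = 5 m
x0_recovered = spatial_distance(xi)
```

With these assumed parameters the bisection recovers the 5 m ground distance from the synthetic image measurement; in practice a library root-finder could replace the hand-rolled bisection.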
Step 3: obtain the coordinates of three candidate positions: (x_I1, y_I1), (x_I2, y_I2) and (x_I3, y_I3).
Step 4: obtain the positioning coordinates from the coordinates of the three candidate positions and complete the positioning: the positioning coordinates (x_I, y_I), i.e. the geometric position of the working machine on the field road, are obtained from

x_I = (x_I1 + x_I2 + x_I3)/3,  y_I = (y_I1 + y_I2 + y_I3)/3.
In step 3, the coordinates of the three candidate positions are obtained as follows:
First, a plane coordinate system of the actual environment is set up with one coloured mark as the origin and the line through the two marks as one coordinate axis; (x_a, y_a) are the coordinates of mark L1 in this plane coordinate system, and (x_b, y_b) are the coordinates of mark L2.
I) Coordinates of the first candidate position I1:
From the circle-intersection system

(x − x_a)² + (y − y_a)² = r1²
(x − x_b)² + (y − y_b)² = r2²

two candidate coordinate solutions are obtained; applying the road-width limit and the constraint that the coloured marks are arranged on the same side of the road, one of the solutions is determined to be the coordinates (x_I1, y_I1) of the first candidate position I1. Here r1 and r2 also denote the line segments from the operating vehicle to the two coloured marks.
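The circle-intersection step can be sketched as follows. This is a minimal illustration, not the patent's code: it assumes the two marks lie on the x-axis side of the plane coordinate system with the road on the y > 0 side, so the road-width limit becomes a simple filter on the y coordinate:

```python
import math

def circle_intersections(xa, ya, r1, xb, yb, r2):
    """Intersection points of the two circles centred on marks L1 and L2."""
    d = math.hypot(xb - xa, yb - ya)
    # Offset along the L1->L2 axis, then the perpendicular offset.
    along = (r1 ** 2 - r2 ** 2 + d ** 2) / (2.0 * d)
    perp2 = r1 ** 2 - along ** 2
    if perp2 < 0:
        return []  # inconsistent distance measurements: circles do not meet
    perp = math.sqrt(perp2)
    ux, uy = (xb - xa) / d, (yb - ya) / d   # unit vector L1 -> L2
    px, py = -uy, ux                        # unit normal to it
    bx, by = xa + along * ux, ya + along * uy
    return [(bx + perp * px, by + perp * py),
            (bx - perp * px, by - perp * py)]

def first_candidate(xa, ya, r1, xb, yb, r2, road_width):
    """Apply the road-width limit: keep the solution whose cross-road
    offset is positive and does not exceed the road width (the marks sit
    on the roadside, so the vehicle must lie on one known side)."""
    for x, y in circle_intersections(xa, ya, r1, xb, yb, r2):
        if 0.0 < y <= road_width:
            return (x, y)
    return None

# Worked example: vehicle actually at (8.0, 1.5) m, marks 20 m apart.
r1, r2 = math.hypot(8.0, 1.5), math.hypot(12.0, 1.5)
pos = first_candidate(0.0, 0.0, r1, 20.0, 0.0, r2, road_width=2.0)
```

With exact distances the filter discards the mirror solution on the far side of the mark line and returns the vehicle position.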
II) Coordinates (x_I2, y_I2) of the second candidate position I2:

x_I2 = L12 − r1·cos θ,  y_I2 = r1·sin θ,

where L12 is the straight-line distance between the coloured marks L1 and L2 (L12 also denotes the line joining the two marks), and θ is the angle formed by r1 and L12;

sin θ = √(1 − cos²θ),  cos θ = (r1² + L12² − X²)/(2·L12·r1),
X = r1·cos θ1 + √((r1·cos θ1)² − r1² + L12²),

where X is the side opposite the angle θ in the triangle.
III) Coordinates (x_I3, y_I3) of the third candidate position I3:

x_I3 = L12 − r2·cos α,  y_I3 = r2·sin α,

where

sin α = √(1 − cos²α),  cos α = (r2² + L12² − X′²)/(2·L12·r2),
X′ = r2·cos θ1 + √((r2·cos θ1)² − r2² + L12²);

in these formulas, α is the angle formed by r2 and L12, and X′ is the side opposite the angle α in the triangle.
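The three candidate positions and their mean can be sketched together. Note that this sketch normalizes all three constructions to a single assumed frame (origin at mark L1, x-axis from L1 towards L2, vehicle on the y > 0 side); this single-frame reading is my interpretation of the text, not a verbatim transcription of its expressions:

```python
import math

def candidates(r1, r2, theta1, L12):
    """Three candidate positions of step 3 I)-III) and their mean (step 4).

    Assumed frame: origin at mark L1, x-axis from L1 towards L2, vehicle
    on the y > 0 side. theta1 is the azimuth angle between the two marks
    as seen from the sensor; r1, r2 are the measured spatial distances.
    """
    # I1: circle intersection, keeping the y > 0 solution.
    x1 = (r1 ** 2 - r2 ** 2 + L12 ** 2) / (2.0 * L12)
    y1 = math.sqrt(r1 ** 2 - x1 ** 2)

    # I2: estimate the opposite side X from theta1 and r1, then apply the
    # law of cosines in the triangle (r1, X, L12); the angle is at L1.
    X = r1 * math.cos(theta1) + math.sqrt(
        (r1 * math.cos(theta1)) ** 2 - r1 ** 2 + L12 ** 2)
    cos_t = (r1 ** 2 + L12 ** 2 - X ** 2) / (2.0 * L12 * r1)
    x2 = r1 * cos_t
    y2 = r1 * math.sqrt(1.0 - cos_t ** 2)

    # I3: same construction from the L2 end, with X' from theta1 and r2;
    # the angle alpha is at L2, so x is measured back from L12.
    Xp = r2 * math.cos(theta1) + math.sqrt(
        (r2 * math.cos(theta1)) ** 2 - r2 ** 2 + L12 ** 2)
    cos_a = (r2 ** 2 + L12 ** 2 - Xp ** 2) / (2.0 * L12 * r2)
    x3 = L12 - r2 * cos_a
    y3 = r2 * math.sqrt(1.0 - cos_a ** 2)

    pts = [(x1, y1), (x2, y2), (x3, y3)]
    xI = sum(p[0] for p in pts) / 3.0   # step 4: averaged positioning fix
    yI = sum(p[1] for p in pts) / 3.0
    return pts, (xI, yI)

# Worked example: true position (8.0, 1.5) m, marks 20 m apart; theta1 is
# generated from the same geometry via the law of cosines.
r1, r2 = math.hypot(8.0, 1.5), math.hypot(12.0, 1.5)
theta1 = math.acos((r1 * r1 + r2 * r2 - 400.0) / (2.0 * r1 * r2))
pts, fix = candidates(r1, r2, theta1, 20.0)
```

With exact, noise-free inputs all three candidates coincide with the true position; with noisy measurements they differ, and the averaging of step 4 reduces the error.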
The road-width limit refers mainly to the program design: if the road is, say, 2 m wide, the value obtained must be less than 2 m, and it must be a positive number.
Note that X is obtained using the azimuth angle θ1 and the distance r1; X is not r2.
The imaging distance is obtained as follows:
first calculate the pixel distance from the image of a coloured mark to the projection centre of the omnidirectional vision sensor, then multiply this pixel distance by the calibrated pixel pitch to obtain the imaging distance.
The calibrated pixel pitch is 0.01389 inch/pixel.
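A minimal sketch of this pixel-to-distance conversion; the millimetre conversion is added here for convenience and is not part of the text:

```python
# Pixel-to-distance calibration. The 0.01389 inch/pixel pitch is the
# calibrated value stated in the text; 25.4 mm/inch is the standard
# unit conversion.
PIXEL_PITCH_INCH = 0.01389
INCH_TO_MM = 25.4

def imaging_distance_mm(pixel_distance: float) -> float:
    """Convert a radial pixel distance on the image to millimetres."""
    return pixel_distance * PIXEL_PITCH_INCH * INCH_TO_MM

d = imaging_distance_mm(100.0)   # e.g. a 100-pixel radial offset
```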
An automatic field-road navigation and positioning device for agricultural machinery: two coloured marks are set on the same side of the road on which the operating vehicle travels; the vehicle to be positioned carries an omnidirectional vision sensor for capturing omnidirectional images, and also a microprocessor comprising 1) an image-processing unit that obtains from the omnidirectional image the distances r1 and r2 from the projection centre to the two coloured marks and the azimuth angle θ1 formed by the two coloured marks with the sensor, and 2) a positioning-coordinate calculation unit that applies the automatic field-road navigation and positioning method to finally obtain the positioning coordinates.
The omnidirectional vision sensor uses a hyperboloid mirror.
The working principle of the present invention is as follows. The field-road automatic navigation and positioning device comprises an omnidirectional vision sensor mounted on the working machine by a tripod, two coloured marks arranged on one side of the road, a computer, and a power supply connected to the computer. The omnidirectional vision sensor comprises an upper cover, a lower cover, a curved mirror, a centre needle, a transparent housing, a USB camera and a USB cable for communicating with the computer. The curved mirror is bonded to the upper cover, and the upper cover screws onto the transparent housing; the lower cover is hollow in the middle and extends an approximately 4 mm threaded socket that receives the USB camera lens. The curved mirror may be spherical, paraboloidal or hyperboloidal; its characteristic is a single viewpoint that gathers data over a large range. The transparent housing is sleeved outside the curved mirror, supporting it and keeping out dust. The centre needle screws into the upper cover and passes through, closely adhering to, the centre of the curved mirror, effectively preventing specular reflection. The USB camera lens is threaded onto the housing and connected through the focusing thread, and the signal output of the USB camera is connected to the computer by the USB cable. The marks are artificial coloured marks with a circular or conical outline, coloured blue and red; a lamp is installed inside each mark to increase colour brightness, and a shade divides it in two, avoiding confusion between the two marks. The computer, as the control centre of the whole device, comprises hardware and software; the software implements image acquisition, processing, calculation and analysis, and issues guidance parameter control commands to the working machine. The computer is powered continuously by the working machine's generator.
Accordingly, the field-road automatic navigation and positioning method adopted by this device comprises four steps: mark setting, image data input, image processing and calculation, and result output.

Mark setting: two coloured marks are set on one side of the field road, with a spacing of 20 m between each pair of marks.

Image data input: the omnidirectional vision sensor photographs images in real time and inputs them to the computer through a real-time interface. The image provided by the omnidirectional vision sensor covers 360 degrees, contains a large amount of information, and is circular; the farther from the sensor centre, the lower the image resolution, and lengths are deformed, so the image analysis is rather complex. A major advantage of this sensor, however, is that the azimuth angles do not change.

Image processing and calculation, one of the innovations of the present invention, proceed as follows. First, the intensity of the coloured-mark colours is computed in the omnidirectional image input to the computer; according to colour thresholds set by experiment, the captured image is scanned line by line and the colour feature pixels above the thresholds are extracted; the feature pixels are partitioned into regions by a Euclidean-distance judgement; the centroids of the pixels in the red and blue pixel regions are calculated; whether a region pair is a mark is then judged from the distance between the red centroid and the blue centroid; finally, the centroid of all red and blue feature pixels is taken as the position of one coloured mark in the image. The scan of the whole image is completed in this way, locating the positions of the two coloured marks in the image. If only one coloured-mark feature point, or none, is extracted, the previous action is returned to and the next image is analysed. If two coloured-mark feature points are extracted, then, from the two mark positions extracted from the image: first, the azimuth angle formed by the two coloured marks with the omnidirectional vision sensor is obtained by image processing; second, from the position of the sensor in the image, the image distances of the two coloured marks from the sensor are calculated, and the spatial distances between the two marks and the sensor are derived through the sensor's imaging-parameter formula; from the known spatial positions of the two marks, two candidate spatial positions of the sensor are then obtained by the circle theorem, and the feasible candidate is chosen using the road-width limit; third, combining the azimuth angle formed by the marks and the sensor with the triangle relation given by the derived spatial distances, another two candidate spatial positions of the sensor are obtained; finally, the mean of the three candidate coordinates from the second and third parts is the spatial coordinate of the sensor, i.e. the geometric position of the working machine on the field road. From the relation between the working machine's own coordinates and the spatial coordinates, the guide angle and guiding distance are calculated.

Result output: the guide angle and guiding distance obtained in the above steps control the travel route of the working machine.
If lighting fixtures are installed inside the coloured marks and materials such as glass or transparent plastic are used, the present invention can also operate in weak light or at night.
Beneficial effect:
The field-road automatic navigation and positioning device and method for agricultural machinery of the present invention work as follows: the omnidirectional vision sensor continuously provides omnidirectional images online in real time; for each image, image processing extracts the positions of the marks in the image and computes the image distances and spatial distances; combining these with the azimuth angle relative to the marks, obtained via the principle that the imaging azimuth of the omnidirectional vision sensor is invariant, the absolute spatial coordinates of the sensor on the field road are calculated; further, from the relation between the vehicle's or robot's own coordinates and the spatial coordinates, the guide angle and guiding distance are calculated, providing the necessary positioning information for navigating the operating vehicle. Compared with other localization methods, this method is simple, fast-running and practical; the automatic navigation and positioning device is simple in structure and low in cost, and is easy to popularize and apply.
According to the imaging principle of the omnidirectional vision sensor, the azimuth angle θ1 obtained by image processing is much more accurate than the two distances obtained by image processing. Therefore, to reduce error and improve positioning accuracy, the present invention exploits the azimuth angle θ1 and finally takes the mean of the three candidate points as the positioning coordinates; this is the most critical innovation of the present invention.
Description of drawings
Fig. 1 is a structural diagram of the field-road automatic navigation and positioning device of the present invention;
Fig. 2 is a structural diagram of the omnidirectional vision sensor of the present invention;
Fig. 3 shows the implementation steps of the field-road automatic navigation and positioning method of the present invention;
Fig. 4 shows the mark azimuth angles in an example of the present invention;
Fig. 5 shows the point-imaging principle of the omnidirectional vision sensor of the present invention;
Fig. 6 is a diagram for deriving the relation among the system height, the sensor-to-mark spatial distance and the imaging distance in an example of the present invention;
Fig. 7 is a calculation diagram for candidate point I1 in an example positioning calculation of the present invention;
Fig. 8 is a calculation diagram for candidate point I2 in an example positioning calculation of the present invention;
Fig. 9 is a calculation diagram for candidate point I3 in an example positioning calculation of the present invention;
Fig. 10 is a positioning-calculation principle diagram of an example of the present invention.
In the above figures:
1 - mark L1; 2 - field road; 3 - working machine; 4 - mark L2; 5 - tripod; 6 - power supply; 7 - computer; 8 - omnidirectional vision sensor; 9 - curved mirror; 10 - centre needle; 11 - USB camera; 12 - USB cable; 13 - focusing thread; 14 - lens; 15 - lower cover; 16 - transparent housing; 17 - upper cover; 18 - mark setting; 19 - image data input; 20 - image processing and calculation; 21 - result output; 30 - imaging plane; 40 - hyperboloid.
Embodiment
The present invention is described in further detail below with reference to the drawings and specific embodiments:
An automatic field-road navigation and positioning device comprises an omnidirectional vision sensor mounted on the working machine, two coloured marks arranged on one side of the road, a computer connected to the signal output of the sensor, and a power supply for powering the computer.
The omnidirectional vision sensor comprises an upper cover, a lower cover, a transparent housing, a centre needle, a curved mirror, a USB camera and a USB cable for communicating with the computer; the upper and lower covers are mounted on the two ends of the transparent housing; the centre needle is fitted to the bottom of the upper cover and passes through, closely adhering to, the centre of the curved mirror; the transparent housing supports the curved mirror bonded to the upper cover; the USB camera lens is connected to the lower cover through the focusing thread.
The marks are artificial coloured marks with a circular or conical outline, coloured blue and red; a lamp is installed inside each mark to increase colour brightness, and a shade divides it in two, avoiding confusion between the two marks.
The present invention also provides an automatic navigation and positioning method corresponding to the above device, comprising mark setting, image data input, image processing and calculation, and result output. Mark setting: two coloured marks are set on one side of the field road; according to the imaging principle of the omnidirectional vision sensor, the distance between each pair of marks can generally be set to at least 20 m.
Image data input: the omnidirectional vision sensor photographs in real time the image containing the coloured marks and inputs it to the computer.
Image processing and calculation comprise: first, computing in the omnidirectional image input to the computer the intensity of the coloured-mark colours; scanning the captured image line by line according to the calculated colour thresholds and extracting the colour feature pixels above the thresholds; partitioning the feature pixels into regions by a Euclidean-distance judgement; calculating the centroids of the pixels in the red and blue pixel regions; then judging from the distance between the red centroid and the blue centroid whether a mark is present; finally taking the centroid of all red and blue feature pixels as the position of one coloured mark in the image; completing the scan of the whole image in this way and locating the positions of the two coloured marks in the image:
a) if only one coloured-mark feature point, or none, is extracted, the previous action is returned to and the next image is analysed;
b) if two coloured-mark feature points are extracted, then, from the two extracted mark positions in the image:
1) the azimuth angle formed by the two coloured marks with the omnidirectional vision sensor is obtained by image processing;
2) from the position of the sensor in the image, the image distances of the two coloured marks from the sensor are calculated, and the spatial distances between the two marks and the sensor are derived through the imaging-parameter formula; then, from the known spatial positions of the two marks, two candidate spatial positions of the sensor are obtained by the circle theorem, and the feasible candidate is chosen using the road-width limit;
3) combining the azimuth angle formed by the marks and the sensor with the triangle relation given by the spatial distances derived from the imaging parameters, another two candidate spatial positions of the sensor are obtained;
4) the mean of the three candidate coordinates obtained in 2) and 3) is the spatial coordinate of the sensor, i.e. the geometric position of the working machine on the field road; from the relation between the working machine's own coordinates and the spatial coordinates, the guide angle and guiding distance are calculated;
Result output: the guide angle and guiding distance obtained in the above steps control the travel route of the working machine.
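The text computes the guide angle and guiding distance from the relation between the machine's own coordinates and the spatial coordinates, but gives no formula for them. The following is therefore only a hypothetical sketch under an assumed geometry: the desired track is taken as the line y = y_ref parallel to the mark line, and the heading is measured from the road axis; both the function name and the parameters are my own illustrative choices:

```python
import math

def guidance(x_I, y_I, heading, y_ref=1.0):
    """Hypothetical guide-angle / guiding-distance computation.

    Assumes the desired track is the line y = y_ref parallel to the two
    marks and that `heading` is the machine's heading measured from the
    +x (road) axis, in radians.
    """
    guiding_distance = y_I - y_ref   # signed lateral offset from the track
    guide_angle = 0.0 - heading      # heading error relative to the track
    return guide_angle, guiding_distance

# Example: fix at (8.0, 1.5) m, machine heading 5 degrees off the road axis.
angle, dist = guidance(8.0, 1.5, math.radians(5.0))
```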
Embodiment 1:
As shown in Fig. 1, this embodiment comprises two coloured marks, L1 and L2 in the figure, arranged on the roadside of the same side of the field road, with the distance between L1 and L2 set to 20 m; the omnidirectional vision sensor is connected to the working machine by a tripod; the working machine may be an automatic or semi-automatic machine or a robot, and it continuously powers the computer so that the software can process the images.
As shown in Fig. 2, the omnidirectional vision sensor comprises a curved mirror, a centre needle, a USB camera, a USB cable, a focusing thread, a lens, a lower cover, a transparent housing and an upper cover. The curved mirror is bonded to the upper cover; the upper cover screws onto the transparent housing; the lower cover is hollow in the middle and extends a 4 mm threaded socket that receives the USB camera lens; the transparent housing is sleeved outside the curved mirror to support it and keep out dust; the centre needle is threaded into the upper cover and passes through, closely adhering to, the centre of the curved mirror, effectively preventing specular reflection; the USB camera lens is threaded into the lower cover, the focusing thread is mounted on the lens, and the signal output of the USB camera is connected to the computer by the USB cable.
As shown in Fig. 3, the field-road automatic navigation and positioning method of this embodiment comprises the implementation steps of mark setting, image data input, image processing and calculation, and result output, described in detail as follows:
Mark setting: before the working machine operates, two coloured marks L1 and L2 are placed on the roadside of the same side of the field road, with their spacing set to 20 m.
Image data input: the omnidirectional vision sensor captures 360-degree images in real time, one image at a time, which are input into the computer's memory through the real-time interface.
Image processing and calculation: the input image is called from memory into the running program. The image is scanned line by line, and the red and blue pixel intensities in the image are computed separately by formula (1) (the marks use red, blue, or a combined red-blue colour scheme). Thresholds for the extracted amounts of red and blue are set from experimental values: a pixel whose computed red intensity exceeds the red threshold is a red feature pixel, and a pixel whose computed blue intensity exceeds the blue threshold is a blue feature pixel; the red and blue feature pixels are extracted. A maximum red-pixel distance is set according to the largest red-pixel spread any mark can have in the image; the Euclidean distances between extracted red pixels are computed, and when the Euclidean distance between red pixels is less than the red distance maximum, they are judged to belong to one red feature-pixel region. Likewise, a maximum blue-pixel distance is set according to the largest blue-pixel spread any mark can have in the image, and when the Euclidean distance between blue pixels is less than the blue distance maximum, they are judged to belong to one blue feature-pixel region. The feature-pixel regionalization of the whole image is completed in this way.
[re, be] = [R − (B+G)/2 − |B−G|, B − (R+G)/2 − |R−G|]    (1)
R, G, B: red, green and blue brightness; re: extracted red-pixel brightness; be: extracted blue-pixel brightness.
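Formula (1) and the thresholding step can be sketched per pixel as follows; the threshold values here are placeholders for the experimentally set ones:

```python
def colour_features(R, G, B):
    """Red and blue emphasis of formula (1): (re, be) for one pixel."""
    re = R - (B + G) / 2.0 - abs(B - G)
    be = B - (R + G) / 2.0 - abs(R - G)
    return re, be

def classify(pixel, red_thresh=100.0, blue_thresh=100.0):
    """Label a pixel as a red or blue feature pixel, or neither.

    The thresholds stand in for the experimentally determined values.
    """
    re, be = colour_features(*pixel)
    if re > red_thresh:
        return "red"
    if be > blue_thresh:
        return "blue"
    return None

# Pure red, pure blue and pure green test pixels (R, G, B).
labels = [classify(p) for p in [(255, 0, 0), (0, 0, 255), (0, 255, 0)]]
```

Note how the |B−G| and |R−G| penalty terms suppress green and mixed colours, so only strongly red or strongly blue pixels survive the thresholds.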
On the image, a rectangular coordinate system u, v is defined with the top-left corner as origin, the coordinates of each pixel being in units of pixels. Formula (2) is used to compute the centroids of the pixels in the red and blue feature-pixel regions; the distance between the red-region centroid and the blue-region centroid is then calculated; whether a mark is present is judged from a distance-constraint value (an empirical value verified by experiment): when the computed centroid distance lies within the constraint range, a mark is judged to be present; otherwise, it is judged not to be a mark.
[Xre, Yre] = [ (1/n1) Σ_{i=1..n1} Riu, (1/n1) Σ_{i=1..n1} Riv ]
[Xbe, Ybe] = [ (1/n2) Σ_{i=1..n2} Biu, (1/n2) Σ_{i=1..n2} Biv ]    (2)
In the formula:
Xre, Yre: the red identification image coordinate that Flame Image Process is calculated; Riu, Riv: the image coordinate of extracting i red identification characteristics pixel; n 1:: the pixel total amount in red feature pixel zone.Xbe, Ybe: the blue identification image coordinate that Flame Image Process is calculated; Biu, Biv: extract i blue identification characteristics pixel image coordinate; n 2:: the pixel total amount in blue feature pixel zone.
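The centroid of formula (2) and the marker-pairing check can be sketched as below. The function names and the distance-constraint bounds are illustrative assumptions; the patent only requires that the bounds be empirical values verified by experiment.

```python
import numpy as np

def region_centroid(mask):
    """Centre of gravity (formula (2)) of a boolean feature-pixel region.

    Image coordinates (u, v) have their origin at the top-left corner,
    in pixel units: u is the column index, v the row index.
    Returns (u_mean, v_mean), or None for an empty region.
    """
    vs, us = np.nonzero(mask)          # np.nonzero yields (rows, cols)
    if us.size == 0:
        return None
    return us.mean(), vs.mean()

def is_marker(red_centroid, blue_centroid, d_min, d_max):
    """Judge whether a red/blue centroid pair forms one marker.

    d_min, d_max: the empirically verified distance-constraint bounds
    (application-specific values, not given in the patent text)."""
    du = red_centroid[0] - blue_centroid[0]
    dv = red_centroid[1] - blue_centroid[1]
    d = (du * du + dv * dv) ** 0.5
    return d_min <= d <= d_max
```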
If two coloured-marker feature points are successfully extracted, the program continues with the following steps and produces the calculation result; if only one or none is extracted, control returns to the previous step and the next image is analysed for feature points. After the positions of the two coloured markers in the image have been extracted, the azimuth angle θ_1 formed by the two coloured markers and the omnibearing vision sensor is obtained by image processing, as in Fig. 4; the small angular error obtained here is one of the main applications of the invention.
As shown in Fig. 5, during omnibearing vision sensor imaging, a spatial point P(X, Y, Z) and the image point p(x, y) it forms lie on the same radial direction: their azimuth angle θ with respect to the projection centre of the omnibearing vision sensor is identical. By the imaging principle, the spatial point and its image point lie in the same vertical plane.
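Because the omnidirectional projection preserves azimuth (Fig. 5), θ_1 can be read directly from the image as the difference of the polar angles of the two marker centroids about the projection centre. A minimal sketch, assuming the projection-centre pixel coordinates are known from calibration:

```python
import math

def azimuth_between(center, p1, p2):
    """Angle theta_1 between two markers as seen from the sensor.

    center: (u, v) of the projection centre in the image (assumed known
    from calibration); p1, p2: marker centroids in image coordinates.
    Returns the unsigned angle in [0, pi].
    """
    a1 = math.atan2(p1[1] - center[1], p1[0] - center[0])
    a2 = math.atan2(p2[1] - center[1], p2[0] - center[0])
    d = abs(a1 - a2) % (2.0 * math.pi)
    return min(d, 2.0 * math.pi - d)
```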
As in Fig. 6, a coordinate system XZ is defined. According to the omnibearing vision sensor imaging principle, for a ground point P(x_0, z_0) the incident ray i passes through the focus O_M(0, H), is reflected at the point M(x_m, z_m) on the hyperboloid, converges at O_c, and forms the image point p(x_i, z_i) on the imaging surface; O_c is the projection centre and f is the focal length. The general hyperbola equation is:
z^2 / b^2 - x^2 / a^2 = 1
from which equation (3) of the invention is derived:
(z_m - H + c)^2 / b^2 - x_m^2 / a^2 = 1,  c = sqrt(a^2 + b^2)    (3)
a, b and c are the structural parameters of the hyperboloid surface, for which the manufacturer normally supplies standard values; H is the height of the omnibearing vision sensor system above the ground. From the imaging principle of the omnibearing vision sensor, the cross-section through the spatial point and its image point can be represented in the coordinate system of Fig. 6. The distance from the omnibearing vision sensor to P(x_0, z_0) is defined as the sensor-to-marker space distance x_0; the distance from the image point p(x_i, z_i) to the projection centre of the omnibearing vision sensor is defined as the imaging distance x_i.
The points O_M(0, H), M(x_m, z_m) and P(x_0, z_0) lie on one straight line, which can be written as (4):
z_m = H - (H / x_0) x_m    (4)
From the similarity of the corresponding triangles, (5) follows:
x_i / x_m = f / (z_m - H + 2c)    (5)
Combining (4) and (5) gives (6):
x_m = 2c x_0 x_i / (x_0 f + x_i H)    (6)
Substituting (6) into (5) yields z_m:
z_m = (1 - 2c x_i / (x_0 f + x_i H)) H    (7)
Substituting (6) and (7) into (3) yields the following relationship between the system height of the omnibearing vision sensor, the sensor-to-marker distance and the imaging distance:
( c - 2c x_i H / (x_0 f + x_i H) )^2 / b^2 - ( 2c x_0 x_i / (x_0 f + x_i H) )^2 / a^2 = 1  (x_0, x_i > 0)    (8)
Therefore, in formula (8), a, b, c, f and H are fixed values; once the imaging distance x_i (the distance from the image point p(x_i, z_i) to the projection centre of the omnibearing vision sensor) is obtained, the sensor-to-marker space distance x_0 (the distance from the omnibearing vision sensor to P(x_0, z_0)) can be derived.
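One way to recover x_0 from x_i is to invert the imaging model numerically. The sketch below is an illustrative reconstruction under stated assumptions, not the patent's implementation: the forward model intersects the incident ray with the mirror sheet of equation (3) and applies (5), and the inversion uses bisection over an assumed search bracket; the mirror parameters in the example are arbitrary.

```python
import math

def image_distance(x0, a, b, f, H):
    """Forward model: radial imaging distance x_i for a ground point at
    horizontal distance x0 (equations (3)-(5)). a, b: hyperboloid
    parameters (c = sqrt(a^2 + b^2)); f: focal length; H: height of the
    mirror focus O_M above the ground. All values in the same units."""
    c = math.sqrt(a * a + b * b)
    s = H / x0  # slope of the incident ray from O_M(0, H) down to P(x0, 0)
    # Intersect z = H - s*x with the sheet z = H - c + b*sqrt(1 + x^2/a^2);
    # squaring gives (s^2 - b^2/a^2) x^2 - 2*c*s*x + a^2 = 0, whose
    # physical root (the one satisfying c - s*x > 0) is:
    x_m = (c * s - b * math.sqrt(s * s + 1.0)) / (s * s - (b * b) / (a * a))
    z_m = H - s * x_m
    # Equation (5): similar triangles through the projection centre O_c.
    return f * x_m / (z_m - H + 2.0 * c)

def marker_distance(x_i, a, b, f, H, lo=1.0, hi=5000.0, iters=100):
    """Invert the model by bisection to recover the sensor-to-marker
    distance x0 of equation (8). x_i grows with x0 over the working
    range; the bracket [lo, hi] is an assumed working range."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if image_distance(mid, a, b, f, H) < x_i:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With illustrative parameters a = 3, b = 4, f = 0.8 and H = 150 (arbitrary consistent units), a marker 300 units away round-trips through the forward model and the inversion to well under one unit of error.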
According to the position of the omnibearing vision sensor in the image, the image distances x_i of the two coloured markers from the omnibearing vision sensor are computed, and the space distances x_0 between the two coloured markers and the omnibearing vision sensor are then obtained from the sensor imaging parameters by the above transformation; the actual marker-to-sensor distances follow from the pixel-distance calibration. As in Fig. 7, a coordinate system xy is defined; from the known spatial positions of the two coloured markers, the circle equations (9) yield the two candidate spatial positions I1 and I1' of the omnibearing vision sensor. According to the road-width restriction and the principle that the markers are placed at the roadside, the feasible candidate I1 is chosen.
(x_I1 - x_a)^2 + (y_I1 - y_a)^2 = r_1^2,  (x_I1 - x_b)^2 + (y_I1 - y_b)^2 = r_2^2    (9)
where (x_a, y_a) is the spatial position of marker L_1; (x_b, y_b) is the spatial position of marker L_2; r_1 is the actual distance between marker L_1 and the omnibearing vision sensor; and r_2 is the actual distance between marker L_2 and the omnibearing vision sensor.
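Equation (9) is the classical two-circle intersection problem. A minimal sketch of its closed-form solution (the choice between the two returned candidates, per the road-width restriction, is left to the caller):

```python
import math

def circle_intersections(pa, pb, r1, r2):
    """Solve equation (9): the two candidate sensor positions at
    distance r1 from marker L1 (at pa) and r2 from marker L2 (at pb).
    Assumes the circles intersect (|r1 - r2| <= |pa - pb| <= r1 + r2)."""
    (xa, ya), (xb, yb) = pa, pb
    d = math.hypot(xb - xa, yb - ya)
    # Distance from pa along the centre line to the common chord.
    t = (d * d + r1 * r1 - r2 * r2) / (2.0 * d)
    h = math.sqrt(max(r1 * r1 - t * t, 0.0))   # half chord length
    ex, ey = (xb - xa) / d, (yb - ya) / d      # unit vector pa -> pb
    mx, my = xa + t * ex, ya + t * ey          # foot of the chord
    return (mx - h * ey, my + h * ex), (mx + h * ey, my - h * ex)
```

For markers at (0, 0) and (10, 0) with r1 = 6 and r2 = 8, the two candidates are (3.6, 4.8) and (3.6, -4.8), mirrored about the marker baseline as in Fig. 7.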
As in Fig. 8, combining the azimuth angle θ_1 formed by the markers and the omnibearing vision sensor with the space distance r_1 between the coloured marker L_1 and the omnibearing vision sensor, the triangle relations give (10)-(12):
X = r_1 cos θ_1 + sqrt( (r_1 cos θ_1)^2 - r_1^2 + L_12^2 )    (10)
cos θ = (r_1^2 + L_12^2 - X^2) / (2 L_12 r_1)    (11)
sin θ = sqrt(1 - cos^2 θ)    (12)
where L_12 is the geometric straight-line distance between the coloured markers L_1 and L_2; θ is the angle formed between r_1 and L_12; and X is the side of the triangle opposite the angle θ.
Formula (13) gives the coordinates of another candidate spatial position I2 of the omnibearing vision sensor:
x_I2 = L_12 - r_1 cos θ,  y_I2 = r_1 sin θ    (13)
As in Fig. 9, in the same manner as for I2, the coordinates of the candidate spatial position I3 of the omnibearing vision sensor are obtained from (14)-(17):
X' = r_2 cos θ_1 + sqrt( (r_2 cos θ_1)^2 - r_2^2 + L_12^2 )    (14)
cos α = (r_2^2 + L_12^2 - X'^2) / (2 L_12 r_2)    (15)
sin α = sqrt(1 - cos^2 α)    (16)
where L_12 is the geometric straight-line distance between the coloured markers L_1 and L_2; α is the angle formed between r_2 and L_12; and X' is the side of the triangle opposite the angle α.
Formula (17) gives the coordinates of the candidate spatial position I3:
x_I3 = L_12 - r_2 cos α,  y_I3 = r_2 sin α    (17)
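Formulas (10)-(13) and (14)-(17) can be transcribed directly as below. This is a faithful sketch of the formulas as stated; the coordinate-frame conventions follow Figs. 8 and 9 respectively, which are not reproduced here, so the two functions are presented independently rather than as a verified common frame.

```python
import math

def candidate_I2(r1, L12, theta1):
    """Formulas (10)-(13): X is the sensor-to-L2 side obtained from the
    law of cosines with the measured apex angle theta1; theta is the
    angle between r1 and the marker baseline L12 (frame of Fig. 8)."""
    X = r1 * math.cos(theta1) + math.sqrt(
        (r1 * math.cos(theta1)) ** 2 - r1 * r1 + L12 * L12)
    cos_t = (r1 * r1 + L12 * L12 - X * X) / (2.0 * L12 * r1)
    sin_t = math.sqrt(1.0 - cos_t * cos_t)
    return L12 - r1 * cos_t, r1 * sin_t

def candidate_I3(r2, L12, theta1):
    """Formulas (14)-(17): the same construction started from marker L2
    (frame of Fig. 9); alpha is the angle between r2 and L12."""
    Xp = r2 * math.cos(theta1) + math.sqrt(
        (r2 * math.cos(theta1)) ** 2 - r2 * r2 + L12 * L12)
    cos_a = (r2 * r2 + L12 * L12 - Xp * Xp) / (2.0 * L12 * r2)
    sin_a = math.sqrt(1.0 - cos_a * cos_a)
    return L12 - r2 * cos_a, r2 * sin_a
```

As a consistency check, for L12 = 10, r1 = sqrt(80), r2 = 10 and the matching apex angle theta1 = acos(1/sqrt(5)), formula (10) reproduces X = 10 = r2 and formula (14) reproduces X' = sqrt(80) = r1, as the triangle geometry requires.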
As in Fig. 10, to improve positioning accuracy, the coordinates of the three candidate positions I1-I3 obtained by the different calculation methods are used together; averaging their coordinates reduces the possible positioning error of any single candidate. The centre of gravity obtained from formula (18) is taken as the geometric position of the working machine on the field road.
x_I = (x_I1 + x_I2 + x_I3) / 3,  y_I = (y_I1 + y_I2 + y_I3) / 3    (18)
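The fusion step of formula (18) is a plain coordinate average of the candidates; a one-function sketch:

```python
def fuse_candidates(cands):
    """Formula (18): the centroid of the candidate positions is taken
    as the final fix, averaging out the error of any single method.
    cands: iterable of (x, y) pairs, e.g. [I1, I2, I3]."""
    pts = list(cands)
    n = float(len(pts))
    return sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n
```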
With the omnibearing vision sensor mounted on working machine 3, the geometric positioning of working machine 3 on this field road is thus achieved; from the transformation between the coordinate system of working machine 3 and the spatial coordinate system of road 2, the guidance angle and guidance distance are calculated.
Result output: the guidance angle and guidance distance are passed to the computer acting as the steering control device, which executes the travel route of the working machine.

Claims (6)

1. A field-road automatic navigation and positioning method for agricultural machinery, characterised by comprising the following steps:
Step 1, preparation: two coloured markers L_1 and L_2 are set on the same side of the road, and an omnibearing vision sensor is provided on the working vehicle to be positioned; the omnibearing vision sensor captures the current omnidirectional image containing the images of the two coloured markers;
Step 2: measure the imaging actual distances from the projection centre in the omnidirectional image to the images of the two coloured markers respectively, then calculate from the following formula the distances r_1 and r_2 from the working vehicle to the two coloured markers in the actual environment; and obtain from the omnidirectional image the azimuth angle θ_1 formed by the two coloured markers and the omnibearing vision sensor;
The formula for calculating the space distance from the imaging actual distance is as follows:
( c - 2c x_i H / (x_0 f + x_i H) )^2 / b^2 - ( 2c x_0 x_i / (x_0 f + x_i H) )^2 / a^2 = 1,  x_0, x_i > 0;
where a, b and c are the structural parameters of the hyperboloid surface of the omnibearing vision sensor, f is the focal length, and H is the height of the current omnidirectional vision system; a, b, c, f and H are fixed values;
x_i is the imaging actual distance, i.e. the distance from the image point, in the omnidirectional image, of the spatial point P at a given coloured marker to the projection centre of the omnibearing vision sensor;
x_0 is the space distance, i.e. the space distance from the omnibearing vision sensor to that coloured marker;
Step 3: obtain the coordinates of three candidate positions: (x_I1, y_I1), (x_I2, y_I2) and (x_I3, y_I3);
Step 4: obtain the positioning coordinates from the coordinates of the three candidate positions and complete the positioning: the positioning coordinates (x_I, y_I) are obtained from the following formula, accomplishing the geometric positioning of the machine on the field road:
x_I = (x_I1 + x_I2 + x_I3) / 3,  y_I = (y_I1 + y_I2 + y_I3) / 3.
2. The field-road automatic navigation and positioning method for agricultural machinery according to claim 1, characterised in that, in step 3, the coordinates of the three candidate positions are obtained as follows:
First, with one coloured marker as the origin and the line through the two markers as one coordinate axis, a plane coordinate system of the actual environment is established; (x_a, y_a) are the position coordinates of marker L_1 in said plane coordinate system, and (x_b, y_b) are those of marker L_2;
I) The coordinates of the first candidate position I1 are found as follows: from the formulas
(x_I1 - x_a)^2 + (y_I1 - y_a)^2 = r_1^2,  (x_I1 - x_b)^2 + (y_I1 - y_b)^2 = r_2^2,
two position-coordinate solutions are obtained; using the road-width restriction together with the constraint that the coloured markers are arranged on the same side of the road, one of the solutions is determined to be the coordinates (x_I1, y_I1) of the first candidate position I1; r_1 and r_2 also denote the line segments from the working vehicle to the two coloured markers;
II) The coordinates (x_I2, y_I2) of the second candidate position I2 are:
x_I2 = L_12 - r_1 cos θ,  y_I2 = r_1 sin θ,
where L_12 is the geometric straight-line distance between the coloured markers L_1 and L_2 (L_12 also denotes the line joining the two coloured markers), and θ is the angle formed between r_1 and L_12;
sin θ = sqrt(1 - cos^2 θ),  cos θ = (r_1^2 + L_12^2 - X^2) / (2 L_12 r_1),  X = r_1 cos θ_1 + sqrt( (r_1 cos θ_1)^2 - r_1^2 + L_12^2 ),
where X is the side opposite the angle θ in the triangle;
III) The coordinates (x_I3, y_I3) of the third candidate position I3 are:
x_I3 = L_12 - r_2 cos α,  y_I3 = r_2 sin α,
where
sin α = sqrt(1 - cos^2 α),  cos α = (r_2^2 + L_12^2 - X'^2) / (2 L_12 r_2),  X' = r_2 cos θ_1 + sqrt( (r_2 cos θ_1)^2 - r_2^2 + L_12^2 ),
in which α is the angle formed between r_2 and L_12, and X' is the side opposite the angle α in the triangle.
3. The field-road automatic navigation and positioning method for agricultural machinery according to claim 1 or 2, characterised in that the imaging actual distance is obtained as follows:
First calculate the imaging pixel distance from the image of a coloured marker to the projection centre of the omnibearing vision sensor, then multiply this imaging pixel distance by the calibrated pixel distance to obtain the imaging actual distance.
4. The field-road automatic navigation and positioning method for agricultural machinery according to claim 3, characterised in that the calibrated pixel distance is 0.01389 inch/pixel.
5. A field-road automatic navigation and positioning device for agricultural machinery, characterised in that two coloured markers are set on the same side of the road travelled by the working vehicle; an omnibearing vision sensor for capturing omnidirectional images is provided on the working vehicle to be positioned; a microprocessor is also provided on the working vehicle, the microprocessor comprising 1) an image processing unit for obtaining from the omnidirectional image the distances r_1 and r_2 from the projection centre to the two coloured markers and for obtaining from the omnidirectional image the azimuth angle θ_1 formed by the two coloured markers and the omnibearing vision sensor, and 2) a positioning-coordinate calculation unit for finally obtaining the positioning coordinates according to the field-road automatic navigation and positioning method for agricultural machinery of claim 2.
6. The field-road automatic navigation and positioning device for agricultural machinery according to claim 5, characterised in that said omnibearing vision sensor employs a hyperboloid mirror lens.
CN201110007784A 2011-01-14 2011-01-14 Automatic navigation and positioning device and method for field road of agricultural machinery Active CN102183250B (en)


Publications (2)

Publication Number Publication Date
CN102183250A true CN102183250A (en) 2011-09-14
CN102183250B CN102183250B (en) 2012-10-17

Family

ID=44569493

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110007784A Active CN102183250B (en) 2011-01-14 2011-01-14 Automatic navigation and positioning device and method for field road of agricultural machinery

Country Status (1)

Country Link
CN (1) CN102183250B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104390644A (en) * 2014-11-25 2015-03-04 浙江理工大学 Method for detecting field obstacle based on field navigation image collection equipment
CN105044754A (en) * 2015-07-01 2015-11-11 西安交通大学 Mobile platform outdoor positioning method based on multi-sensor fusion
CN105277206A (en) * 2015-10-26 2016-01-27 广西师范大学 Method for allowing hillside machine to travel linearly at equal intervals
CN106595630A (en) * 2015-10-14 2017-04-26 山东鲁能智能技术有限公司 Mapping system based on laser navigation substation patrol robot as well as method
CN108153303A (en) * 2017-12-13 2018-06-12 张庆国 The field road automatic navigating and positioning device and method of a kind of agricultural machinery
CN109634305A (en) * 2018-12-21 2019-04-16 国网安徽省电力有限公司淮南供电公司 UAV position and orientation method of adjustment and system based on visual aids positioning
CN109900238A (en) * 2017-12-08 2019-06-18 中国电信股份有限公司 Measurement method, device and the computer readable storage medium at antenna for base station angle
CN110533945A (en) * 2019-08-28 2019-12-03 肇庆小鹏汽车有限公司 Method for early warning, system, vehicle and the storage medium of traffic lights
CN114371717A (en) * 2022-01-21 2022-04-19 厦门理工学院 Hedge trimmer intelligent control method and system for tea leaf trimming

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6219790B2 (en) * 2014-07-29 2017-10-25 株式会社クボタ Work vehicle coordination system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5051735A (en) * 1987-09-25 1991-09-24 Honda Giken Kogyo Kabushiki Kaisha Heads-up display system for a road vehicle
CN1598487A (en) * 2004-07-23 2005-03-23 东北大学 Method for visual guiding by manual road sign
JP2005227181A (en) * 2004-02-13 2005-08-25 Toyota Motor Corp Mobile communication navigation system, mobile route searching method, and moving object
US20090024317A1 (en) * 2007-07-18 2009-01-22 Gm Global Technology Operations, Inc. System for gathering and distributing location information of vehicles
CN101660912A (en) * 2009-09-25 2010-03-03 湖南农业大学 Automatic navigating and positioning device and method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Ming et al., "Agricultural machinery positioning system based on omnidirectional vision sensor", Transactions of the Chinese Society of Agricultural Engineering (《农业工程学报》), Vol. 26, No. 2, 2010-02-28, pp. 170-174, relevant to claims 1-6 *




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant