CN102183250B - Automatic navigation and positioning device and method for field road of agricultural machinery - Google Patents


Info

Publication number
CN102183250B
CN102183250B (application CN201110007784A)
Authority
CN
China
Prior art keywords
coloured
vision sensor
omnibearing vision
road
sign
Prior art date
Legal status
Active
Application number
CN201110007784A
Other languages
Chinese (zh)
Other versions
CN102183250A (en)
Inventor
Li Ming (李明)
Dai Sihui (戴思慧)
Current Assignee
Hunan Agricultural University
Original Assignee
Hunan Agricultural University
Priority date
Filing date
Publication date
Application filed by Hunan Agricultural University
Priority to CN201110007784A
Publication of CN102183250A
Application granted
Publication of CN102183250B
Legal status: Active
Anticipated expiration

Abstract

The invention discloses an automatic navigation and positioning device and method for the field roads travelled by agricultural machinery. Two coloured marks are set up on the same side of the road along which the work vehicle travels; an omnidirectional vision sensor that captures an omnidirectional image is mounted on the work vehicle to be positioned, together with a microprocessor comprising 1) an image-processing unit and 2) a positioning-coordinate calculation unit. The image-processing unit obtains, from the omnidirectional image, the distances r1 and r2 from the projection centre to the two coloured marks and the azimuth angle θ1 formed by the two marks and the omnidirectional vision sensor; the positioning-coordinate calculation unit then solves for the positioning coordinates according to the automatic field-road navigation and positioning method for agricultural machinery. The device and method can substitute for, or compensate, global positioning system (GPS) positioning, and offer high running speed, a wide range of application, strong practicality and low cost.

Description

Automatic field-road navigation and positioning device and method for agricultural machinery
Technical field
The invention belongs to the field of computer-vision image-processing applications and relates to an automatic field-road navigation and positioning device and method for agricultural machinery.
Background technology
At present, ever higher degrees of automation are demanded of machinery; automated machine operation and the robots that perform it are the direction in which mechanisation is developing, and positioning is of central importance to automatic work vehicles and robots. During automatic operation a vehicle or robot must know its own position before it can decide how to carry out the next task as required. Image-processing-based localisation extracts features from images and, by the scope it covers, divides into local and global positioning. Local positioning applies to operation within a limited range; computer image processing generally extracts a distinctive feature object as the positioning reference, and such feature objects may be natural or artificially placed. Global positioning is currently dominated by technologies such as GPS. By the nature of the result, positioning divides into geometric and topological: geometric positioning yields the robot's exact location, while topological positioning only confirms that the robot lies on a path within some admissible range, without giving a detailed geometric position. Much research exists on robot positioning, and outdoor robots and vehicles today basically rely on GPS, but this technology has comparatively large errors, and in areas shaded by woods or buildings it cannot be used because the microwave signal does not get through. RTK-GPS, a recent refinement of GPS technology, achieves higher precision but is very expensive. Ordinary vehicles and work machines, especially agricultural machinery and farm transport vehicles, operate in rather harsh environments, particularly greenhouse operation, where the GPS technology may be unable to provide automatic navigation and positioning.
The omnidirectional vision sensor is a novel kind of vision sensor. It provides rich information over a 360-degree range, its image does not change as the sensor rotates, it helps reduce some of the negative effects of wheel slip and vibration, and it is inexpensive, so it is widely used in robotics and has also been applied to robot localisation. The usual approach, however, is to place artificial feature objects in the working environment (or exploit natural ones), extract feature values through image-processing techniques, and then find feature points or feature regions by image matching for topological localisation; the computation involved is generally complicated, the running time long and the practicality limited. The invention patent "An automatic navigation and positioning device and method" (application number 200910044412.4) developed, for working vehicles and robots whose operating environment is relatively fixed — fields, greenhouses, factory buildings and the like — and for zones where the GPS technology is unusable, an automatic navigation and positioning device and method based on an omnidirectional vision sensor and computer-vision image-processing techniques. That invention can replace GPS for positioning on certain special occasions, runs quickly, has a wide range of use, is practical and is cheap. However, it uses four marks in conjunction with the omnidirectional vision sensor to complete the positioning; since field roads are narrow, arranging four marks is time-consuming and obstructs the passage of the agricultural machinery, while GPS remains expensive and subject to the usage restrictions noted above.
Summary of the invention
The technical problem to be solved by the invention is to propose an automatic field-road navigation and positioning device and method for agricultural machinery that can replace GPS for positioning while running quickly, applying widely, being practical and costing little.
The technical solution of the present invention is as follows:
An automatic field-road navigation and positioning method for agricultural machinery comprises the following steps:
Step 1, preparation: set up two coloured marks L1 and L2 on the same side of the road, and mount an omnidirectional vision sensor on the work vehicle to be positioned; the omnidirectional vision sensor captures the current omnidirectional image containing the images of the two coloured marks.
Step 2: measure the imaging actual distances from the projection centre of the omnidirectional image to the images of the two coloured marks, then compute from the formula below the distances r1 and r2 from the work vehicle to the two coloured marks in the real environment; also obtain from the omnidirectional image the azimuth angle θ1 formed by the two coloured marks and the omnidirectional vision sensor.
The formula for computing the space distance from the imaging actual distance is:
$$\frac{\left(c-\dfrac{2cHx_i}{x_0f+x_iH}\right)^2}{b^2}-\frac{\left(\dfrac{2c\,x_0x_i}{x_0f+x_iH}\right)^2}{a^2}=1,\qquad x_0,\;x_i>0;$$
where a, b and c are the structural parameters of the hyperboloid mirror surface of the omnidirectional vision sensor, f is the focal length and H is the height of the omnidirectional vision system; a, b, c, f and H are all fixed values;
x_i is the imaging actual distance, i.e. the distance from the image point, in the omnidirectional image, of the spatial point P at a given coloured mark to the projection centre of the omnidirectional vision sensor;
x_0 is the space distance, i.e. the distance from the omnidirectional vision sensor to that coloured mark.
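The step-2 relation ties the measured image distance x_i to the space distance x_0 through a, b, c, f and H. As a minimal numerical sketch, x_0 might be recovered by a bracketing search; the mirror and camera values a = 0.03 m, b = 0.04 m, f = 4 mm and H = 1 m below are assumptions for illustration, not values from the patent:

```python
import math

def space_distance(x_i, a=0.03, b=0.04, f=0.004, H=1.0,
                   lo=1e-6, hi=100.0, tol=1e-10):
    """Solve the step-2 mirror-geometry equation for the space
    distance x0, given the measured image distance x_i (bisection)."""
    c = math.sqrt(a * a + b * b)

    def F(x0):
        t = x0 * f + x_i * H                      # common denominator
        return ((c - 2 * c * H * x_i / t) ** 2) / b ** 2 \
             - ((2 * c * x0 * x_i / t) ** 2) / a ** 2 - 1.0

    if F(lo) * F(hi) > 0:
        raise ValueError("no sign change: x_i outside usable range")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if F(lo) * F(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

A production implementation would bound `hi` by the calibrated working range of the mirror rather than a fixed 100 m.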
Step 3: obtain the coordinates of three candidate positions: (x_{I1}, y_{I1}), (x_{I2}, y_{I2}) and (x_{I3}, y_{I3}).
Step 4: obtain the positioning coordinates from the coordinates of the three candidate positions and complete the positioning. The positioning coordinates (x_I, y_I), i.e. the geometric fix of the machine on the field road, are obtained from:
$$x_I=\frac{x_{I1}+x_{I2}+x_{I3}}{3},\qquad y_I=\frac{y_{I1}+y_{I2}+y_{I3}}{3}.$$
In step 3, the coordinates of the three candidate positions are obtained as follows:
First, take one of the coloured marks as the origin of plane coordinates and the line through the two marks as one coordinate axis, establishing the plane coordinate system of the real environment; (x_a, y_a) are the position coordinates of mark L1 in this coordinate system and (x_b, y_b) those of mark L2.
I) Method for finding the coordinates of the first candidate position I1:

From the system of circle equations centred on the two marks,

$$(x-x_a)^2+(y-y_a)^2=r_1^2,\qquad (x-x_b)^2+(y-y_b)^2=r_2^2,$$

two position-coordinate solutions are obtained; the road-width limit, combined with the constraint that the coloured marks are arranged on the same side of the road, then determines one of the solutions as the coordinates (x_{I1}, y_{I1}) of the first candidate position I1. r1 and r2 also denote the line segments from the work vehicle to the two coloured marks;
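A sketch of candidate I1, assuming the marks sit at (x_a, y_a) = (0, 0) and (x_b, y_b) = (L12, 0) in the plane frame described above; the concrete 2 m road width follows the example given later in the text:

```python
import math

def candidate_I1(r1, r2, L12, road_width):
    """First candidate position: intersection of the circle of radius
    r1 about mark L1 (the origin) with the circle of radius r2 about
    mark L2 at (L12, 0); the road-width limit picks one of the two
    intersection points."""
    x = (r1 ** 2 - r2 ** 2 + L12 ** 2) / (2 * L12)
    y2 = r1 ** 2 - x ** 2
    if y2 < 0:
        raise ValueError("circles do not intersect")
    y = math.sqrt(y2)
    # the marks sit on one side of the road, so the feasible solution
    # has 0 < y < road_width; the mirrored solution (-y) is rejected
    for cand in ((x, y), (x, -y)):
        if 0 < cand[1] < road_width:
            return cand
    raise ValueError("no candidate satisfies the road-width limit")
```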
II) The coordinates (x_{I2}, y_{I2}) of the second candidate position I2 are:
$$x_{I2}=L_{12}-r_1\cos\theta,\qquad y_{I2}=r_1\sin\theta,$$

where L_{12} is the geometric straight-line distance between the coloured marks L1 and L2 (L_{12} also denotes the line joining the two coloured marks) and θ is the angle formed between r1 and L_{12};

$$\sin\theta=\sqrt{1-\cos^2\theta},\qquad \cos\theta=\frac{r_1^2+L_{12}^2-X^2}{2L_{12}r_1},\qquad X=r_1\cos\theta_1+\sqrt{(r_1\cos\theta_1)^2-r_1^2+L_{12}^2},$$

where X is the side opposite the angle θ in the triangle;
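A sketch of candidate I2, built from r1 and the azimuth θ1 alone: X re-derives the far-side distance from the law of cosines, then θ locates the vehicle relative to the mark line. The patent writes x_{I2} relative to mark L2; this sketch assumes, for illustration only, an origin at mark L1 with the x-axis toward L2 (so the x-coordinate becomes r1·cos θ), so that all candidates can be compared in one frame:

```python
import math

def candidate_I2(r1, L12, theta1):
    """Second candidate position from r1 and azimuth theta1.
    Frame assumed here: mark L1 at the origin, mark L2 at (L12, 0)."""
    # side opposite theta1, from L12^2 = r1^2 + X^2 - 2*r1*X*cos(theta1)
    X = r1 * math.cos(theta1) + math.sqrt(
        (r1 * math.cos(theta1)) ** 2 - r1 ** 2 + L12 ** 2)
    # angle between r1 and the mark line, at mark L1
    cos_t = (r1 ** 2 + L12 ** 2 - X ** 2) / (2 * L12 * r1)
    sin_t = math.sqrt(1 - cos_t ** 2)
    return (r1 * cos_t, r1 * sin_t)
```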
III) The coordinates (x_{I3}, y_{I3}) of the third candidate position I3 are:

$$x_{I3}=L_{12}-r_2\cos\alpha,\qquad y_{I3}=r_2\sin\alpha,$$

where

$$\sin\alpha=\sqrt{1-\cos^2\alpha},\qquad \cos\alpha=\frac{r_2^2+L_{12}^2-X'^2}{2L_{12}r_2},\qquad X'=r_2\cos\theta_1+\sqrt{(r_2\cos\theta_1)^2-r_2^2+L_{12}^2},$$

α being the angle formed between r2 and L_{12}, and X′ the side opposite the angle α in the triangle.
The road-width limit mainly refers to a bound applied in the program design: if, for example, the road is 2 m wide, the value obtained must be less than 2 m, and it must be a positive number.
Note that the quantities used to find X are the azimuth angle θ1 and the distance r1; X is a recomputed quantity, not the measured r2.
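Candidate I3 and the step-4 averaging can be sketched in the same assumed frame (origin at mark L1, x-axis toward L2); X′ is rebuilt from θ1 and r2 by the same law-of-cosines step:

```python
import math

def candidate_I3(r2, L12, theta1):
    """Third candidate position, symmetric to the second but driven by
    r2; alpha is the angle between r2 and the mark line at mark L2.
    Frame assumed: mark L1 at the origin, mark L2 at (L12, 0)."""
    Xp = r2 * math.cos(theta1) + math.sqrt(
        (r2 * math.cos(theta1)) ** 2 - r2 ** 2 + L12 ** 2)
    cos_a = (r2 ** 2 + L12 ** 2 - Xp ** 2) / (2 * L12 * r2)
    sin_a = math.sqrt(1 - cos_a ** 2)
    return (L12 - r2 * cos_a, r2 * sin_a)

def positioning_fix(c1, c2, c3):
    """Step 4: the fix is the arithmetic mean of the three candidates."""
    xs, ys = zip(c1, c2, c3)
    return (sum(xs) / 3, sum(ys) / 3)
```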
The imaging actual distance is obtained as follows: first compute the imaging pixel distance from the coloured mark's image to the projection centre of the omnidirectional vision sensor, then multiply this imaging pixel distance by the calibrated pixel pitch to obtain the imaging actual distance.
The calibrated pixel pitch is 0.01389 inch/pixel.
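The pixel-to-metric conversion can be sketched directly; the centroid and centre coordinates below are made-up example values, and the result carries the units of the calibration factor (inches here):

```python
import math

def imaging_actual_distance(mark_uv, centre_uv, inch_per_pixel=0.01389):
    """Imaging actual distance: pixel distance from a mark's image
    centroid to the projection centre, scaled by the calibrated
    pixel pitch (0.01389 inch/pixel as stated in the text)."""
    du = mark_uv[0] - centre_uv[0]
    dv = mark_uv[1] - centre_uv[1]
    return math.hypot(du, dv) * inch_per_pixel
```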
An automatic field-road navigation and positioning device for agricultural machinery: two coloured marks are set up on the same side of the road along which the work vehicle travels; an omnidirectional vision sensor for capturing omnidirectional images is mounted on the work vehicle to be positioned; the work vehicle also carries a microprocessor comprising 1) an image-processing unit that obtains from the omnidirectional image the distances r1 and r2 from the projection centre to the two coloured marks and the azimuth angle θ1 formed by the two coloured marks and the omnidirectional vision sensor, and 2) a positioning-coordinate calculation unit that applies the automatic field-road navigation and positioning method above to obtain the final positioning coordinates.
The omnidirectional vision sensor uses a hyperboloid mirror.
The working principle of the invention is as follows. The field-road automatic navigation and positioning device comprises an omnidirectional vision sensor mounted on the work machine by a tripod, two coloured marks placed on one side of the road, a computer and a power supply connected to the computer. The omnidirectional vision sensor consists of an upper cover, a lower cover, a curved mirror, a centre needle, a transparent housing, a USB camera and a USB cable for communicating with the computer. The curved mirror is bonded to the upper cover, and the upper cover screws onto the transparent housing; the lower cover is hollow in the middle, with a screw socket protruding about 4 mm that receives the USB camera lens. The curved mirror may be spherical, paraboloid or hyperboloid; its characteristic is a single viewpoint that gathers data over a large range. The transparent housing sleeves over the outside of the curved mirror, supporting it and keeping out dust. The centre needle screws into the upper cover and passes through, and adheres closely to, the centre of the curved mirror, effectively preventing specular reflection. The USB camera lens, threaded onto the transparent housing, connects to the focusing thread, and the signal output of the USB camera is connected to the computer through the USB cable. The marks are artificial, with a circular or conical designed outline, coloured in blue and red; a lamp is installed inside each mark to increase the colour brightness, and a shading plate divides it into two halves so that the two marks cannot be confused with each other. The computer, as the control centre of the whole device, comprises hardware and software; the software implements image acquisition, processing, calculation and analysis, and issues the guidance parameters and control program to the work machine. The computer is powered without interruption by the work machine's generator.
Accordingly, the field-road automatic navigation and positioning method adopted by this device comprises four steps: mark setup, image-data input, image processing and calculation, and result output.

Mark setup: two coloured marks are placed on one side of the field road, with a spacing of 20 m set between each pair of marks.

Image-data input: images are captured in real time by the omnidirectional vision sensor and fed into the computer through a real-time interface. The image provided by the omnidirectional vision sensor covers 360 degrees, carries a large amount of information and is circular. The further from the sensor's centre, the lower the image resolution becomes, and lengths become deformed, so image analysis is rather complicated; a great advantage of this omnidirectional vision sensor, however, is that the azimuth angle does not change.

Image processing and calculation (one of the innovative parts of the invention): first, compute in the input omnidirectional image the intensity of the coloured-mark colours; scan the collected image line by line against colour thresholds set by experiment and extract the colour feature pixels above threshold; partition them into feature-pixel regions according to Euclidean distance; compute the centroids of the red and blue pixel regions; then judge from the distance between the red-pixel centroid and the blue-pixel centroid whether a mark is present, the centroid of all red and blue feature pixels serving as the position of one coloured mark in this image. Complete the scan of the whole image in this way, locating the positions of the two coloured marks in the image. If only one coloured-mark feature point is extracted, or none, return to the previous action and analyse the next image; if two coloured-mark positions are extracted in the image, then: first, image processing yields the azimuth angle formed by the two adjacent coloured marks and the omnidirectional vision sensor; second, from the position of the omnidirectional vision sensor in the image, compute the image distance of each coloured mark from the sensor, convert these, using the sensor's imaging parameters, into the space distances between the two coloured marks and the sensor, and, from the known spatial positions of the two marks, use the circle theorem to obtain two candidate spatial positions of the sensor, choosing the feasible one according to the road-width limit; third, combining the azimuth angle formed by the marks and the sensor with the triangle relation given by the space distances derived from the imaging parameters, obtain another two candidate spatial positions of the sensor; finally, take the mean of the three candidate coordinates obtained in the second and third operations as the spatial coordinates of the omnidirectional vision sensor, i.e. the geometric fix of the work machine on the field road, and from the relation between the work machine's own coordinates and the space coordinates compute the guide angle and guiding distance.

Result output: control the travel route of the work machine according to the guide angle and guiding distance obtained in the steps above.
A lamp is fitted inside each coloured mark, which is made of glass, transparent plastic or similar materials, so the invention can also operate in weak light or at night.
Beneficial effect:
The automatic field-road navigation and positioning device and method for agricultural machinery of the present invention work as follows: the omnidirectional vision sensor continuously provides omnidirectional images online in real time; each positioning uses one image, from which image processing extracts the positions of the marks in the image and computes the image distances and space distances; combined with the principle that the azimuth angle is unchanged by omnidirectional imaging, the azimuth angles are obtained, and together these determine the absolute spatial coordinates of the omnidirectional vision sensor on the field road relative to the marks. Further, from the relation between the vehicle's or robot's own coordinates and the space coordinates, the guide angle and guiding distance are computed, providing the positioning information needed for navigating the work vehicle. Compared with other localisation methods this method is simple, fast in operation and practical, and the automatic navigation and positioning device is simple in structure and inexpensive, making it easy to popularise and apply.
By the imaging principle of the omnidirectional vision sensor, the azimuth angle θ1 obtained through image processing is far more accurate than the two distances obtained through image processing. Therefore, to reduce error and improve positioning accuracy, the present invention exploits the azimuth angle θ1 and finally takes the mean of the three candidate points as the positioning coordinates; this is the most critical innovation of the invention.
Description of drawings
Fig. 1 is a structural schematic of the field-road automatic navigation and positioning device according to the invention;
Fig. 2 is a structural schematic of the omnidirectional vision sensor according to the invention;
Fig. 3 shows the implementation steps of the field-road automatic navigation and positioning method according to the invention;
Fig. 4 shows the mark azimuth angles in an example of the invention;
Fig. 5 illustrates the point-imaging principle of the omnidirectional vision sensor of the invention;
Fig. 6 is a schematic for deriving the relation between the device system height and the sensor-to-mark space distance and imaging distance in an example of the invention;
Fig. 7 is a calculation schematic for candidate point I1 in an example positioning calculation of the invention;
Fig. 8 is a calculation schematic for candidate point I2 in an example positioning calculation of the invention;
Fig. 9 is a calculation schematic for candidate point I3 in an example positioning calculation of the invention;
Fig. 10 shows the positioning-calculation principle of an example of the invention.
In the above figures:
1: mark L1; 2: field road; 3: work machine; 4: mark L2; 5: tripod; 6: power supply; 7: computer; 8: omnidirectional vision sensor; 9: curved mirror; 10: centre needle; 11: USB camera; 12: USB cable; 13: focusing thread; 14: lens; 15: lower cover; 16: transparent housing; 17: upper cover; 18: mark setup; 19: image-data input; 20: image processing and calculation; 21: result output; 30: imaging plane; 40: hyperboloid.
Embodiment
The invention is explained in further detail below in conjunction with the accompanying drawings and a specific embodiment:
A field-road automatic navigation and positioning device comprises an omnidirectional vision sensor mounted on the work machine, two coloured marks placed on one side of the road, a computer connected to the signal output of the omnidirectional vision sensor, and a power supply for powering the computer.
The omnidirectional vision sensor comprises an upper cover, a lower cover, a transparent housing, a centre needle, a curved mirror, a USB camera and a USB cable for communicating with the computer, the upper and lower covers being mounted on the two ends of the transparent housing. The centre needle is fitted to the underside of the upper cover and passes through, and is pressed against, the centre of the curved mirror; the transparent housing supports the curved mirror bonded to the upper cover; and the USB camera lens, attached to the USB camera by the focusing thread, is connected to the lower cover.
The marks are artificial, with a circular or conical designed outline, and are coloured, comprising blue and red; a lamp is installed inside each mark to increase the colour brightness, and a shading plate divides it into two halves so that the two marks cannot be confused with each other.
The present invention also provides an automatic navigation and positioning method corresponding to the device above, comprising mark setup, image-data input, image processing and calculation, and result output. Mark setup: two coloured marks are set up on one side of the field road; in general, according to the imaging principle of the omnidirectional vision sensor, the spacing between each pair of marks can be set to at least 20 m.
Image-data input: an image containing the coloured marks is captured in real time by the omnidirectional vision sensor and fed into the computer.
Image processing and calculation: first, compute in the input omnidirectional image the intensity of the coloured-mark colours; scan the collected image line by line against the colour thresholds set by calculation, extracting the colour feature pixels above threshold; partition them into feature-pixel regions according to Euclidean distance; compute the centroids of the red and blue pixel regions; then judge from the distance between the red-pixel centroid and the blue-pixel centroid whether a mark is present, the centroid of all red and blue feature pixels serving as the position of one coloured mark in this image; complete the scan of the whole image in this way, locating the positions of the two coloured marks in the image:
a) if only one coloured-mark feature point is extracted, or none, return to the previous action and analyse the next image;
b) if two coloured-mark feature points are extracted, then, from the positions of the two coloured marks in the image:
1) obtain through image processing the azimuth angle formed by the two coloured marks and the omnidirectional vision sensor;
2) from the position of the omnidirectional vision sensor in the image, compute the image distance of each coloured mark from the sensor, convert these, via the imaging parameters, into the space distances between the two coloured marks and the sensor, and, from the known spatial positions of the two marks, use the circle theorem to obtain two candidate spatial positions of the sensor, choosing the feasible one according to the road-width limit;
3) combining the azimuth angle formed by the marks and the sensor with the triangle relation given by the space distances derived from the imaging parameters, obtain another two candidate spatial positions of the sensor;
4) the mean of the three candidate coordinates obtained in 2) and 3) is the spatial coordinate of the omnidirectional vision sensor, i.e. the geometric fix of the work machine on the field road; from the relation between the work machine's own coordinates and the space coordinates, compute the guide angle and guiding distance.
Result output: control the travel route of the work machine according to the guide angle and guiding distance obtained in the steps above.
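The text states only that the guide angle and guiding distance follow from relating the machine's own coordinates to the space coordinates, without giving formulas. The sketch below is therefore one plausible reading, assuming the desired path is a line parallel to the marks at lateral offset `y_ref`, with `heading` the vehicle yaw in the same frame and `lookahead` an assumed aiming distance:

```python
import math

def guidance(x_fix, y_fix, heading, y_ref, lookahead=5.0):
    """Illustrative guide-angle / guiding-distance computation from
    the positioning fix; not a formula given in the patent."""
    offset = y_ref - y_fix                    # signed guiding distance
    # steer toward a point `lookahead` ahead on the reference line
    desired = math.atan2(y_ref - y_fix, lookahead)
    guide_angle = desired - heading           # heading correction
    return guide_angle, offset
```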
Embodiment 1:
As in Fig. 1, the present embodiment comprises two coloured marks, shown in the figure as mark L1 and mark L2, placed separately on the roadside on the same side of the field road, with the distance between mark L1 and mark L2 set to 20 m. The omnidirectional vision sensor is attached to the work machine through a tripod; the work machine may be an automatic or semi-automatic machine or a robot, and it continuously supplies power to the computer so that the software can process the images.
As in Fig. 2, the omnidirectional vision sensor comprises a curved mirror, centre needle, USB camera, USB cable, focusing thread, lens, lower cover, transparent housing and upper cover. The curved mirror is bonded to the upper cover; the upper cover screws onto the transparent housing; the lower cover is hollow in the middle, with a 4 mm protruding screw socket that receives the USB camera lens; the transparent housing is sleeved outside the curved mirror to support it and keep out dust; the centre needle, threaded into the upper cover, passes through and adheres closely to the centre of the curved mirror, effectively preventing specular reflection; the USB camera lens is threaded into the lower cover, the focusing thread is fitted on the lens, and the signal output of the USB camera is connected to the computer through the USB cable.
As in Fig. 3, the field-road automatic navigation and positioning method of this embodiment comprises the implementation steps of mark setup, image-data input, image processing and calculation, and result output, described in detail as follows.
Mark setup: before the work machine operates, place two coloured marks L1 and L2 on the roadside on the same side of the field road, with the spacing set to 20 m.
Image-data input: capture 360-degree images in real time through the omnidirectional vision sensor, one image at a time, and feed them through the real-time interface into the computer's memory.
Image processing and calculation: the input image is loaded from memory into the program and scanned line by line. The red and blue pixel strengths in the image are computed with formula (1) (the marks use red, blue, or a red-blue combination colouring), and thresholds for the extracted red and blue colour amounts are set from experimental values: a pixel whose computed red strength exceeds the red threshold is a red feature pixel, and a pixel whose computed blue strength exceeds the blue threshold is a blue feature pixel; the red and blue feature pixels are extracted. A maximum red-pixel distance is set according to the largest red-pixel spread any mark can have in the image; the Euclidean distances between extracted red pixels are computed, and when the distance between red pixels is below this red maximum they are judged to belong to one red feature-pixel region. Likewise a maximum blue-pixel distance is set, and when the Euclidean distance between blue pixels is below this blue maximum they are judged to belong to one blue feature-pixel region. The feature-pixel regionalisation of the whole image is completed in this way.
[re, be] = [R − (B+G)/2 − |B−G|, B − (R+G)/2 − |R−G|]    (1)
R, G, B: red, green and blue brightness; re: extracted red-pixel strength; be: extracted blue-pixel strength.
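A per-pixel sketch of formula (1) and the line-by-line threshold scan; the thresholds are set by experiment in the text, so any concrete value supplied to this sketch is a placeholder:

```python
def color_strengths(R, G, B):
    """Formula (1): redness and blueness of one pixel from its
    R, G, B brightness."""
    re = R - (B + G) / 2 - abs(B - G)
    be = B - (R + G) / 2 - abs(R - G)
    return re, be

def feature_pixels(image, red_thr, blue_thr):
    """Line-by-line scan of `image` (rows of (R, G, B) tuples),
    collecting (u, v) coordinates of red and blue feature pixels."""
    reds, blues = [], []
    for v, row in enumerate(image):
        for u, (R, G, B) in enumerate(row):
            re, be = color_strengths(R, G, B)
            if re > red_thr:
                reds.append((u, v))
            if be > blue_thr:
                blues.append((u, v))
    return reds, blues
```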
On the image, define a rectangular coordinate system u, v with the origin at the top-left corner, the coordinates of each pixel being in units of pixels. The centroids of the red and blue feature-pixel regions are computed with formula (2); the distance between the red-region pixel centroid and the blue-region pixel centroid is then computed, and whether a mark is present is judged against a distance-constraint value for the two centroids (the constraint value being an empirical value verified by experiment): when the computed distance between the red-region centroid and the blue-region centroid lies within the constraint range, a mark is judged present; otherwise it is judged not to be a mark.
$$[X_{re},Y_{re}]=\left[\sum_{i=1}^{n_1}\frac{R_{iu}}{n_1},\;\sum_{i=1}^{n_1}\frac{R_{iv}}{n_1}\right],\qquad
[X_{be},Y_{be}]=\left[\sum_{i=1}^{n_2}\frac{B_{iu}}{n_2},\;\sum_{i=1}^{n_2}\frac{B_{iv}}{n_2}\right]\qquad(2)$$
where:
Xre, Yre: image coordinates of the red mark obtained by image processing; Riu, Riv: image coordinates of the i-th extracted red mark feature pixel; n1: total number of pixels in the red feature pixel region. Xbe, Ybe: image coordinates of the blue mark obtained by image processing; Biu, Biv: image coordinates of the i-th extracted blue mark feature pixel; n2: total number of pixels in the blue feature pixel region.
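The centroid of formula (2) and the centroid-distance mark test can be sketched as follows; the constraint bounds are placeholders standing in for the empirical values the patent verifies by experiment.

```python
import math

def centroid(pixels):
    """Formula (2): mean image coordinates (u, v) of a feature-pixel region."""
    n = len(pixels)
    return (sum(u for u, _ in pixels) / n, sum(v for _, v in pixels) / n)

def is_mark(red_region, blue_region, d_min=2.0, d_max=30.0):
    """Judge a red/blue region pair to be one mark when the distance
    between their centroids lies within the (assumed) constraint range."""
    (ur, vr), (ub, vb) = centroid(red_region), centroid(blue_region)
    return d_min <= math.hypot(ur - ub, vr - vb) <= d_max
```

A red region a few pixels above a blue region passes the test; a blue region far across the image does not.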
If the feature points of both coloured marks are successfully extracted, the program proceeds with the following steps and produces the calculation result; if only one, or neither, is extracted, the computer returns to the previous step and analyses the next feature-point image. After the positions of the two coloured marks in the image are extracted, image processing yields, as in Fig. 4, the azimuth angle θ_1 formed by the two coloured marks and the omnibearing vision sensor; the angular error obtained is small, which is one of the main applications of the invention.
As in Fig. 5, when the omnibearing vision sensor forms an image, a spatial point P(X, Y, Z) and the image point p(x, y) it forms lie on the same directional ray, so they share the same azimuth angle θ with respect to the projection centre of the omnibearing vision sensor. By the imaging principle, the spatial point and its image point lie in the same vertical plane.
As in Fig. 6, a coordinate system XZ is defined. According to the imaging principle of the omnibearing vision sensor, for a ground point P(x_0, z_0) the incident ray i passes through the focus O_M(0, H), is reflected at the point M(x_m, z_m) on the hyperboloid, converges at O_c, and forms the image point p(x_i, z_i) on the imaging surface; O_c is the projection centre and f is the focal length. From the general hyperboloid equation,
equation (3) used in the invention is derived:
(z_m − H + c)²/b² − x_m²/a² = 1;  c = √(a² + b²)    (3)
a, b and c are the structural parameters of the hyperboloid surface, for which the manufacturer normally provides standard values; H is the height of the omnibearing vision sensor system above the ground. From the imaging principle of the omnibearing vision sensor, the cross-section through the spatial point and its image point can be represented in the coordinate system of Fig. 6. The distance from the omnibearing vision sensor to P(x_0, z_0) is defined as the space distance x_0 between the omnibearing vision sensor and the mark; the distance from the image point p(x_i, z_i) to the projection centre of the omnibearing vision sensor is defined as the imaging distance x_i.
The points O_M(0, H), M(x_m, z_m) and P(x_0, z_0) lie on one straight line, which can be written as (4):
z_m = H − (H/x_0)·x_m    (4)
From the similarity of the triangles in the geometry, (5) follows:
x_i/x_m = f/(z_m − H + 2c)    (5)
Combining (4) and (5) gives (6):
x_m = 2c·x_0·x_i/(x_0·f + x_i·H)    (6)
Substituting (6) into (4) gives z_m:
z_m = (1 − 2c·x_i/(x_0·f + x_i·H))·H    (7)
Substituting (6) and (7) into (3) yields the following relation between the system height of the omnibearing vision sensor, the sensor-to-mark distance and the imaging distance:
(−2c·x_i·H/(x_0·f + x_i·H) + c)²/b² − (2c·x_0·x_i/(x_0·f + x_i·H))²/a² = 1   (x_0, x_i > 0)    (8)
Therefore, since a, b, c, f and H in (8) are fixed values, once the imaging distance x_i — the distance from the image point p(x_i, z_i) to the projection centre of the omnibearing vision sensor — is obtained, the space distance x_0 between the omnibearing vision sensor and the mark at P(x_0, z_0) can be derived.
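Since a, b, c, f and H are fixed, equation (8) can be inverted numerically for x_0. The sketch below pairs a forward model (ground distance to imaging distance via equations (3)–(5)) with a bisection inversion of (8); every parameter value is an illustrative assumption, not a value from the patent.

```python
import math

# Hyperboloid-mirror and camera parameters: illustrative assumptions (SI units).
a, b = 0.03, 0.04                 # hyperboloid structural parameters
c = math.sqrt(a ** 2 + b ** 2)    # c = sqrt(a^2 + b^2), per equation (3)
f, H = 0.008, 1.5                 # focal length; sensor height above ground

def image_distance(x0):
    """Forward model: ground distance x0 -> imaging distance x_i.
    Intersect the incident ray z = H - (H/x0)*x (equation (4)) with the
    hyperboloid (3), then project the mirror point through the lens (5)."""
    k = H / x0
    # (c - k*x_m)^2 / b^2 - x_m^2 / a^2 = 1 is a quadratic in x_m
    A = k ** 2 / b ** 2 - 1.0 / a ** 2
    B = -2.0 * c * k / b ** 2
    C = c ** 2 / b ** 2 - 1.0
    xm = (-B - math.sqrt(B ** 2 - 4 * A * C)) / (2 * A)  # physical (positive) root
    zm = H - k * xm
    return f * xm / (zm - H + 2 * c)                     # equation (5)

def ground_distance(xi, lo=0.1, hi=50.0):
    """Invert equation (8) for x0 by bisection; F changes sign on [lo, hi]
    for the geometries of interest here."""
    def F(x0):
        D = x0 * f + xi * H
        t1 = (-2 * c * xi * H / D + c) ** 2 / b ** 2
        t2 = (2 * c * x0 * xi / D) ** 2 / a ** 2
        return t1 - t2 - 1.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if F(lo) * F(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)
```

Running the forward model at a known ground distance and inverting the resulting x_i recovers that distance, which is a useful self-check for any chosen parameter set.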
From the positions of the marks in the image, the imaging distances x_i of the two coloured marks from the omnibearing vision sensor are calculated; the space distances x_0 between the two coloured marks and the omnibearing vision sensor then follow from the sensor's imaging parameters, the actual mark-to-sensor distances being computed from the pixel distances via calibration. As in Fig. 7, a coordinate system xy is defined; from the known spatial positions of the two coloured marks, the circle equations (9) yield the two candidate spatial positions I1 and I-I1 of the omnibearing vision sensor, and the feasible candidate I1 is chosen using the road-width limit and the rule that the marks are placed on the same side of the road:
(x_I1 − x_a)² + (y_I1 − y_a)² = r_1²    (9)
(x_I1 − x_b)² + (y_I1 − y_b)² = r_2²
where (x_a, y_a) is the spatial position of mark L_1; (x_b, y_b) is the spatial position of mark L_2; r_1 is the actual distance between mark L_1 and the omnibearing vision sensor; r_2 is the actual distance between mark L_2 and the omnibearing vision sensor.
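The candidate positions of equation (9) are a standard two-circle intersection. A minimal sketch; the road-width/same-side rule that selects the feasible candidate is left to the caller:

```python
import math

def circle_candidates(xa, ya, xb, yb, r1, r2):
    """Solve equation (9): the two intersection points of the circle of
    radius r1 about mark L1=(xa, ya) and the circle of radius r2 about
    mark L2=(xb, yb) are the two candidate sensor positions."""
    d = math.hypot(xb - xa, yb - ya)
    ex, ey = (xb - xa) / d, (yb - ya) / d       # unit vector from L1 to L2
    t = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)  # foot of the chord along L1->L2
    h = math.sqrt(r1 ** 2 - t ** 2)             # half the chord length
    px, py = xa + t * ex, ya + t * ey
    return (px - h * ey, py + h * ex), (px + h * ey, py - h * ex)
```

With marks at (0, 0) and (4, 0) and distances measured from the point (1, 2), the two candidates are (1, 2) and its mirror (1, −2); the same-side rule would keep the one on the road side.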
As in Fig. 8, combining the azimuth angle θ_1 formed by the marks and the omnibearing vision sensor with the space distance r_1 between coloured mark L_1 and the sensor, the triangle relations give the following formulas (10)–(12):
X = r_1·cos θ_1 + √((r_1·cos θ_1)² − r_1² + L_12²)    (10)
cos θ = (r_1² + L_12² − X²)/(2·L_12·r_1)    (11)
sin θ = √(1 − cos²θ)    (12)
where L_12 is the geometric straight-line distance between coloured marks L_1 and L_2; θ is the angle formed between r_1 and L_12; X is the side opposite the angle θ in the triangle.
Formula (13) gives the coordinates of the second candidate spatial position I2 of the omnibearing vision sensor:
x_I2 = L_12 − r_1·cos θ,  y_I2 = r_1·sin θ    (13)
As in Fig. 9, the coordinates of the candidate spatial position I3 of the omnibearing vision sensor are obtained from formulas (14)–(17) in the same manner as for I2:
X′ = r_2·cos θ_1 + √((r_2·cos θ_1)² − r_2² + L_12²)    (14)
cos α = (r_2² + L_12² − X′²)/(2·L_12·r_2)    (15)
sin α = √(1 − cos²α)    (16)
where L_12 is the geometric straight-line distance between coloured marks L_1 and L_2; α is the angle formed between r_2 and L_12; X′ is the side opposite the angle α in the triangle.
Formula (17) gives the coordinates of the third candidate spatial position I3 of the omnibearing vision sensor:
x_I3 = L_12 − r_2·cos α,  y_I3 = r_2·sin α    (17)
As in Fig. 10, to improve positioning accuracy, the coordinates of the three candidate positions I1–I3 obtained by the different calculation methods are used to reduce the possible positioning error of the candidate points: their mean coordinates, the centroid given by formula (18), constitute the geometric position of the work machine on the field road.
x_I = (x_I1 + x_I2 + x_I3)/3,  y_I = (y_I1 + y_I2 + y_I3)/3    (18)
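The triangle relations (10)–(17) and the averaging (18) can be sketched as follows. One assumption to flag: the patent expresses I2 in the frame of its Fig. 8 (hence x_I2 = L_12 − r_1·cos θ); in this single-frame sketch both candidates are placed with L_1 at the origin and L_2 at (L_12, 0), where the same triangle gives I2 = (r_1·cos θ, r_1·sin θ).

```python
import math

def triangle_candidates(r1, r2, theta1, L12):
    """Candidates I2 and I3 from the triangle relations (10)-(17),
    both expressed (an assumed single-frame convention) with L1 at the
    origin and L2 at (L12, 0)."""
    # (10)/(14): law of cosines solved for the side opposite each mark
    X  = r1 * math.cos(theta1) + math.sqrt((r1 * math.cos(theta1)) ** 2 - r1 ** 2 + L12 ** 2)
    Xp = r2 * math.cos(theta1) + math.sqrt((r2 * math.cos(theta1)) ** 2 - r2 ** 2 + L12 ** 2)
    # (11)/(15): interior angles theta (at L1) and alpha (at L2)
    cos_t = (r1 ** 2 + L12 ** 2 - X ** 2) / (2 * L12 * r1)
    cos_a = (r2 ** 2 + L12 ** 2 - Xp ** 2) / (2 * L12 * r2)
    sin_t = math.sqrt(1 - cos_t ** 2)          # (12)
    sin_a = math.sqrt(1 - cos_a ** 2)          # (16)
    I2 = (r1 * cos_t, r1 * sin_t)              # offset from L1 (frame assumption)
    I3 = (L12 - r2 * cos_a, r2 * sin_a)        # equation (17), offset from L2
    return I2, I3

def fuse(points):
    """Equation (18): centroid of the candidate positions."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)
```

For a sensor truly at (1, 2) between marks 4 units apart, both triangle candidates land on the true position, and fusing them with a circle-intersection candidate leaves it unchanged.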
With the omnibearing vision sensor mounted on the work machine 3, the geometric positioning of work machine 3 on this field road is thus achieved; the guide angle and guiding distance are then calculated from the coordinate transformation between the coordinate system of work machine 3 and the space coordinate system of road 2.
Result output: the guide angle and guiding distance are sent to the computer acting as the steering control device, which executes the travel route of the work machine.

Claims (5)

1. A field road automatic navigation and positioning method for agricultural machinery, characterised by comprising the following steps:
Step 1: preparation: two coloured marks L_1 and L_2 are set on the same side of the road, and an omnibearing vision sensor is provided on the operating vehicle to be positioned; the omnibearing vision sensor captures the current omnidirectional image containing the images of the two coloured marks;
Step 2: measure the imaging actual distances from the projection centre in the omnidirectional image to the images of the two coloured marks, then calculate from the following formula the distances r1 and r2 from the operating vehicle to the two coloured marks in the actual environment; and obtain from the omnidirectional image the azimuth angle θ_1 formed by the two coloured marks and the omnibearing vision sensor;
The formula for calculating the space distance from the imaging actual distance is:
(−2c·x_i·H/(x_0·f + x_i·H) + c)²/b² − (2c·x_0·x_i/(x_0·f + x_i·H))²/a² = 1   (x_0, x_i > 0)
where a, b and c are the structural parameters of the hyperboloid surface of the omnibearing vision sensor, and f is the focal length; H is the height of the current omnidirectional vision system; a, b, c, f and H are fixed values;
x_i is the imaging actual distance, i.e. the distance from the image point in the omnidirectional image of the spatial point P at a given coloured mark to the projection centre of the omnibearing vision sensor;
x_0 is the space distance, i.e. the space distance from the omnibearing vision sensor to that coloured mark;
Step 3: obtain the coordinates of three candidate positions: (x_I1, y_I1), (x_I2, y_I2) and (x_I3, y_I3);
Step 4: obtain the positioning coordinates from the coordinates of the three candidate positions and complete the positioning: the positioning coordinates (x_I, y_I), i.e. the geometric position of the working machine on the field road, are obtained from the following formula:
x_I = (x_I1 + x_I2 + x_I3)/3,  y_I = (y_I1 + y_I2 + y_I3)/3
In step 3, the coordinates of the three candidate positions are obtained as follows:
First, taking one of the coloured marks as the origin and the line through the two marks as one coordinate axis, a plane coordinate system of the actual environment is established; (x_a, y_a) is the position coordinate of mark L_1 in said plane coordinate system, and (x_b, y_b) is the position coordinate of mark L_2 in said plane coordinate system;
I) the coordinates of the first candidate position I1 are obtained as follows:
From the equations
(x_I1 − x_a)² + (y_I1 − y_a)² = r_1²
(x_I1 − x_b)² + (y_I1 − y_b)² = r_2²
the two position-coordinate solutions are obtained; the constraint given by the road-width limit together with the principle that the coloured marks are arranged on the same side of the road then determines which solution is the coordinate (x_I1, y_I1) of the first candidate position I1; r1 and r2 also denote the lines from the operating vehicle to the two coloured marks;
II) the coordinates (x_I2, y_I2) of the second candidate position I2 are:
x_I2 = L_12 − r_1·cos θ,  y_I2 = r_1·sin θ
where L_12 is the geometric straight-line distance between coloured marks L_1 and L_2, L_12 also denoting the line joining the two coloured marks, and θ is the angle formed between r_1 and L_12;
X = r_1·cos θ_1 + √((r_1·cos θ_1)² − r_1² + L_12²)
cos θ = (r_1² + L_12² − X²)/(2·L_12·r_1)
sin θ = √(1 − cos²θ)
where X is the side opposite the angle θ in the triangle;
III) the coordinates (x_I3, y_I3) of the third candidate position I3 are:
x_I3 = L_12 − r_2·cos α,  y_I3 = r_2·sin α
where
X′ = r_2·cos θ_1 + √((r_2·cos θ_1)² − r_2² + L_12²)
cos α = (r_2² + L_12² − X′²)/(2·L_12·r_2)
sin α = √(1 − cos²α)
in which α is the angle formed between r_2 and L_12, and X′ is the side opposite the angle α in the triangle.
2. The field road automatic navigation and positioning method for agricultural machinery according to claim 1, characterised in that the imaging actual distance is obtained as follows:
first the imaging pixel distance from the image of the coloured mark to the projection centre of the omnibearing vision sensor is calculated; this imaging pixel distance is then multiplied by the calibrated pixel distance to give the imaging actual distance.
3. The field road automatic navigation and positioning method for agricultural machinery according to claim 2, characterised in that the calibrated pixel distance is 0.01389 inch/pixel.
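Claims 2 and 3 together define a simple conversion, which can be sketched as follows; the conversion to metres is added for convenience of illustration and is not part of the claims.

```python
CAL_INCH_PER_PIXEL = 0.01389   # calibrated pixel distance of claim 3
METRES_PER_INCH = 0.0254       # standard inch-to-metre factor

def imaging_actual_distance_m(pixel_distance):
    """Claim 2: imaging pixel distance times the calibrated pixel
    distance, converted here to metres."""
    return pixel_distance * CAL_INCH_PER_PIXEL * METRES_PER_INCH
```

For example, an imaging pixel distance of 100 pixels corresponds to 1.389 inches, i.e. roughly 0.035 m on the image plane.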
4. A field road automatic navigation and positioning device for agricultural machinery, characterised in that two coloured marks are set on the same side of the road on which the operating vehicle travels; an omnibearing vision sensor for capturing omnidirectional images is provided on the operating vehicle to be positioned; a microprocessor is also provided on the operating vehicle, the microprocessor comprising 1) an image processing unit for obtaining from the omnidirectional image the distances r1 and r2 from the projection centre to the two coloured marks and the azimuth angle θ1 formed by the two coloured marks and the omnibearing vision sensor, and 2) a positioning-coordinate calculation unit for finally obtaining the positioning coordinates according to the field road automatic navigation and positioning method for agricultural machinery of claim 1.
5. The field road automatic navigation and positioning device for agricultural machinery according to claim 4, characterised in that said omnibearing vision sensor employs a hyperboloid mirror lens.
CN201110007784A 2011-01-14 2011-01-14 Automatic navigation and positioning device and method for field road of agricultural machinery Active CN102183250B (en)

Publications (2)

Publication Number Publication Date
CN102183250A CN102183250A (en) 2011-09-14
CN102183250B true CN102183250B (en) 2012-10-17
