CN103424112A: Vision navigation method for a moving carrier based on laser-plane assistance (Google Patents)

Publication number: CN103424112A (application CN201310323658.1A; granted as CN103424112B)
Original language: Chinese (zh)
Inventors: 曾庆化, 邓孝逸, 王云舒, 赵伟, 刘建业, 熊智, 赖际洲, 李荣冰
Assignee: Nanjing University of Aeronautics and Astronautics
Legal status: Granted; Active
Abstract

The invention discloses a vision navigation method for a moving carrier based on laser-plane assistance. The method comprises the following steps: building a ranging and environment-construction system; determining the position of the rectangular bright spot to be measured in the image; determining the center coordinate of the rectangular bright spot; deriving a geometric ranging model; correcting the ranging model; segmenting the visual image; determining a local vectorized environment space with range information; locating the carrier's own position; and building a global environment space. Unlike the distance-acquisition methods of traditional autonomous navigation for moving carriers, the method introduces a laser plane as an auxiliary means, combines it with one or more vision sensors, and makes full use of the line-shaped bright spot formed by the reflection of the laser plane on the illuminated object as a cooperative ranging target. By effectively exploiting the geometric relationship between the vision sensor and the laser light source, the method meets the requirements for precision, success rate, and real-time performance of distance calculation. It is therefore a reliable navigation method and can be applied in autonomous navigation systems for moving carriers.

Description

A vision navigation method for a moving carrier based on laser-plane assistance
Technical field
The present invention relates to a vision navigation method for a moving carrier, and in particular to a method of vision-based ranging and navigation assisted by a laser plane, belonging to the field of autonomous navigation of moving carriers.
Background technology
When high-precision real-time external environment information is lacking, it is difficult for moving carriers of all kinds to achieve autonomous navigation with an inertial/GNSS system alone. Using vision techniques to perceive and avoid unknown obstacles with high precision can effectively improve the applicability of the carrier. Monocular and multi-camera vision, with their simple structure, low power consumption, and small size, have broad application prospects in indoor and outdoor navigation. Research on laser-assisted ranging of obstacles ahead of a moving carrier, together with estimation of the time to collision, not only serves obstacle avoidance effectively but also provides information for self-localization and reconstruction of the external three-dimensional environment. A fast and accurate vision navigation method is therefore of great significance for the autonomous navigation of moving carriers.
Vision navigation algorithms have been studied extensively at home and abroad. Some papers and documents propose achieving autonomous navigation of the carrier by pure vision, but pure vision navigation is strongly affected by the environment and the information it obtains is limited. Others combine vision with inertial, ultrasonic, and further measurement means; such methods are of limited effect in the absence of a cooperative target. Some researchers, building on vision sensors, use laser radar as the main ranging means to realize the navigation function, supplemented by inertial units, odometers, ultrasonic sensors, or sonar; the results are excellent, but the cost and energy consumption are high, and laser radar in general can only perform two-dimensional ranging, so the environment-construction effect is limited.
Summary of the invention
Technical problem
The technical problems to be solved by the present invention are the spatial ambiguity that arises when three-dimensional world coordinates are derived from two-dimensional image coordinates in conventional autonomous vision navigation algorithms for moving carriers, the insufficient amount of range information, and the real-time performance and ranging accuracy of the algorithms. To simplify the mounting conditions of the laser light source and the camera (no special requirements such as parallelism or perpendicularity), a navigation method based on laser-plane-assisted vision ranging, autonomous localization, and environment construction is provided. The hardware required by the method is simple, small, and low-power, overcoming the restriction that a moving carrier cannot carry large, power-hungry devices. The obtained range information is stored in matrixed form and combined with image processing to obtain a vector map; fused with inertial measurement unit information, this realizes carrier self-localization, avoidance of surrounding obstacles, and acquisition of environment information.
Technical scheme
To solve the above technical problems, the laser-plane-assisted vision navigation method for a moving carrier of the present invention comprises the following steps:
Step 1: use a laser light source and a vision sensor mounted on the moving carrier as the ranging and environment-construction system. The laser source emits a laser plane, which reflects off the measured object ahead of the carrier to form a line-shaped bright spot; the vision sensor then captures a visual image containing this spot;
Step 2: determine a group of bright spots to be measured and their coordinate ranges in the visual image obtained in step 1, and locate the position of the rectangular bright spot in the image by template-matching search, where:
$$R(i,j) = \frac{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)\,T(m,n)}{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2}$$

The template and the sub-image are both rectangles of $M \times N$ pixels, and $(i,j)$ is the reference-point coordinate marking the sub-image position. The term $\sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)^2$ belongs to the template and is constant; $\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2$ is a variable that changes with the reference point; $R(i,j)$ expresses the degree of correlation between template and sub-image. When the similarity between sub-image and template is maximal, the sub-image position with reference point $(I,J)$ is obtained, i.e. the position of the rectangular bright spot to be measured;
Step 3: take the brightness of the rectangular bright spot to be measured as the weight; by the gray-weight method, compute the weighted sum over the pixels of the rectangular spot and average to obtain the spot's center coordinate:

$$x_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,x}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad y_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,y}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad S(x,y)\ge T$$

where $(I,J)$ is the reference-point coordinate of the rectangular spot obtained in step 2, $S(x,y)$ is the weight, and $T$ is the weight threshold; if $S(x,y)$ is less than $T$ it is set to 0. The weighted mean yields the center coordinate of the rectangular spot;
Step 4: according to the relative position of the vision sensor and the laser source on the carrier in step 1 and the pinhole imaging principle, derive the geometric ranging model, which is as follows:

$$S = \frac{d}{m - \cos\beta\,(\tan\theta + \tan\alpha\tan\beta)},\qquad m = \frac{(v_{B0}-v_{B20})\cos\beta}{a_y}$$

where $d$ is the distance between the laser source and the camera's optical center; $\alpha$, $\beta$, $\theta$ are installation angles; $a_y$ is the ratio of the focal length to the physical pixel size; and $v_{B0}$, $v_{B20}$ are the image ordinates of the auxiliary points;
Step 5: using the parameters and known conditions of the geometric ranging model of step 4, analyze the causes affecting ranging accuracy according to the error-synthesis principle, and, with experimental data, compensate the error-producing parameters in the ranging model to obtain the corrected ranging model;
Step 6: each time laser ranging is performed, substitute the spot center coordinate obtained in step 3 into the corrected ranging model to obtain the ranging result; or, before navigation, substitute image coordinates into the ranging model at a fixed step size and store the results in matrixed form, so that during navigation the corresponding distance value is extracted directly once the spot position is obtained, quickly yielding the actual distance of the laser illumination point;
Step 7: using the laser line segments of known, determined distance, segment the visual image by region growing to obtain an image segmentation result carrying range information;
Step 8: vectorize the image segmentation result carrying range information and reconstruct the environment with the vector map. After each laser scan, a local vectorized environment space with range information is obtained;
Step 9: combining the inertial measurement unit on the moving carrier with the local environment space obtained in step 8, fuse the multiple information sources by filtering, locate the carrier's own position, and build the global environment space, realizing real-time autonomous navigation.
In the method of the present invention, the first step adopts laser assistance: a laser plane is introduced as an auxiliary means and combined with the vision sensor, making full use of the line-shaped bright spot the laser plane forms on the illuminated object as a cooperative ranging target. This solves the spatial ambiguity encountered by some other vision localization algorithms when deriving three-dimensional world coordinates from two-dimensional image coordinates, as well as the difficulty of ranging when the amount of information is small; moreover, the ranging method places no special requirements, such as parallelism, on the installation positions of the laser source and the camera. In the sixth step, storing the obtained range information in matrixed form improves the efficiency of obtaining the actual distance of the laser illumination point. The seventh and eighth steps obtain the vectorized environment space by region growing and vectorization, effectively reducing the amount of local environment information; after the information fusion of the ninth step, autonomous navigation of the moving carrier can be realized.
Beneficial effects
The method of the present invention has the following beneficial effects:
(1) Unlike traditional distance-acquisition methods in autonomous navigation of moving carriers, a laser plane is introduced as an auxiliary means and combined with one or more vision sensors, making full use of the line-shaped bright spot formed by the laser plane on the illuminated object as a cooperative ranging target. The method can store the obtained range information in matrixed form, so no calculation is needed when a spot is detected; the distance value corresponding to the image coordinate is extracted directly from the matrix. The computation load is small, the solution time short, the implementation simple, and the obtained range information stable, with no spatial-ambiguity problem.
(2) When installing the equipment there is no special requirement on the mounting positions of the camera and the light source: the emitted laser plane need not be parallel to the camera optical axis nor perpendicular to the carrier's horizontal plane, which relaxes the installation requirements.
(3) The geometric relationship between the vision sensor and the laser source is effectively exploited: the ranging model is derived and the detection efficiency of the spot to be measured is improved; ranging accuracy is effectively improved by error analysis, suppressing divergence of the ranging results; the information of the visual image is effectively used by region growing, and a local vectorized environment space with accurate range information is built; after fusing the local spaces, a global space can be built, in which autonomous navigation is realized.
(4) By detecting the laser spot and combining region growing, obstacles of all kinds ahead of the carrier can be detected effectively, which helps realize obstacle avoidance and environment reconstruction, and is simpler and more effective than pure-vision methods.
Overall, this method meets the requirements of distance-calculation accuracy, success rate, and real-time performance; it is a reliable navigation method applicable to autonomous navigation systems for moving carriers.
Brief description of the drawings
Fig. 1 is the overall flow chart of the method of the invention;
Fig. 2 is the installation diagram of the horizontal field of view for laser-plane ranging;
Fig. 3 is a perspective view of the laser-plane ranging model;
Fig. 4 is a projection diagram of the laser-plane ranging image.
Embodiment
The laser-plane-assisted vision navigation method for a moving carrier of this embodiment comprises the following steps:
Step 1: use a laser light source and a vision sensor mounted on the moving carrier as the ranging and environment-construction system. The laser source emits a laser plane, which reflects off the measured object ahead of the carrier to form a line-shaped bright spot; the vision sensor then captures a visual image containing this spot. Step 2: determine a group of bright spots to be measured and their coordinate ranges in the image obtained in step 1, and locate the position of the rectangular bright spot in the image by template-matching search, where:

$$R(i,j) = \frac{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)\,T(m,n)}{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2}$$

The template and the sub-image are both rectangles of $M \times N$ pixels, and $(i,j)$ is the reference-point coordinate marking the sub-image position. The term $\sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)^2$ belongs to the template and is constant; $\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2$ is a variable that changes with the reference point; $R(i,j)$ expresses the degree of correlation between template and sub-image. When the similarity between sub-image and template is maximal, the sub-image position with reference point $(I,J)$ is obtained, i.e. the position of the rectangular bright spot to be measured;
Step 3: take the brightness of the rectangular bright spot to be measured as the weight; by the gray-weight method, compute the weighted sum over the pixels of the rectangular spot and average to obtain the spot's center coordinate:

$$x_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,x}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad y_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,y}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad S(x,y)\ge T$$

where $(I,J)$ is the reference-point coordinate of the rectangular spot obtained in step 2, $S(x,y)$ is the weight, and $T$ is the weight threshold; if $S(x,y)$ is less than $T$ it is set to 0. The weighted mean yields the center coordinate of the rectangular spot;
Step 4: according to the relative position of the vision sensor and the laser source on the carrier in step 1 and the pinhole imaging principle, derive the geometric ranging model, which is as follows:

$$S = \frac{d}{m - \cos\beta\,(\tan\theta + \tan\alpha\tan\beta)},\qquad m = \frac{(v_{B0}-v_{B20})\cos\beta}{a_y}$$

where $d$ is the distance between the laser source and the camera's optical center; $\alpha$, $\beta$, $\theta$ are installation angles; $a_y$ is the ratio of the focal length to the physical pixel size; and $v_{B0}$, $v_{B20}$ are the image ordinates of the auxiliary points;
Step 5: using the parameters and known conditions of the geometric ranging model of step 4, analyze the causes affecting ranging accuracy according to the error-synthesis principle, and, with experimental data, compensate the error-producing parameters in the ranging model to obtain the corrected ranging model;
Step 6: each time laser ranging is performed, substitute the spot center coordinate obtained in step 3 into the corrected ranging model to obtain the ranging result; or, before navigation, substitute image coordinates into the ranging model at a fixed step size and store the results in matrixed form, so that during navigation the corresponding distance value is extracted directly once the spot position is obtained, quickly yielding the actual distance of the laser illumination point;
Step 7: using the laser line segments of known, determined distance, segment the visual image by region growing to obtain an image segmentation result carrying range information.
Step 8: vectorize the image segmentation result carrying range information and reconstruct the environment with the vector map. After each laser scan, a local vectorized environment space with range information is obtained.
Step 9: combining the inertial measurement unit on the moving carrier with the local environment space obtained in step 8, fuse the multiple information sources by filtering, locate the carrier's own position, and build the global environment space, realizing real-time autonomous navigation.
The main implementation steps of the method of this embodiment are further described below:
(1) Determining the laser spot position
After the laser source emits the light plane onto the objects ahead, reflection forms line-shaped bright spots. Because their brightness is high, if the image is converted to an 8-bit grayscale map the pixel values at the spot centers can reach the maximum of 255. From the color standpoint, supposing red laser is emitted, the red component within the laser spot will be higher than the green and blue components. With such prior knowledge available, template matching is a feasible method. The preparatory work before template matching includes filtering and smoothing the image to remove part of the noise.
It should be noted that when the laser plane is reflected from the objects ahead it forms a line-shaped bright spot, but because the illuminated objects are at different positions, in most cases the spot is not one complete straight line and appears interrupted; after projection onto the image plane it is likewise not one continuous straight line. The horizontal (or vertical) image-coordinate ranges of several rectangular target points on each interrupted segment can then be determined according to actual requirements, after which the precise locations of the rectangular measured points are extracted.
Suppose the template is T, of size M × N, and the searched image is S (whose length and width are both no smaller than the template's). The part of the search image covered by the template is the sub-image S_{i,j}, of the same size as the template, where (i, j) is the coordinate of the sub-image's upper-left corner, taken as the reference point.
The similarity can be checked by differencing and summing the corresponding pixels of template and sub-image:

$$\mathrm{sum}(i,j) = \sum_{m=1}^{M}\sum_{n=1}^{N}\left[S_{i,j}(m,n)-T(m,n)\right]^2 \qquad (1)$$

After expansion:

$$\mathrm{sum}(i,j) = \sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2 - 2\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)\,T(m,n) + \sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)^2 \qquad (2)$$

In formula (2), $\sum_{m=1}^{M}\sum_{n=1}^{N} T(m,n)^2$ belongs to the template and is constant; $\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2$ is a variable representing the energy of the sub-image, changing with the reference point; the cross term represents the degree of correlation between template and sub-image, and its value is maximal when the two match. The related function can therefore be taken as:

$$R(i,j) = \frac{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)\,T(m,n)}{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)^2} \qquad (3)$$

The greater the similarity between template and sub-image, the closer $R(i,j)$ approaches 1. When the similarity between sub-image and template is maximal, the sub-image position with reference point $(I, J)$ is obtained, i.e. the position of the rectangular bright spot to be measured.
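The search described by eqs. (1)–(3) can be sketched in a few lines of NumPy (the function name `locate_spot` and the synthetic test data are ours, not the patent's): the sum of squared differences of eq. (1) is evaluated at every reference point and minimized, and the related function of eq. (3) is reported at the winning position.

```python
import numpy as np

def locate_spot(S, T):
    """Template search over a grayscale image S with template T.

    Evaluates sum(i, j) of eq. (1) (sum of squared differences) at every
    reference point and keeps its minimum; the related function R of
    eq. (3) is then computed at the winning sub-image, where R -> 1 as
    the sub-image approaches the template."""
    M, N = T.shape
    H, W = S.shape
    best_ssd, best_ij = float("inf"), (0, 0)
    for i in range(H - M + 1):
        for j in range(W - N + 1):
            sub = S[i:i + M, j:j + N].astype(float)
            ssd = ((sub - T) ** 2).sum()        # eq. (1)
            if ssd < best_ssd:
                best_ssd, best_ij = ssd, (i, j)
    I, J = best_ij
    sub = S[I:I + M, J:J + N].astype(float)
    R = (sub * T).sum() / (sub ** 2).sum()      # eq. (3)
    return best_ij, R
```

In practice the search would be restricted to the coordinate ranges determined above for each interrupted laser segment, rather than scanning the whole image.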
(2) Locating the spot center
Once the position of the rectangular spot to be measured, formed by the laser on the measured object, is determined, the center coordinate of the rectangular spot in image coordinates can be extracted on that basis. In ordinary image processing, coordinates in the image are determined only to pixel level, which generally meets the accuracy requirement; but when the measured target is far from the carrier, a change of one pixel produces a large range deviation. The sub-pixel problem must therefore be considered when locating the spot center, to improve ranging accuracy.
Considering complexity and real-time performance, the gray-centroid method is adopted. By the imaging characteristics of the spot region, the gray value is largest at the spot center and lower at the border, so taking the pixel value at each coordinate as the weight, the center coordinate is obtained after weighted averaging, achieving sub-pixel solution accuracy:

$$x_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,x}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad y_c = \frac{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)\,y}{\sum_{x=I}^{I+m}\sum_{y=J}^{J+n} S(x,y)},\qquad S(x,y)\ge T \qquad (4)$$

In the formula, the rectangle with vertex $(I, J)$ and side lengths $m$, $n$ should be slightly larger than the matched sub-image (it must be ensured that the number of pixels used to compute the spot center exceeds a certain value in order to obtain a good result), and a threshold test is added in the calculation: if a pixel value is less than the threshold $T$, its weight is set to 0.
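A minimal sketch of the gray-centroid of eq. (4), assuming row coordinate x and column coordinate y (the function name and test image are ours):

```python
import numpy as np

def spot_center(S, I, J, m, n, T):
    """Sub-pixel spot center by the gray-centroid method, eq. (4).

    (I, J) is the rectangle's top-left vertex and (m, n) its side
    lengths, so pixels x = I..I+m, y = J..J+n are summed; pixels whose
    gray value falls below the threshold T get weight 0."""
    W = S[I:I + m + 1, J:J + n + 1].astype(float).copy()
    W[W < T] = 0.0                           # threshold test on the weights
    xs = np.arange(I, I + m + 1)[:, None]    # row (x) coordinates
    ys = np.arange(J, J + n + 1)[None, :]    # column (y) coordinates
    total = W.sum()
    return (W * xs).sum() / total, (W * ys).sum() / total
```

For a symmetric spot the centroid falls exactly on the geometric center; for a real laser spot it lands between pixels, which is the sub-pixel accuracy the text asks for.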
(3) Geometric ranging model based on pinhole imaging
The laser-assisted ranging method ranges by means of the straight-line spot that the laser plane emitted by the light source forms on the measured object. The ranging model is described in detail below.
As shown in Figs. 3 and 4: establish the world coordinate system $X_wY_wZ_w$; $CC_1$ is the camera optical axis and $uov$ is the imaging plane; the optical axis meets the imaging plane at the principal point $C_0$, whose image coordinate is $(u_{C0}, v_{C0})$; the relative horizontal distance between the camera and the measured point is the quantity to be measured.
$L$ is the laser source. Suppose the optical axis $CC_1$ is parallel to the $Y_w$ axis, the imaging plane $uov$ is perpendicular to the horizontal plane $X_wO_wY_w$, and $ou$, $ov$ are parallel to the axes $X_w$ and $Z_w$ respectively. As shown in Fig. 2, the laser source $L$ is located above the camera at distance $d$; let the angle between the emitted laser plane and the optical axis $CC_1$ be $\theta_0$, and the angle between the intersection line of the laser plane with the horizontal plane $X_wO_wY_w$ and the axis $X_w$ must not be 0.
Suppose the laser plane strikes two vertical planes at different positions, so that reflection forms two line-shaped spots. Take a point $A$ on one of them such that the projection $A_0(u_{A0}, v_{A0})$ of $A$ on the image plane has the same abscissa as the principal point $C_0$, and within the laser plane draw $L'$ through $A$ perpendicular to the $Y_w$ axis. Take a point $B$ on the other spot, with projection $B_0(u_{B0}, v_{B0})$ on the image plane; $LB$ meets $L'$ at $B'$, and through $B$ draw $L''$ parallel to $L'$; the angle of $L'$ (and of the parallel $L''$) with the axis $X_w$ is $\alpha$.
Through $L$ and $C$ draw two planes parallel to the horizontal plane $X_wO_wY_w$. The projections of the points $A$, $B$, $B'$ on these two planes are respectively $A_1$ and $C_1$, $B_1$ and $B_2$, $B_1'$ and $B_2'$; the projection of $B_2$ on the image plane is $B_{20}(u_{B20}, v_{B20})$. Let $\angle ALA_1 = \theta$, $\angle BLB_1$ ($= \angle B'LB_1'$) $= \theta'$, and $\angle B_1LA_1$ ($= \angle B_2CC_1$) $= \beta$.
$LB_1$, the relative horizontal distance between the optical center $C$ and the measured point $B$, is the quantity to be measured; the concrete derivation follows.
First solve for the distance to measured point $A$. Here $LC$ (equal to $A_1C_1$), $\alpha$, and $\theta$ are quantities known from the equipment installation. In triangle $\Delta ACC_1$:

$$\frac{A_0C_0}{AC_1} = \frac{CC_0}{CC_1} \qquad (5)$$

where $AC_1$, $A_0C_0$, and $CC_0$ can be obtained directly:

$$AC_1 = AA_1 + A_1C_1 = CC_1\tan\theta + d,\qquad A_0C_0 = (v_{A0}-v_{C0})\,dy,\qquad CC_0 = f \qquad (6)$$

Substituting the three equations of (6) into (5) yields:

$$CC_1 = \frac{d}{m - \tan\theta},\qquad m = \frac{v_{A0}-v_{C0}}{a_y} \qquad (7)$$

By the above steps, all the information about measured point $A$ is available and can serve as known conditions for the more complicated solution for measured point $B$.
In triangle $\Delta B_2'CC_1$ (with $\angle B_2CC_1 = \beta$):

$$\tan\beta = \frac{B_{20}C_0}{CC_0} = \frac{(u_{B0}-u_{C0})\,dx}{f} = \frac{u_{B0}-u_{C0}}{a_x} \qquad (8)$$

In triangle $\Delta BCB_2$:

$$\frac{B_0B_{20}}{BB_2} = \frac{CB_{20}}{CB_2} \qquad (9)$$

Of the four variables in (9), it suffices to obtain the values of the three quantities other than $CB_2$, or expressions that involve nothing else beyond $CB_2$, to obtain $CB_2$.
$B_0B_{20}$ is the physical distance between the two pixel coordinates:

$$B_0B_{20} = (v_{B0}-v_{B20})\,dy \qquad (10)$$

$BB_2$ in (9) can be expressed as:

$$BB_2 = BB_1 + B_1B_2 = d + BB_1 \qquad (11)$$

$BB_1$ in (11) is obtained by the following steps. In triangle $\Delta LBB_1$:

$$\frac{B'B_1'}{BB_1} = \frac{LB_1'}{LB_1} \qquad (12)$$

where:

$$B'B_1' = AA_1 + A_1B_1'\tan\alpha = LA_1\tan\theta + A_1B_1'\tan\alpha,\qquad A_1B_1' = C_1B_2' = CC_1\tan\beta = LA_1\tan\beta \qquad (13)$$

$$LB_1' = \frac{LA_1}{\cos\beta} = \frac{CC_1}{\cos\beta} \qquad (14)$$

Substituting (13) and (14) into (12) gives:

$$BB_1 = LB_1\cos\beta\,(\tan\theta + \tan\alpha\tan\beta) \qquad (15)$$

For $CB_{20}$ in (9):

$$CB_{20} = \frac{CC_0}{\cos\beta} = \frac{f}{\cos\beta} \qquad (16)$$

Finally, substituting (10), (11), (15), and (16) into (9) yields:

$$CB_2 = \frac{d}{m - \cos\beta\,(\tan\theta + \tan\alpha\tan\beta)},\qquad m = \frac{(v_{B0}-v_{B20})\cos\beta}{a_y} \qquad (17)$$
From formula (17) it follows that, knowing only the camera focal length and the physical size of an image pixel, the relative distance between the spot to be measured and the camera can be obtained.
According to formula (17), as $\beta$ tends to 0 the formula becomes:

$$CB_2 = \frac{d}{m - \tan\theta},\qquad m = \frac{v_{B0}-v_{B20}}{a_y} \qquad (18)$$

Formula (18) is very similar to formula (7), differing only in the ordinates of the measured-point image coordinates. From this property the following description is obtained: as $\beta$ approaches 0, the emitted laser plane contracts into a laser beam, and the plane-ranging model degenerates into the ranging model of the single imaging point formed by the beam.
The camera used must be calibrated to obtain the following intrinsics:

$$K = \begin{bmatrix} a_x & 0 & u_{C0} \\ 0 & a_y & v_{C0} \\ 0 & 0 & 1 \end{bmatrix}$$

When the laser-plane ranging method is adopted, the camera and the laser source are fixed on the carrier as shown in Fig. 2, the distance between the optical center and the light source in the installation being $d$.
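Under these symbol conventions, eq. (17) is a one-line computation. The sketch below is ours (angles in radians, $a_y = f/dy$); with $\beta = 0$ it reduces to eq. (18), the single-beam model:

```python
import math

def laser_plane_range(v_B0, v_B20, d, theta, alpha, beta, ay):
    """Horizontal distance CB2 of eq. (17).

    d     : optical-center-to-laser-source offset
    theta : angle of the laser plane to the optical axis
    alpha : angle of the laser trace to the Xw axis
    beta  : horizontal angle of the measured point, from eq. (8)
    ay    : focal length over pixel physical size (f / dy)"""
    m = (v_B0 - v_B20) * math.cos(beta) / ay
    return d / (m - math.cos(beta) * (math.tan(theta)
                                      + math.tan(alpha) * math.tan(beta)))
```

As the model predicts, a larger pixel offset (larger m) corresponds to a nearer illuminated point.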
(4) Error analysis and correction
Because systematic errors exist, the errors of the system must be compensated. Below, taking installation errors as the example to be compensated, the spacing $d$ between camera and laser source and the angles $\theta$ and $\alpha$ are compensated.
4.1 Laser-plane ranging error analysis
Taking partial derivatives of $CB_2$ in formula (17) with respect to $d$, $\theta$, and $\alpha$:

$$\frac{\partial CB_2}{\partial d} = \frac{1}{m-\cos\beta(\tan\theta+\tan\alpha\tan\beta)},\qquad
\frac{\partial CB_2}{\partial \theta} = \frac{d\cos\beta}{\left[m-\cos\beta(\tan\theta+\tan\alpha\tan\beta)\right]^2}\cdot\frac{1}{\cos^2\theta},\qquad
\frac{\partial CB_2}{\partial \alpha} = \frac{d\cos\beta\tan\beta}{\left[m-\cos\beta(\tan\theta+\tan\alpha\tan\beta)\right]^2}\cdot\frac{1}{\cos^2\alpha} \qquad (19)$$

Since $d$, $\theta$, and $\alpha$ are mutually independent, by the systematic-error synthesis principle:

$$\Delta CB_2 = \Delta d\,\frac{\partial CB_2}{\partial d} + \Delta\theta\,\frac{\partial CB_2}{\partial \theta} + \Delta\alpha\,\frac{\partial CB_2}{\partial \alpha} \qquad (20)$$
4.2 Error correction
Taking $\Delta d$, $\Delta\theta$, and $\Delta\alpha$ as unknown parameters in the measurement model, perform nonlinear fitting against ground-truth data, solve for the corrected values of $d$, $\theta$, and $\alpha$, and substitute the obtained corrections into the ranging model to get the corrected ranging model.
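One way to carry out such a fit, assuming calibration measurements with known true ranges: linearize with eq. (20), stack one row of the eq. (19) partial derivatives per measurement, and solve for $(\Delta d, \Delta\theta, \Delta\alpha)$ by linear least squares. This NumPy formulation is our sketch, not the patent's exact procedure:

```python
import math
import numpy as np

def correct_install_params(ms, true_ranges, d, theta, alpha, beta=0.0):
    """Estimate corrections to (d, theta, alpha) from calibration data.

    For each model variable m with known true range, the residual
    true - CB2(m) is modeled by eq. (20) using the partial derivatives
    of eq. (19); (dd, dtheta, dalpha) then follow by least squares."""
    t = math.tan(theta) + math.tan(alpha) * math.tan(beta)
    rows, resid = [], []
    for m, truth in zip(ms, true_ranges):
        den = m - math.cos(beta) * t
        resid.append(truth - d / den)
        rows.append([
            1.0 / den,                                           # dCB2/dd
            math.cos(beta) * d / den**2 / math.cos(theta)**2,    # dCB2/dtheta
            math.cos(beta) * math.tan(beta) * d / den**2
            / math.cos(alpha)**2,                                # dCB2/dalpha
        ])
    delta, *_ = np.linalg.lstsq(np.array(rows), np.array(resid), rcond=None)
    return d + delta[0], theta + delta[1], alpha + delta[2]
```

At $\beta = 0$ the $\alpha$ column vanishes, consistent with eq. (18) where $\alpha$ drops out, and the minimum-norm least-squares solution leaves $\alpha$ unchanged.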
(5) Matrixed storage
From the above steps it can be seen that, once the parameters of the ranging model are determined, the result of substituting each image coordinate into the ranging model no longer changes and corresponds one-to-one with the image coordinate itself. To obtain range information quickly, after the accurate ranging model is obtained, the distance values of all image coordinates can be computed once (the image coordinates being substituted into the ranging model at a fixed step size) and stored in matrixed form. Subsequently, once the spot coordinate to be measured is detected, the distance value can be obtained directly.
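The matrixed storage amounts to a precomputed lookup table. A sketch (ours) that evaluates eq. (17) over a column of candidate ordinates:

```python
import math
import numpy as np

def build_range_matrix(v_coords, v_B20, d, theta, alpha, beta, ay):
    """Evaluate eq. (17) once for every candidate ordinate v and store
    the results, so that a detected spot coordinate indexes straight
    into precomputed distances (the matrixed storage of step 6)."""
    v = np.asarray(v_coords, dtype=float)
    m = (v - v_B20) * math.cos(beta) / ay
    return d / (m - math.cos(beta) * (math.tan(theta)
                                      + math.tan(alpha) * math.tan(beta)))
```

At navigation time a lookup such as `ranges[v - v_min]` replaces a full model evaluation, which is exactly the efficiency gain the text claims for step 6.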
(6) Environment-map vectorization
Using the laser line segments of known, determined distance, the image obtained by the vision sensor is segmented by region growing, giving an image segmentation result with range information. The boundary segments in the range-bearing image are vectorized, and the reconstruction of the indoor environment is realized with the vector map, yielding a local vectorized environment space with range information.
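The segmentation relies on region growing. A minimal 4-connected variant (ours, using a fixed gray-level tolerance from a seed pixel rather than the patent's laser-distance seeding) looks like:

```python
from collections import deque
import numpy as np

def region_grow(img, seed, tol):
    """4-connected region growing: starting from the seed pixel, grow
    while the absolute gray-level difference to the seed value stays
    within tol. Returns a boolean mask of the segmented region."""
    H, W = img.shape
    mask = np.zeros((H, W), dtype=bool)
    base = float(img[seed])
    q = deque([seed])
    mask[seed] = True
    while q:
        x, y = q.popleft()
        for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
            if (0 <= nx < H and 0 <= ny < W and not mask[nx, ny]
                    and abs(float(img[nx, ny]) - base) <= tol):
                mask[nx, ny] = True
                q.append((nx, ny))
    return mask
```

In the patent's pipeline, seeds would lie on the laser line segments of known distance, so each grown region inherits that segment's range value.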
(7) Information fusion
After the local vectorized environment space has been obtained, the motion information output by the inertial measurement unit on the moving carrier is fused by filtering to locate the carrier's own position and obtain accurate carrier pose-change information; by fusing the local spaces accordingly, the global environment space can be built and real-time autonomous navigation realized.
This completes the concrete flow of the laser-plane-assisted vision navigation method for a moving carrier.

Claims (2)

1. A laser-plane-assisted vision navigation method for a moving carrier, characterized by comprising the following steps:
Step 1: use a laser source and a vision sensor mounted on the moving carrier as the ranging and environment-construction system; the laser source emits a laser plane, which reflects off the measured object in front of the carrier to form a line-shaped bright spot, and the vision sensor then captures a visual image containing this line-shaped bright spot;
Step 2: determine a group of bright spots to be measured and their coordinate ranges in the visual image obtained in step 1, and search the image by template matching to determine the position of the rectangular bright spot to be measured, where:
R(i,j) = \frac{\sum_{m=1}^{M}\sum_{n=1}^{N} S_{i,j}(m,n)\, T(m,n)}{\sum_{m=1}^{M}\sum_{n=1}^{N} \bigl(S_{i,j}(m,n)\bigr)^{2}}
the template and each subimage are rectangles of M × N pixels; (i, j) is the reference-point coordinate, indicating the position of the subimage; T(m, n) is the template, which is constant; S_{i,j}(m, n) is the subimage, which changes with the reference point; R(i, j) denotes the degree of correlation between template and subimage. When the similarity between a subimage and the template is maximal, the corresponding reference point (I, J) gives the subimage position, i.e. the position of the rectangular bright spot to be measured;
Step 3: take the brightness of the rectangular bright spot to be measured as the weight, form the weighted sum of the pixels of the rectangular spot by the grey-value weighting method, and obtain the centre coordinate of the rectangular spot after averaging;
Step 4: from the relative position of the vision sensor and the laser source on the carrier in step 1, and according to the pinhole-imaging principle, derive the geometric ranging model, which is as follows:
S = \frac{d}{m - \cos\beta\,(\tan\theta + \tan\alpha\tan\beta)}, \qquad m = \frac{(v_{B0} - v_{B20})\cos\beta}{a_y}
where d is the distance between the laser source and the optical centre of the camera; α, β and θ are the mounting angles; a_y is the ratio of the focal length to the physical pixel size; and v_{B0}, v_{B20} are the image coordinates of the auxiliary points;
Step 5: using the parameters and known conditions of the geometric ranging model of step 4, analyse the causes affecting ranging accuracy according to the error-synthesis principle, compensate the error-producing parameters of the ranging model with experimental data, and obtain the corrected ranging model;
Step 6: for each laser ranging operation, substitute the centre coordinate of the bright spot to be measured obtained in step 3 into the corrected ranging model to obtain the ranging result;
Step 7: based on the laser line segments of known, determined distance, segment the visual image by region growing, thereby obtaining an image segmentation result with range information;
Step 8: vectorize the image segmentation result carrying range information and reconstruct the environment from the vectorized map; after each laser scan a local vectorized environment space with range information is obtained;
Step 9: combine the inertial measurement unit on the moving carrier with the local vectorized environment space obtained in step 8, perform multi-source information fusion by filtering, locate the carrier's own position, build the global environment space, and realize a real-time autonomous navigation function.
2. The laser-plane-assisted vision navigation method for a moving carrier as claimed in claim 1, characterized in that, in step 6, before navigation the image coordinates are substituted into the ranging model at a fixed step size and the results are stored as a matrix; during navigation, once the spot position has been obtained, the corresponding distance value is extracted directly, so that the actual distance at the point illuminated by the laser beam is obtained quickly as the ranging result.
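The grey-value-weighted centre of claim 1, step 3 and the geometric ranging model of step 4 can be sketched as follows. The fraction structure assumed for m and S is one plausible reading of the flattened formula in step 4, and all numeric parameters are hypothetical:

```python
import math
import numpy as np

def gray_weighted_centre(patch):
    """Grey-value-weighted centre of a bright spot (claim 1, step 3):
    each pixel coordinate is weighted by its brightness, then averaged."""
    h, w = patch.shape
    vs, us = np.mgrid[0:h, 0:w]          # row (v) and column (u) index grids
    total = patch.sum()
    return (us * patch).sum() / total, (vs * patch).sum() / total

def range_from_spot(v_b0, v_b20, d, alpha, beta, theta, ay):
    """Geometric ranging model of claim 1, step 4, under one plausible
    reading of the flattened formula:
        m = (v_b0 - v_b20) * cos(beta) / ay
        S = d / (m - cos(beta) * (tan(theta) + tan(alpha) * tan(beta)))"""
    m = (v_b0 - v_b20) * math.cos(beta) / ay
    return d / (m - math.cos(beta)
                * (math.tan(theta) + math.tan(alpha) * math.tan(beta)))

spot = np.array([[0.0, 10.0, 0.0],
                 [0.0, 30.0, 0.0]])
u_c, v_c = gray_weighted_centre(spot)
print(u_c, v_c)  # 1.0 0.75 -- centre pulled toward the brighter lower pixel
# With all mounting angles zero the model reduces to S = d * ay / (v_b0 - v_b20):
print(range_from_spot(400, 300, d=0.1, alpha=0.0, beta=0.0, theta=0.0, ay=800))  # 0.8
```

The spot centre coordinate produced this way is exactly what is substituted into the (corrected) ranging model in step 6.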
CN201310323658.1A 2013-07-29 2013-07-29 A kind of motion carrier vision navigation method auxiliary based on laser plane Active CN103424112B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310323658.1A CN103424112B (en) 2013-07-29 2013-07-29 A kind of motion carrier vision navigation method auxiliary based on laser plane

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310323658.1A CN103424112B (en) 2013-07-29 2013-07-29 A kind of motion carrier vision navigation method auxiliary based on laser plane

Publications (2)

Publication Number Publication Date
CN103424112A true CN103424112A (en) 2013-12-04
CN103424112B CN103424112B (en) 2016-06-01

Family

ID=49649214

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310323658.1A Active CN103424112B (en) 2013-07-29 2013-07-29 A kind of motion carrier vision navigation method auxiliary based on laser plane

Country Status (1)

Country Link
CN (1) CN103424112B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2709545A1 (en) * 1993-04-19 1995-03-10 Giat Ind Sa Interactive navigational aid method, in particular for road navigation, and a device for its implementation
JP2000171199A (en) * 1998-12-03 2000-06-23 Mitsubishi Electric Corp Passive impact splash standardizing apparatus
JP2002031528A (en) * 2000-07-14 2002-01-31 Asia Air Survey Co Ltd Space information generating device for mobile mapping
CN101393029A (en) * 2007-09-17 2009-03-25 王保合 Automobile navigation apparatus and navigation system using the same
CN102322857A (en) * 2011-05-24 2012-01-18 武汉理工大学 Position and posture measuring system and method for mechanical equipment
CN103140377A (en) * 2010-08-12 2013-06-05 法雷奥开关和传感器有限责任公司 Method for displaying images on a display device and driver assistance system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZENG Qinghua et al., "Hardware-enhanced angular-rate-input coning algorithm for laser gyro inertial navigation systems", Journal of Southeast University (Natural Science Edition) *
WANG Xianmin et al., "Development and research analysis of visual navigation technology", Information and Control *
HUANG Xianlin et al., "A survey of autonomous visual navigation methods", Journal of Jilin University (Information Science Edition) *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10703299B2 (en) 2010-04-19 2020-07-07 SMR Patents S.à.r.l. Rear view mirror simulation
US10800329B2 (en) 2010-04-19 2020-10-13 SMR Patents S.à.r.l. Rear view mirror simulation
EP3413287A1 (en) 2010-04-19 2018-12-12 SMR Patents S.à.r.l. Method for indicating to a driver of a vehicle the presence of an object at least temporarily moving relative to the vehicle
US10562452B2 (en) 2010-04-19 2020-02-18 SMR Patents S.à.r.l. Rear-view mirror simulation
CN104467960A (en) * 2014-12-25 2015-03-25 湖北工程学院 Beacon light spot stable positioning system in wireless optical communication and implementation method thereof
CN104467960B (en) * 2014-12-25 2017-01-18 湖北工程学院 Beacon light spot stable positioning system in wireless optical communication and implementation method thereof
CN105371840B (en) * 2015-10-30 2019-03-22 北京自动化控制设备研究所 A kind of inertia/visual odometry/laser radar Combinated navigation method
CN105371840A (en) * 2015-10-30 2016-03-02 北京自动化控制设备研究所 Method for combined navigation of inertia/visual odometer/laser radar
CN105894505A (en) * 2016-03-30 2016-08-24 南京邮电大学 Quick pedestrian positioning method based on multi-camera geometrical constraint
CN106056107B (en) * 2016-07-28 2021-11-16 福建农林大学 Pile avoidance control method based on binocular vision
CN106056107A (en) * 2016-07-28 2016-10-26 福建农林大学 Pile avoiding control method based on binocular vision
CN106842219B (en) * 2017-01-18 2019-10-29 北京商询科技有限公司 A kind of space ranging method and system for mixed reality equipment
CN106842219A (en) * 2017-01-18 2017-06-13 北京商询科技有限公司 A kind of space ranging method and system for mixed reality equipment
CN108873897A (en) * 2018-06-26 2018-11-23 广州数娱信息科技有限公司 Short distance traffic equipment and system
CN109443325A (en) * 2018-09-25 2019-03-08 上海市保安服务总公司 Utilize the space positioning system of floor-mounted camera
US11896175B2 (en) 2019-07-02 2024-02-13 Huawei Technologies Co., Ltd. Method and apparatus for updating working map of mobile robot, and storage medium
CN112179361A (en) * 2019-07-02 2021-01-05 华为技术有限公司 Method, device and storage medium for updating work map of mobile robot
WO2021000630A1 (en) * 2019-07-02 2021-01-07 华为技术有限公司 Method and apparatus for updating working map of mobile robot, and storage medium
CN110460758A (en) * 2019-08-28 2019-11-15 上海集成电路研发中心有限公司 A kind of imaging device and imaging method based on laser ranging point identification
CN110887486A (en) * 2019-10-18 2020-03-17 南京航空航天大学 Unmanned aerial vehicle visual navigation positioning method based on laser line assistance
CN110887486B (en) * 2019-10-18 2022-05-20 南京航空航天大学 Unmanned aerial vehicle visual navigation positioning method based on laser line assistance
CN111383239A (en) * 2020-02-24 2020-07-07 上海航天控制技术研究所 Mars image false edge elimination and contour accurate fitting method based on iterative search
CN111383239B (en) * 2020-02-24 2022-06-03 上海航天控制技术研究所 Mars image false edge elimination and contour accurate fitting method based on iterative search
CN113376573A (en) * 2021-06-01 2021-09-10 北京航空航天大学 Fusion positioning system based on radio ranging and artificial light source angle measurement
CN113379652B (en) * 2021-08-11 2021-10-22 深圳市先地图像科技有限公司 Linear image correction method and system for laser imaging and related equipment
CN113379652A (en) * 2021-08-11 2021-09-10 深圳市先地图像科技有限公司 Linear image correction method and system for laser imaging and related equipment
CN114046768A (en) * 2021-11-10 2022-02-15 重庆紫光华山智安科技有限公司 Laser ranging method and device, laser ranging equipment and storage medium
CN114046768B (en) * 2021-11-10 2023-09-26 重庆紫光华山智安科技有限公司 Laser ranging method, device, laser ranging equipment and storage medium
CN113820720B (en) * 2021-11-22 2022-01-25 成都星宇融科电力电子股份有限公司 Three-dimensional laser center ranging method, system and terminal based on multiple reference base points
CN113820720A (en) * 2021-11-22 2021-12-21 成都星宇融科电力电子股份有限公司 Three-dimensional laser center ranging method, system and terminal based on multiple reference base points

Also Published As

Publication number Publication date
CN103424112B (en) 2016-06-01

Similar Documents

Publication Publication Date Title
CN103424112A (en) Vision navigating method for movement carrier based on laser plane assistance
JP7398506B2 (en) Methods and systems for generating and using localization reference data
CN110033489B (en) Method, device and equipment for evaluating vehicle positioning accuracy
KR102420476B1 (en) Apparatus and method for estimating location of vehicle and computer recordable medium storing computer program thereof
RU2668459C1 (en) Position evaluation device and method
Lenac et al. Fast planar surface 3D SLAM using LIDAR
JP2020525809A (en) System and method for updating high resolution maps based on binocular images
CN102967305B (en) Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square
CN103487034B (en) Method for measuring distance and height by vehicle-mounted monocular camera based on vertical type target
JP2020500290A (en) Method and system for generating and using location reference data
CN110361027A (en) Robot path planning method based on single line laser radar Yu binocular camera data fusion
CN105160702A (en) Stereoscopic image dense matching method and system based on LiDAR point cloud assistance
CN103499337B (en) Vehicle-mounted monocular camera distance and height measuring device based on vertical target
WO2022127532A1 (en) Method and apparatus for calibrating external parameter of laser radar and imu, and device
CN113673282A (en) Target detection method and device
CN102042835A (en) Autonomous underwater vehicle combined navigation system
CN112346463B (en) Unmanned vehicle path planning method based on speed sampling
KR20210061722A (en) Method, apparatus, computer program and computer readable recording medium for producing high definition map
CN112799096B (en) Map construction method based on low-cost vehicle-mounted two-dimensional laser radar
CN113643345A (en) Multi-view road intelligent identification method based on double-light fusion
CN112455502A (en) Train positioning method and device based on laser radar
CN112068152A (en) Method and system for simultaneous 2D localization and 2D map creation using a 3D scanner
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
Meissner et al. Simulation and calibration of infrastructure based laser scanner networks at intersections
WO2020118623A1 (en) Method and system for generating an environment model for positioning

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant