CN110411462A - A GNSS/inertia/lane-line-constraint/odometer multi-source fusion method - Google Patents
Publication number: CN110411462A (application CN201910659041.4A)
Authority: CN (China)
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
Abstract
The invention discloses a GNSS/inertia/lane-line-constraint/odometer multi-source fusion method. First, a lane-line map database is generated offline with a visual sensor, using a vision + crowdsourcing model. When the carrier solves its navigation fix in real time, the method takes differential GNSS/INS tight coupling as its basis. When GNSS is severely occluded, the relative relation between the carrier and the lane line, detected by the visual sensor installed on the carrier and aided by the lane-line map, is used to dynamically add a lane-line-constraint observation equation that assists positioning and effectively suppresses position divergence in both the lateral and height directions; the wheel odometer supplies a forward-velocity observation of the vehicle, and, together with the satellites still observable above the carrier along the road's direction of travel, significantly reduces the carrier's forward error. The present invention effectively controls position-error drift in all three directions in the dynamic, complex environment typical of cities, and is an effective scheme for realising centimetre-level positioning in complex urban environments.
Description
Technical field
The present invention relates to the field of integrated navigation, and more particularly to a GNSS/inertia/lane-line-constraint/odometer multi-source fusion method.
Background technique
With China's BeiDou satellite navigation system bringing its regional service system into operation, the modernisation of the American GPS and Russian GLONASS, and the construction of the European Union's Galileo, Japan's QZSS and India's IRNSS, GNSS (Global Navigation Satellite System) is flourishing in the direction of multi-frequency, multi-system service. However, because GNSS suffers from three major vulnerabilities (occlusion, interference and spoofing), it cannot provide a reliable, continuous positioning service in complex environments. All-source positioning and plug-and-play navigation have therefore been proposed: integrating multiple different navigation systems and sensors, fusing all available signal sources and all effective information, and, through advanced adaptive navigation algorithms, realising highly available, highly reliable seamless navigation on any platform and in any environment, so as to shed the excessive dependence on GNSS.
High-precision positioning and orientation technology is in huge demand both in the surveying-and-mapping mobile-measurement field and in public location-based services. The traditional RTK/INS post-processed integrated navigation mode can broadly meet the demands of mobile measurement, but in recent years the rise of intelligent industries, represented by autonomous driving, has placed more stringent requirements on positioning and navigation. Technology companies such as Google, Waymo, Tesla, Baidu and JD.com now broadly possess driverless capability in open environments or restricted scenes. Urban environments, however, are complex, changeable and hard to predict; with large uncertainty and incomplete knowledge, autonomous driving faces enormous risk. Whether precise positioning can be achieved in busy urban environments is therefore the key to realising large-scale deployment of autonomous driving. Current autonomous-driving positioning technology completes high-accuracy positioning through the cooperative work of multiple sensors, including a Global Navigation Satellite System (GNSS) receiver, inertial sensors, lidar, visual sensors, an odometer and so on. However, GNSS signals are easily blocked by urban building clusters and elevated roads and tunnels and thereby lose positioning capability; even when satellite signals can be locked, they often exhibit low signal-to-noise ratio, frequent cycle slips, frequent gross errors and wrongly fixed ambiguities, making data processing very difficult. Traditional inertial-navigation/odometer dead-reckoning technology cannot maintain a high-precision autonomous positioning capability for long. In addition, lidar and visual sensors belong to matching-based positioning technology; they are disturbed by dynamic objects such as road vehicles and pedestrians, and are seriously affected by environmental factors such as low illumination, weak texture and motion blur. How to overcome these difficulties and realise the high-accuracy positioning of multi-source heterogeneous data fusion is one of the bottleneck problems urgently to be solved for autonomous driving in urban environments.
In view of the above problems, the present invention gives a GNSS/inertial-navigation/lane-line-constraint/odometer multi-source fusion method. Taking differential GNSS/INS tight coupling as its basis and aided by a lane-line map, it provides lateral and height lane-line-constraint observation equations, and provides forward constraint from the wheel odometer together with the satellites observable above the carrier along the road's direction of travel, thereby suppressing INS error divergence in all three directions and providing reliable, continuous, high-precision position and attitude.
Summary of the invention
The technical problem to be solved by the present invention is to provide, in view of the defects in the prior art, a GNSS/inertia/lane-line-constraint/odometer multi-source fusion method.
The technical solution adopted by the present invention to solve this problem is as follows.
The present invention provides a GNSS/inertia/lane-line-constraint/odometer multi-source fusion method. In this method, a visual sensor acquires images from which a lane-line map database is generated offline. When the carrier solves its navigation fix in real time, the method takes differential GNSS/INS tight coupling as its basis. When GNSS is occluded, the relative relation between the carrier and the lane line, detected by the visual sensor installed on the carrier and aided by the lane-line map, is used to dynamically add a lane-line-constraint observation equation that assists lateral and height positioning; and the satellites observable above the carrier along the road's direction of travel form differenced observations that participate in the tight coupling, constraining the divergence of the carrier's forward error.
Further, the method of the invention specifically includes the following steps:
Step 1: perform perspective rectification on the images acquired in real time to obtain a top view of the lane-line image, separate the lane lines by colour-space conversion, determine the lane-line coordinates by sliding-window histogram statistics, and finally track the lane lines with a Kalman filter under a constant-velocity assumption, obtaining continuously smooth distances from the image centre to the left and right lane lines.
Step 2: use an octree data structure to quickly search for the minimum voxel containing the INS-forecast position, find the closest lane-line node by Euclidean distance, and apply heading/topology consistency checks to the node; after the checks pass, return the best-matching points PL1, PL2, PR1, PR2 according to the curvature.
Step 3: using the INS-forecast position, and the assumption that the camera centre, after correction for the vehicle height and the camera extrinsics, lies on the lane surface, construct the lane-line height-constraint observation equation.
Step 4: lift the four points PL1, PL2, PR1, PR2 to the height of the image centre; from the assumption that the distance from the predicted image centre to the lane line should equal the observed distance, construct the lateral-constraint observation equation between the image centre and the lane line, and convert it to the INS centre using the chain rule.
Step 5: under the assumptions that the carrier skids little when moving in a straight line and turns slowly, construct the odometer forward-constraint observation equation.
Step 6: with the lane-line height-constraint, lateral-constraint and odometer forward-constraint observation equations, realise the multi-source fusion observation update.
Further, the method for lane line tracking is carried out in step 1 of the invention specifically:
Step 1.1, pretreatment: monocular calibration is carried out to monocular camera by Matlab calibration tool case, is obtained in camera
Parameter carries out distortion correction to the lane line image that monocular camera acquires using camera intrinsic parameter;According to monocular camera installation
Position and field range delimit area-of-interest, the interference of other extraneous areas excluded, and utilize homography conversion, by lane line
It is restored under overlooking state;
Step 1.2, lane line drawing: input lane line orthography will by the Threshold segmentation of color and marginal information
Lane line orthography binaryzation;Pixel window is divided to image base, and histogram from left to right is carried out to window gray value
Figure statistics, gray average curve graph is smoothly obtained using Gaussian kernel, extracts crest location as lane line position;
Step 1.3, lane line tracking: lane line is tracked using Kalman filtering, before operating speed model foundation
The relationship of epoch afterwards, state equation are as follows:
Wherein, qxAnd qvIt is the process noise of position x and speed v respectively, k is the moment;
Observational equation are as follows:
yk+1=xk+1+ε
Wherein, yk+1For the lane line position currently extracted;ε indicates to extract the observation noise of lane line position;
After filtering, the position of each window i is obtainedThe position of continuous window is subjected to conic fitting again, thus
To the lane line drawing of all the period of time continuously smooth, the smooth excessiveness to lane line dotted line, zebra stripes is realized;Lane line is extracted in success
Afterwards, the x coordinate by the bottom of acquisition or so lane line under camera coordinates system, the i.e. distance of camera and left and right lane line.
Further, the octree-assisted lane-line fast search in step 2 of the invention is specifically:
Step 2.1, closest lane-line node search: from the INS-forecast position, find the child-node voxel containing the forecast point in the octree spatial storage structure of the lane-line map; after the voxel is found, traverse all lane-line coordinate points in the voxel and find the closest point by Euclidean distance.
Step 2.2, consistency check: obtain the carrier's direction of travel from the heading in the attitude of the INS-forecast position, and compare it with the heading of the lane-line node found; if the difference of the two headings exceeds a threshold, the lane-line node found is wrong.
Check the topological relation between the lane-line node successfully found at the historical epoch and the node found by the current search; if the history node and the current node have no adjacent topological relation, the lane-line node found is wrong.
Otherwise, the lane-line node found is correct.
Further, the lane-line height-constraint observation equation constructed in step 3 of the invention is:
d̂ − dI = H_lane · (δr_e, δv_e, φ, a_b, ε_b)^T + ε_d
where d̂ is the forecast distance, dI is a constant, H_lane is the coefficient matrix of the height constraint, and δr_e, δv_e, φ, a_b, ε_b denote the position error, velocity error, attitude error, accelerometer bias and gyro bias respectively. The coefficient matrix of the height constraint is expressed as:
H_lane = [ (A, B, C)/√(A² + B² + C²)  0₁ₓ₃  0₁ₓ₃  0₁ₓ₃  0₁ₓ₃ ]
where (A, B, C) are the ground-plane coefficients.
Further, the lateral-constraint observation equation between the image centre and the lane line constructed in step 4 of the invention is:
( fL − f̂L, fR − f̂R )^T = ( ∂f/∂δr_e  0  ∂f/∂φ  0  0 ) · δX + ε_f
where fL, fR denote the observed squared distances from the left and right lane lines, f̂L, f̂R denote the squared distances from the forecast position to the lane lines found, and ∂f/∂δr_e, ∂f/∂φ denote the partial derivatives of the observation with respect to the position error and attitude error.
Further, the odometer forward-constraint observation equation constructed in step 5 of the invention is:
δv^b = v_DMI^b − ( C_e^b v̂^e + ω_eb^b × l^b )
where δv^b denotes the difference, in the b frame, between the odometer observation and the INS-forecast velocity, C_e^b denotes the INS-forecast rotation matrix from the ECEF frame to the b frame, v̂^e denotes the INS-forecast velocity in the ECEF frame, l^b denotes the odometer-INS lever arm, and ω_ie denotes the Earth's rotation rate, entering through ω_eb^b = ω_ib^b − C_e^b ω_ie^e.
Further, after the GNSS/lane-line-constraint/odometer fusion in step 6 of the invention, the obtained observation equation stacks all observation blocks:
δz = ( v_P, v_L, f − f̂, d̂ − dI, δv^b )^T = H δX + ( ε_P, ε_L, ε_f, ε_d, ε_vb )^T
where v_P and v_L denote the pseudorange and carrier-phase observation residuals, a denotes the direction-cosine vector of the between-station, between-satellite double-difference observation, l denotes the antenna-INS lever arm in the ECEF frame, δN is the double-difference-ambiguity state parameter, and ε_P, ε_L, ε_f, ε_d, ε_vb denote the observation noises of the pseudorange, carrier phase, lane-line lateral observation, lane-line height observation and odometer observation respectively.
Further, the method for multi-source fusion is carried out in step 6 of the invention specifically:
It is dynamically selected and lane line, odometer constraint whether is added, when GNSS meets observation condition, only use GNSS/INS
It is positioned, when GNSS is blocked or is interrupted, odometer constraint is added, and decide whether to add according to lane line detection event
Enter lane line constraint.
The beneficial effects achieved by the present invention's GNSS/inertia/lane-line-constraint/odometer multi-source fusion method are as follows. 1) It makes full use of the complementary positioning characteristics of each sensor, realising position constraints in all three directions and greatly suppressing the divergence of INS errors in every direction. 2) The constraints between different sensors and different data are independent of one another and can calibrate one another, giving double-constraint assurance and high positioning reliability. 3) According to the differing characteristics of a typical urban environment, it can dynamically select whether each kind of information is added as an observation constraint, making the positioning means flexible. 4) The odometer and visual sensor are low-cost, and at present most vehicle carriers already have them, so the fusion positioning method is convenient to use and adds essentially no cost. 5) Through the constraints in three directions, the GNSS interruption period the positioning method can tolerate extends to about five minutes, during which the positioning accuracy can still be maintained at decimetre to centimetre level.
Detailed description of the invention
The present invention will be further explained below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is the overall flow chart of the GNSS/inertial-navigation/lane-line-constraint/odometer multi-source fusion algorithm of the present embodiment;
Fig. 2 is the GNSS/inertial-navigation tight-coupling structure diagram of the present embodiment;
Fig. 3 is the lane-line extraction flow chart of the present embodiment;
Fig. 4 is the closest-lane-line search flow chart of the present embodiment;
Fig. 5 is the lane-line constraint schematic diagram of the present embodiment;
Fig. 6 is the multi-source fusion positioning flow chart of the present embodiment.
Specific embodiment
In order to make the objectives, technical solutions and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be appreciated that the specific embodiments described here are only used to explain the present invention and are not intended to limit it.
The overall filtering framework of the invention is a GNSS/INS tightly coupled extended Kalman filter, whose structure is shown in Fig. 2. The navigation coordinate system is chosen as the ECEF frame, and the corresponding SINS mechanisation is also carried out in the ECEF frame. In the tight coupling, the raw observations of GNSS and SINS are input jointly into one Kalman filter, which jointly estimates the navigation parameters (position, velocity and attitude), the SINS systematic errors and the GNSS-related parameters (ambiguities); closed-loop correction is used to feed the SINS systematic errors back. The GNSS/SINS tightly coupled state model and observation model are, respectively:
δẊ = F δX + w (1)
δz = H δX + η (2)
In formula (1), δX_SINS = (δr_e, δv_e, φ, a_b, ε_b)^T; δX_SINS and δX_GNSS are the state parameters of SINS and GNSS respectively. Because the double-difference relative positioning mode is used, the GNSS receiver clock error has been eliminated, and only the between-station double-difference ambiguities are retained. F is the coefficient matrix of the state differential equation and w is the process noise. In formula (2), δz is the observation residual, H is the design matrix and η is the observation noise.
According to the above tightly coupled mathematical model, estimation uses a Kalman filter; the general solution process includes two parts, one-step state prediction and the observation update, as follows.
One-step state prediction:
X_{k,k−1} = Φ_{k,k−1} X_{k−1},  P_{k,k−1} = Φ_{k,k−1} P_{k−1} Φ_{k,k−1}^T + Q_{k−1} (3)
Observation update:
K_k = P_{k,k−1} H_k^T ( H_k P_{k,k−1} H_k^T + R_k )^{−1},  X_k = X_{k,k−1} + K_k ( δz_k − H_k X_{k,k−1} ),  P_k = ( I − K_k H_k ) P_{k,k−1} (4)
In formula (3), X_{k,k−1} is the forecast state, Φ_{k,k−1} is the state-transition matrix, Q_{k−1} is the process noise, Δt_k is the time interval between the two epochs, and P_{k,k−1} is the forecast variance. In formula (4), K_k is the gain matrix, R_k is the observation noise, and X_k and P_k are the filtered state and its variance.
The above tightly coupled mathematical model and the general Kalman-filter solution process are the basis of the algorithm of the invention. In combination with the technical route shown in Fig. 1, the key technology of each module in the present invention and its implementation are now described in detail.
One, lane-line extraction based on Kalman filtering
Lane-line extraction methods divide into two broad classes: those based on road features and road models, and those based on deep learning. Deep learning has a good detection effect but relies on a large amount of sample data for model training. At present we still rely mainly on the conventional approach, using information such as colour and structure by which the lane line differs from the road background to separate the lane lines, improved with a sliding-window Kalman-filter method. The process, shown in Fig. 3, consists mainly of three modules: preprocessing, lane-line extraction and lane-line tracking.
1) Preprocessing
Before acquiring data, the monocular camera is calibrated with the Matlab calibration toolbox, yielding the camera intrinsics: the focal lengths fx, fy, the principal-point offsets cx, cy, and the camera's radial and tangential distortion coefficients. During preprocessing, these parameters are used to correct the distortion of the lane-line images the monocular camera acquires. A region of interest is delimited according to the camera's mounting position and field of view, excluding the interference of other irrelevant areas, and the lane lines are restored to a top view using the homography transformation of formula (5),
s·(u, v, 1)^T = H·(X, Y, Z)^T (5)
where H is the corresponding homography matrix, (u, v) are the coordinates in the pixel coordinate system, and (X, Y, Z) are the coordinates of the corresponding point in the top-view state; this facilitates the subsequent lane-line extraction.
2) Lane-line extraction
The lane-line orthophoto is input and binarised by thresholding on colour and edge information. Since lane lines are usually white or yellow, and since they guide the driver and therefore usually have high light-reflection intensity and bright colour, the B channel of the LAB colour space and the luminance channels of the HSL and HSV colour spaces are used together to detect the lane lines and separate them from the background. After binarisation, the image base is divided into pixel windows, a left-to-right histogram of the window grey values is computed and then smoothed with a Gaussian kernel to obtain the grey-mean curve, and the peak locations are extracted as the lane-line positions.
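The histogram-peak step can be illustrated with a small numpy sketch; the synthetic image, the split of the columns at the midline into a left and a right half, and the sigma value are illustrative assumptions:

```python
import numpy as np

def lane_peaks(binary_img, sigma=3):
    """Column histogram of the lower half of a binarised top-view image,
    smoothed with a Gaussian kernel; the two peaks are the lane-line columns."""
    hist = binary_img[binary_img.shape[0] // 2:, :].sum(axis=0).astype(float)
    r = int(3 * sigma)                       # kernel half-width
    t = np.arange(-r, r + 1)
    k = np.exp(-t**2 / (2.0 * sigma**2))
    k /= k.sum()                             # normalised Gaussian kernel
    smooth = np.convolve(hist, k, mode="same")
    mid = len(smooth) // 2
    left = int(np.argmax(smooth[:mid]))          # left lane-line column
    right = mid + int(np.argmax(smooth[mid:]))   # right lane-line column
    return left, right
```

On a synthetic 100-column image with white stripes around columns 20 and 80, the two returned peaks land on those stripes.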
3) Lane-line tracking
Because the lane line varies continuously and slowly during vehicle travel, a Kalman filter can exploit this property to track the lane line dynamically, continuously and smoothly.
A velocity model relates successive epochs; the state equation is as follows, where qx and qv are the process noises of position x and velocity v respectively:
xk+1 = xk + vk·Δt + qx,  vk+1 = vk + qv (6)
The observation equation is as follows, where yk+1 is the currently extracted lane-line position:
yk+1 = xk+1 + ε (7)
After filtering, the position of each window i is obtained; finally, the positions of consecutive windows are fitted with a second-order curve, giving a continuously smooth lane-line extraction over the whole period and realising a smooth transition over dashed lane lines and zebra crossings.
After the lane lines are successfully extracted, the x coordinates of the bottoms of the left and right lane lines in the camera coordinate system, i.e. the distances lL and lR from the camera to the left and right lane lines, are fed back to the GNSS/INS filter.
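The constant-velocity Kalman tracking of a single window position can be sketched as follows; the helper name and the process/observation noise values are illustrative, not from the patent:

```python
import numpy as np

def track_lane(measurements, dt=1.0, qx=0.01, qv=0.01, r=1.0):
    """Track one lane-line window position with a constant-velocity model.
    State is [x, v]; the observation is y = x + noise."""
    Phi = np.array([[1.0, dt], [0.0, 1.0]])   # state-transition matrix
    Q = np.diag([qx, qv])                     # process noise
    H = np.array([[1.0, 0.0]])                # observe position only
    x = np.array([measurements[0], 0.0])
    P = np.eye(2)
    out = []
    for y in measurements:
        x = Phi @ x                           # predict
        P = Phi @ P @ Phi.T + Q
        K = P @ H.T / (H @ P @ H.T + r)       # gain (scalar innovation)
        x = x + (K * (y - H @ x)).ravel()     # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return out
```

Fed a window position that drifts at constant speed, the filtered track converges to the true position within a few epochs, which is what produces the smooth transition over dashed lines.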
Two, octree-assisted lane-line fast search
The lane-line map contains all the lane information, including coordinates, heading, curvature, colour, topological relations and so on. To improve lane-line search efficiency, the lane-line map data are stored in an octree structure: the point-cloud coordinates are bucketed in the coordinate domain and given a spatial index, realising the spatial clustering and partition of the lane-line coordinates. Each node of the octree represents a cubic volume element; a parent node can be divided into eight child nodes, and the subdivision continues until the nodes are indivisible. When storing the lane-line map, the minimum voxel size of the spatial octree is set to 10 m; then for a 40 km stretch of lane, at most 12 subdivisions are needed to know which voxel a lane-line coordinate lies in.
1) Closest lane-line node search
From the INS-forecast position, the point shown in Fig. 5, the child-node voxel containing the forecast point is found in the octree spatial storage structure of the lane-line map. After the voxel is found, all lane-line coordinate points in the voxel are traversed and the closest point is found by Euclidean distance.
2) Consistency check
The carrier's direction of travel is obtained from the heading in the attitude of the INS-forecast position, and a consistency check compares it with the heading of the lane-line node found; if the difference of the two headings exceeds a threshold, the lane-line node found is wrong.
The topological relation between the lane-line node successfully found at the historical epoch and the node found by the current search is checked: if the history node and the current node have no adjacent topological relation, the lane-line node found is wrong.
If the above consistency checks pass, the lane-line node found is correct. Since the lane lines the carrier actually lies between are two continuous curves, left and right, while what has been found so far is only the closest left and right lane-line nodes, the node must first be extrapolated forwards and backwards by n nodes according to its curvature information; n depends on the magnitude of the node curvature and can be enlarged appropriately when the curvature is large. After extrapolation, the two nodes PL1, PL2 of the left lane line and the two nodes PR1, PR2 of the right lane line are obtained, as shown in Fig. 5.
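As an illustration of the voxel lookup and heading consistency check, the following uses a simplified stand-in for the octree: a hash of 10 m voxel cells. The node layout, field names and the 0.5 rad heading threshold are assumptions for the sketch; the topology check is omitted:

```python
import math
from collections import defaultdict

VOXEL = 10.0  # minimum voxel size from the text (10 m)

def voxel_key(p):
    """Integer cell index of a 3-D point at 10 m resolution."""
    return tuple(int(math.floor(c / VOXEL)) for c in p)

class LaneMap:
    """Voxel-hash stand-in for the octree: lane nodes bucketed by 10 m cell."""
    def __init__(self, nodes):
        # nodes: dicts with "pos" = (x, y, z) and "heading" in radians
        self.cells = defaultdict(list)
        for n in nodes:
            self.cells[voxel_key(n["pos"])].append(n)

    def nearest(self, p, heading, max_dheading=0.5):
        """Closest node in the forecast point's voxel whose heading agrees."""
        ok = []
        for n in self.cells.get(voxel_key(p), []):
            d = abs((n["heading"] - heading + math.pi) % (2 * math.pi) - math.pi)
            if d < max_dheading:     # heading consistency check
                ok.append(n)
        if not ok:
            return None              # no consistent node: reject the search
        return min(ok, key=lambda n: sum((a - b) ** 2
                                         for a, b in zip(p, n["pos"])))
```

A real octree replaces the flat hash with recursive subdivision, but the per-voxel traversal and Euclidean-distance minimum are the same.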
Three, lane-line-assisted GNSS/INS height-positioning constraint
The closest left-lane-line nodes PL1, PL2 and the two right-lane-line nodes PR1, PR2 have been obtained. Theoretically, if the INS-forecast position had no error, the camera centre, corrected by the vehicle height and the camera extrinsics, would lie exactly on the lane surface; a height constraint can thus be established.
Assume the distance from the INS centre to the ground is known in advance and equal to dI, and the four coordinates of the left and right lane lines are also known, (PL1, PL2, PR1, PR2). The ground plane can be expressed as AX + BY + CZ + 1 = 0; substituting the four lane-line coordinates yields the ground coefficients (A, B, C). By the point-to-plane distance formula, the distance from the INS centre to the ground plane is:
d̂ = |A·X + B·Y + C·Z + 1| / √(A² + B² + C²) (8)
Because the vehicle is certainly on top of the road surface, the absolute-value sign in the formula can be removed, and the result is finally expressed as an observation equation in the INS position error:
d̂ − dI = H_lane δX + ε_d (9)
where d̂ is the forecast distance, obtained by substituting the INS position prediction into formula (8), and H_lane is the coefficient matrix of the height constraint, expressed as follows:
H_lane = [ (A, B, C)/√(A² + B² + C²)  0₁ₓ₃  0₁ₓ₃  0₁ₓ₃  0₁ₓ₃ ] (10)
dI is a constant; if it is not measured in advance it can also be computed in real time. When the GNSS signal is good, the GNSS/SINS combination gives a centimetre-level positioning result, from which the vertical distance from the INS centre to the ground is computed directly and saved for use in the subsequent height constraint.
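The plane fit AX + BY + CZ + 1 = 0 through the four lane-line points and the point-to-plane distance can be sketched as follows; using a least-squares solve for the four points is an implementation choice of the sketch, not stated in the patent:

```python
import numpy as np

def plane_through(points):
    """Fit A*X + B*Y + C*Z + 1 = 0 through lane-line points:
    solve P @ (A, B, C) = -1 in the least-squares sense."""
    P = np.asarray(points, float)
    return np.linalg.lstsq(P, -np.ones(len(P)), rcond=None)[0]  # (A, B, C)

def dist_to_plane(p, abc):
    """Signed point-to-plane distance; the vehicle is above the road
    surface, so the absolute-value sign is dropped."""
    a, b, c = abc
    return (a * p[0] + b * p[1] + c * p[2] + 1.0) / np.sqrt(a*a + b*b + c*c)
```

For four points on the plane z = -1 the fit returns (A, B, C) = (0, 0, 1), and a point one unit above the origin is two units from that plane.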
Four, lane-line-assisted GNSS/INS lateral-positioning constraint
The lateral-constraint equation needs the camera centre as intermediary: through lane-line extraction and homography transformation we obtained the horizontal distances from the camera centre to the lane lines, denoted dL and dR for the left and right lane lines respectively. Let the camera-centre position be Pc; it has a height above the road surface, so to compute the horizontal distance from the camera to the lane line, the camera centre and the lane line must lie in the same plane.
For unified computation, when constructing the observation equation the road surface is lifted to the camera centre, as follows. 1) Obtain the distance dc from the camera centre to the road surface: the distance from the INS centre to the road surface was already obtained in step three, and the spatial relation between the INS and the camera was calibrated before data acquisition, so dc is known as the vertical offset between the two. 2) Lift the road-surface coordinates (PL1, PL2, PR1, PR2): since the normalised ground-plane coefficient vector (A, B, C)/√(A² + B² + C²) is the unit normal vector of the plane, adding the same vector increment dc·(A, B, C)/√(A² + B² + C²) to each road-surface coordinate yields the new road-surface coordinates (P′L1, P′L2, P′R1, P′R2), which are the lane lines lifted to the camera centre.
At this point, two vectors are constructed; their cross product gives the area of the parallelogram the two vectors enclose, and dividing by the base length gives the height, which is exactly the horizontal distance from the camera centre to the lane line:
dL = ||(Pc − P′L1) × mL|| / ||mL||,  mL = P′L2 − P′L1 (11)
The vector cross product (Pc − P′L1) × mL is expanded componentwise in formula (12). Squaring both sides of formula (12), the squared distance
fL = (dL·||mL||)²
is taken as the new observed quantity, and its partial derivative with respect to Pc is obtained with the MATLAB symbolic algebra toolbox:
∂fL/∂Pc = (M1, M2, M3) (14)
where M1, M2 and M3 are the corresponding expanded expressions. The camera-centre coordinate Pc and the INS-centre coordinate P_SINS are related by:
Pc = P_SINS + C_b^e l^b
where C_b^e is the INS-forecast attitude matrix and l^b is the lever-arm component of the camera centre in the INS coordinate system. From this, the partial derivative of Pc with respect to the INS position and attitude errors follows as formula (17), in which r_e = P_SINS for consistency with the state equation. By the chain rule, formulas (14) and (17) give the partial derivative of the squared distance f with respect to the INS position and attitude errors.
Finally, the observation equation is obtained:
fL − f̂L = (∂fL/∂δr_e) δr_e + (∂fL/∂φ) φ + ε_f
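The cross-product construction of the horizontal distance (parallelogram area over base length) and the squared observed quantity can be sketched as follows; the function names are illustrative:

```python
import numpy as np

def lateral_distance(pc, p1, p2):
    """Horizontal distance from the lifted camera centre pc to the lane line
    through p1, p2: parallelogram area divided by base length."""
    m = np.asarray(p2, float) - np.asarray(p1, float)
    area = np.linalg.norm(np.cross(np.asarray(pc, float) - p1, m))
    return area / np.linalg.norm(m)

def squared_obs(pc, p1, p2):
    """f = (d * ||m||)^2, the squared quantity taken as the new observation."""
    m = np.asarray(p2, float) - np.asarray(p1, float)
    return float(np.linalg.norm(np.cross(np.asarray(pc, float) - p1, m)) ** 2)
```

For a camera centre one unit to the side of a unit-length lane segment, the distance is 1 and the squared observation is also 1, confirming the area-over-base construction.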
Five. Odometer-assisted GNSS/INS forward position constraint
Two assumptions are made before applying the forward constraint of the wheel odometer: 1) the carrier does not skid and moves along the forward axis of the carrier frame, i.e. the lateral velocity is zero; 2) the carrier moves close to the ground without vertical undulation, i.e. the vertical velocity is zero.
The forward velocity obtained from the odometer is:
where the first term is the odometer velocity transformed into the SINS body frame, vDMI is the odometer velocity, which has a component only in the forward direction (the other two axes are zero), and lb is the position vector of the odometer centre in the SINS body frame, i.e. the odometer lever-arm correction.
The observation model of the odometer constraint is derived from the above formula, with the position and velocity states referenced to the SINS centre. Applying the perturbation method yields:
The resulting observation equation is then derived as:
The above is the strictly derived observation model, but it can be simplified to the following model according to the actual situation.
When driving in a straight line, the SINS does not rotate relative to the ground and the angular rate is close to zero; the odometer velocity then equals the forward velocity in the SINS frame. When the vehicle turns, the computed velocity acquires a lateral (X-axis) component, which is inconsistent with the nonholonomic-constraint assumption: when turning, the vehicle rotates about the rear axle as a fixed axis, so every point off the rear-axle plane acquires a rotational velocity. In practice, however, ground vehicles turn slowly and this velocity component is very small; slightly inflating the observation noise allows vSINS = vDMI to be assumed directly. The observation equation can then be simplified to:
where the rotation matrix and velocity are the values obtained from the current-epoch mechanization, and the subscript 2 denotes taking the second row:
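A minimal sketch of the simplified constraint vSINS = vDMI follows; the frame convention (forward axis taken as the body Y axis, matching the "second row" above) is an assumption of this sketch, not a statement from the patent:

```python
import numpy as np

def odometer_innovation(C_b_e, v_e, v_forward):
    """Simplified odometer forward-constraint innovation: rotate the
    mechanized ECEF velocity into the body frame and subtract the odometer
    observation [0, v_forward, 0] (lateral and vertical assumed zero)."""
    v_b = C_b_e @ v_e                        # body-frame SINS velocity
    v_dmi = np.array([0.0, v_forward, 0.0])  # forward axis assumed to be Y
    return v_b - v_dmi

# With an identity attitude and purely forward motion the innovation vanishes.
innov = odometer_innovation(np.eye(3), np.array([0.0, 5.0, 0.0]), 5.0)
```

All three components of the residual are used: the second row expresses the forward constraint, while the zero lateral and vertical rows are the nonholonomic constraints themselves.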
Six. Multi-source fusion observation update
In the above steps, the observation equations of the lane-line elevation constraint, the lateral constraint and the odometer forward constraint have been obtained. The whole fusion positioning system takes differential GNSS/SINS tight coupling as its basis and dynamically adds the lane-line constraint observation equations and the odometer observation equation, as shown in Figure 6.
When the number of GNSS satellites is large and the PDOP value is small, the GNSS positioning geometry meets the requirements; the lane-line and odometer observations can then be dropped, while the state equation remains unchanged.
When the number of GNSS satellites is small and lane-line detection succeeds, the lane-line lateral and elevation constraint equations are added to the observation equations.
In urban scenes with tall buildings on both sides of the road, GNSS signals are severely obstructed, but the view ahead of and behind the vehicle along the road is obstructed less and GNSS satellites remain visible there. In this case the satellite above the carrier, i.e. the satellite with the highest elevation angle, is chosen as the reference satellite; double-difference observations formed from the reference satellite and the satellites visible along the road direction participate in the positioning solution, and the odometer forward constraint is added at the same time. When the vehicle is under an overpass, in a tunnel or in similar scenes, all GNSS satellites become invisible, and only the odometer forward constraint restrains the error.
The complete GNSS/lane-line constraint/odometer fusion observation equation is established as follows:
vP and vL denote the pseudorange and carrier-phase observations, and ε is the noise of each observation.
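The dynamic switching described in this section can be outlined as follows. The thresholds and scene labels are illustrative assumptions of this sketch, not values specified in the patent:

```python
def select_observations(n_sats, pdop, lane_detected,
                        min_sats=5, max_pdop=3.0):
    """Decide which observation blocks enter the fusion filter this epoch.
    The state equation is unchanged in every case; only observations vary."""
    obs = []
    if n_sats >= min_sats and pdop <= max_pdop:
        # Good geometry: differential GNSS/SINS tight coupling alone.
        obs.append("gnss_double_difference")
        return obs
    if n_sats > 0:
        # Urban canyon: keep the visible along-road satellites (highest
        # elevation satellite as reference) and fall through to add aids.
        obs.append("gnss_double_difference")
    # Obstructed or interrupted GNSS: always add the odometer constraint.
    obs.append("odometer_forward")
    if lane_detected:
        obs.append("lane_lateral")
        obs.append("lane_elevation")
    return obs
```

Under an overpass or in a tunnel (no satellites, no lane detection) the selector degrades to the odometer forward constraint alone, matching the last scenario above.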
It should be understood that those of ordinary skill in the art can make modifications or variations in light of the above description, and all such modifications and variations shall fall within the protection scope of the appended claims of the present invention.
Claims (9)
1. A GNSS/inertia/lane-line constraint/odometer multi-source fusion method, characterized in that, in the method, images are acquired by a visual sensor and a lane-line map database is generated offline; when the carrier performs real-time navigation and positioning, differential GNSS/INS tight coupling is taken as the basis; when GNSS is obstructed, the relative relationship between the carrier and the lane line is detected by the visual sensor installed on the carrier and, with the aid of the lane-line map, lane-line constraint observation equations are dynamically added to assist lateral and elevation positioning; and double-difference observations formed from the satellite above the carrier and the satellites observable along the road heading participate in the tight coupling, constraining the divergence of the carrier's forward error.
2. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 1, characterized in that the method specifically comprises the following steps:
Step 1: a top view of the lane lines is obtained by preprocessing the image acquired in real time; colour-space conversion is applied to the image to separate out the lane lines; the lane-line coordinates are determined by sliding-window histogram statistics; finally, the lane lines are tracked by a Kalman filter based on a constant-velocity assumption, yielding continuously smooth distances from the camera centre to the left and right lane lines;
Step 2: the octree data structure is used to quickly search for the minimum voxel containing the SINS forecast position; the nearest lane-line node is found by the Euclidean distance method and checked for heading/topological consistency; after the check passes, the best-matching PL1, PL2, PR1, PR2 are returned according to curvature;
Step 3: the lane-line elevation constraint observation equation is constructed from the SINS forecast position, the lane-plane assumption, the overall vehicle height and the camera extrinsic parameters;
Step 4: the four points PL1, PL2, PR1, PR2 are raised to the camera centre; based on the assumption that the forecast distance from the camera centre to the lane line should equal the observed distance, the lateral constraint observation equation between the camera centre and the lane line is constructed and, by the chain rule, converted to the SINS centre;
Step 5: the observation equation of the odometer forward constraint is constructed under the assumptions that the carrier skids little when moving in a straight line and turns slowly;
Step 6: the multi-source fusion observation update is realized from the lane-line elevation constraint, the lateral constraint and the odometer forward constraint observation equations.
3. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that the lane-line tracking method in step 1 is specifically:
Step 1.1, preprocessing: monocular calibration is performed on the monocular camera with the Matlab calibration toolbox to obtain the camera intrinsic parameters, with which distortion correction is applied to the lane-line images acquired by the monocular camera; a region of interest is delimited according to the mounting position and field of view of the monocular camera to exclude interference from irrelevant areas; and a homography transformation is applied to restore the lane lines to a top-down view;
Step 1.2, lane-line extraction: the lane-line orthoimage is input and binarized by thresholding on colour and edge information; pixel windows are divided along the image bottom, a left-to-right histogram of the window grey values is computed and smoothed with a Gaussian kernel to obtain a grey-mean curve, and the peak locations are extracted as the lane-line positions;
Step 1.3, lane-line tracking: the lane lines are tracked with a Kalman filter; a constant-velocity model establishes the relationship between successive epochs, with the state equation:
where qx and qv are the process noises of the position x and the velocity v respectively, and k is the epoch;
The observation equation is:
yk+1 = xk+1 + ε
where yk+1 is the currently extracted lane-line position and ε denotes the observation noise of the lane-line extraction;
After filtering, the position of each window i is obtained; the positions of successive windows are then fitted with a conic, giving a continuously smooth lane-line extraction over the whole period and a smooth transition across dashed lane lines and zebra crossings. After the lane lines are successfully extracted, the bottom x-coordinates of the left and right lane lines in the camera frame, i.e. the distances from the camera to the left and right lane lines, are obtained.
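The constant-velocity filter of step 1.3 can be sketched as a single predict/update cycle; the noise magnitudes and the unit time step are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cv_track_step(x, P, y, q_x=1.0, q_v=0.1, r=4.0):
    """One predict/update of the constant-velocity lane-line tracker.
    State x = [position, velocity]; y is the extracted lane-line position.
    q_x, q_v, r are illustrative noise magnitudes."""
    F = np.array([[1.0, 1.0],
                  [0.0, 1.0]])               # unit time step assumed
    H = np.array([[1.0, 0.0]])               # only the position is observed
    x = F @ x                                # predict state
    P = F @ P @ F.T + np.diag([q_x, q_v])    # predict covariance
    S = (H @ P @ H.T).item() + r             # innovation variance
    K = (P @ H.T).ravel() / S                # Kalman gain, shape (2,)
    x = x + K * (y - x[0])                   # update with the innovation
    P = P - np.outer(K, (H @ P).ravel())     # covariance update (I - KH)P
    return x, P

# One step from a zero state with large initial uncertainty.
x1, P1 = cv_track_step(np.zeros(2), 10.0 * np.eye(2), 1.0)
```

Each sliding window runs its own such filter; the filtered window positions are then conic-fitted, which is what bridges dashed segments and zebra crossings.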
4. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that the octree-assisted lane-line fast search in step 2 is specifically:
Step 2.1, nearest lane-line node search: according to the SINS forecast position, the child-node voxel corresponding to the forecast point is found in the octree spatial storage structure of the lane lines; after the voxel is found, all lane-line coordinate points in the voxel are traversed and the closest point is found by the Euclidean distance method;
Step 2.2, consistency check: the carrier's direction of travel is obtained from the heading information in the attitude of the SINS forecast position, and the heading of the direction of travel is compared for consistency with that of the searched lane-line node; if the difference between the two headings exceeds a threshold, the searched lane-line node is wrong;
The topological relationship between the lane-line node successfully found at the historical epoch and the node found by the current search is checked; if the historical node and the current node do not have an adjacent topological relationship, the searched lane-line node is wrong;
Otherwise, the searched lane-line node is correct.
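Steps 2.1–2.2 can be sketched with a brute-force stand-in for the octree voxel search; the heading tolerance and the per-node stored heading are assumptions of this sketch:

```python
import numpy as np

def nearest_lane_node(pos, heading, node_xyz, node_heading,
                      heading_tol=np.radians(30.0)):
    """Pick the Euclidean-nearest lane-line node to the forecast position,
    then reject it if its heading disagrees with the carrier heading.
    (The patent narrows candidates with an octree voxel first; here all
    nodes are scanned for brevity.)"""
    i = int(np.argmin(np.linalg.norm(node_xyz - pos, axis=1)))
    # Wrap the heading difference into (-pi, pi] before comparing.
    dh = (node_heading[i] - heading + np.pi) % (2.0 * np.pi) - np.pi
    return i if abs(dh) <= heading_tol else None

nodes = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
hdgs = np.array([0.0, np.pi / 2.0])
idx = nearest_lane_node(np.array([1.0, 0.5, 0.0]), 0.0, nodes, hdgs)
```

The topology test of step 2.2 would be a further check that the returned node is adjacent, in the lane-line graph, to the node matched at the previous epoch.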
5. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 1, characterized in that the lane-line elevation constraint observation equation constructed in step 3 is:
where the first quantity is the forecast distance, dI is a constant, Hlane is the coefficient matrix of the elevation constraint, and δre, δve, φ, ab, εb denote the position error, velocity error, attitude error, accelerometer bias and gyro bias respectively; the coefficient matrix of the elevation constraint is expressed as:
where (A, B, C) are the plane coefficients.
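The elevation constraint amounts to comparing the forecast point-to-plane distance with the calibrated camera-above-road height. A minimal numerical sketch, with illustrative plane and height values:

```python
import numpy as np

def elevation_innovation(p_forecast, plane_abcd, d_calibrated):
    """Innovation of the lane-plane elevation constraint: signed distance
    of the forecast position to the plane Ax + By + Cz + D = 0, minus the
    calibrated camera height d_calibrated above the road."""
    n = np.asarray(plane_abcd[:3], dtype=float)
    d_hat = (n @ p_forecast + plane_abcd[3]) / np.linalg.norm(n)
    return d_hat - d_calibrated

# Camera forecast 1.5 m above a horizontal road plane z = 0.
innov = elevation_innovation(np.array([3.0, 7.0, 1.5]),
                             (0.0, 0.0, 1.0, 0.0), 1.5)
```

With (A, B, C) a unit normal, as the description assumes, the normalisation by ||n|| is a no-op and the plane coefficients enter Hlane directly.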
6. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that the lateral constraint observation equation between the camera centre and the lane line constructed in step 4 is:
where fL and fR denote the observed distances to the left and right lane lines, the forecast quantities denote the distances to the left and right lane lines searched from the forecast position, and the coefficient term denotes the partial derivative of the observation with respect to the position and attitude errors.
7. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that the observation equation of the odometer forward constraint constructed in step 5 is:
where δvb denotes the difference, in the b frame, between the odometer observation and the SINS-forecast velocity, the rotation matrix denotes the transformation from the ECEF frame to the b frame as forecast by the SINS, the velocity term denotes the SINS-forecast velocity in the ECEF frame, lb denotes the lever arm between the odometer and the SINS, and the angular-rate term denotes the Earth's rotation angular velocity.
8. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that, after the GNSS/lane-line constraint/odometer fusion in step 6, the resulting observation equation is:
where vP and vL denote the pseudorange and carrier-phase observations, the direction-cosine term denotes the direction-cosine vector of the between-station, between-satellite double-difference observation, the lever-arm term denotes the lever arm between the antenna and the SINS in the ECEF frame, δN is the double-difference ambiguity state parameter, and εP, εL, εf, εd, εvb denote the observation noises of the pseudorange, carrier phase, lane-line lateral observation, lane-line elevation observation and odometer observation respectively.
9. The GNSS/inertia/lane-line constraint/odometer multi-source fusion method according to claim 2, characterized in that the multi-source fusion method in step 6 is specifically:
whether the lane-line and odometer constraints are added is selected dynamically: when GNSS meets the observation conditions, positioning is performed with GNSS/INS only; when GNSS is obstructed or interrupted, the odometer constraint is added, and whether the lane-line constraint is added is decided according to the lane-line detection event.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910659041.4A CN110411462B (en) | 2019-07-22 | 2019-07-22 | GNSS/inertial navigation/lane line constraint/milemeter multi-source fusion method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110411462A true CN110411462A (en) | 2019-11-05 |
CN110411462B CN110411462B (en) | 2021-05-18 |
Family
ID=68362292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910659041.4A Active CN110411462B (en) | 2019-07-22 | 2019-07-22 | GNSS/inertial navigation/lane line constraint/milemeter multi-source fusion method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110411462B (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10151588B1 (en) * | 2016-09-28 | 2018-12-11 | Near Earth Autonomy, Inc. | Determining position and orientation for aerial vehicle in GNSS-denied situations |
CN107229063A (en) * | 2017-06-26 | 2017-10-03 | 奇瑞汽车股份有限公司 | A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry |
CN107728175A (en) * | 2017-09-26 | 2018-02-23 | 南京航空航天大学 | The automatic driving vehicle navigation and positioning accuracy antidote merged based on GNSS and VO |
CN109405824A (en) * | 2018-09-05 | 2019-03-01 | 武汉契友科技股份有限公司 | A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile |
Non-Patent Citations (1)
Title |
---|
ZENG Qingxi et al.: "Analysis of vision-fused integrated navigation technology for intelligent vehicles" (融合视觉的智能车组合导航技术分析), Journal of Navigation and Positioning (导航定位学报) * |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110866482A (en) * | 2019-11-08 | 2020-03-06 | 广东工业大学 | Dynamic selection method, device and equipment for odometer data source |
CN110866482B (en) * | 2019-11-08 | 2022-09-16 | 广东工业大学 | Dynamic selection method, device and equipment for odometer data source |
CN110888440A (en) * | 2019-11-28 | 2020-03-17 | 山东三木环保工程有限公司 | Rail vehicle door alignment system and method combining GNSS satellite positioning and shielding plate |
CN112945266A (en) * | 2019-12-10 | 2021-06-11 | 炬星科技(深圳)有限公司 | Laser navigation robot and odometer calibration method thereof |
CN111026081A (en) * | 2019-12-10 | 2020-04-17 | 苏州智加科技有限公司 | Error calculation method, device, equipment and storage medium |
CN111274343A (en) * | 2020-01-20 | 2020-06-12 | 北京百度网讯科技有限公司 | Vehicle positioning method and device, electronic equipment and storage medium |
CN111274343B (en) * | 2020-01-20 | 2023-11-24 | 阿波罗智能技术(北京)有限公司 | Vehicle positioning method and device, electronic equipment and storage medium |
CN111272165B (en) * | 2020-02-27 | 2020-10-30 | 清华大学 | Intelligent vehicle positioning method based on characteristic point calibration |
US11002859B1 (en) | 2020-02-27 | 2021-05-11 | Tsinghua University | Intelligent vehicle positioning method based on feature point calibration |
CN111380516A (en) * | 2020-02-27 | 2020-07-07 | 上海交通大学 | Inertial navigation/odometer vehicle combined navigation method and system based on odometer measurement information |
CN111272165A (en) * | 2020-02-27 | 2020-06-12 | 清华大学 | Intelligent vehicle positioning method based on characteristic point calibration |
CN111380516B (en) * | 2020-02-27 | 2022-04-08 | 上海交通大学 | Inertial navigation/odometer vehicle combined navigation method and system based on odometer measurement information |
CN112581795A (en) * | 2020-12-16 | 2021-03-30 | 东南大学 | Video-based real-time early warning method and system for ship bridge and ship-to-ship collision |
CN112581795B (en) * | 2020-12-16 | 2022-04-29 | 东南大学 | Video-based real-time early warning method and system for ship bridge and ship-to-ship collision |
CN112596089A (en) * | 2021-03-02 | 2021-04-02 | 腾讯科技(深圳)有限公司 | Fusion positioning method and device, electronic equipment and storage medium |
CN112596089B (en) * | 2021-03-02 | 2021-05-18 | 腾讯科技(深圳)有限公司 | Fusion positioning method and device, electronic equipment and storage medium |
CN114136315A (en) * | 2021-11-30 | 2022-03-04 | 山东天星北斗信息科技有限公司 | Monocular vision-based auxiliary inertial integrated navigation method and system |
CN114136315B (en) * | 2021-11-30 | 2024-04-16 | 山东天星北斗信息科技有限公司 | Monocular vision-based auxiliary inertial integrated navigation method and system |
CN114646992A (en) * | 2022-03-21 | 2022-06-21 | 腾讯科技(深圳)有限公司 | Positioning method, positioning device, computer equipment, storage medium and computer program product |
TWI814480B (en) * | 2022-07-11 | 2023-09-01 | 新馳科技股份有限公司 | Vehicle positioning system and vehicle positioning method |
CN116129389A (en) * | 2023-03-27 | 2023-05-16 | 浙江零跑科技股份有限公司 | Lane line acquisition method, computer equipment, readable storage medium and motor vehicle |
CN116540286A (en) * | 2023-07-06 | 2023-08-04 | 中国科学院空天信息创新研究院 | Multi-source robust fusion positioning method based on non-integrity constraint |
CN116540286B (en) * | 2023-07-06 | 2023-08-29 | 中国科学院空天信息创新研究院 | Multi-source robust fusion positioning method based on non-integrity constraint |
CN116642501A (en) * | 2023-07-25 | 2023-08-25 | 齐鲁空天信息研究院 | Multi-source fusion method for auxiliary positioning of lane lines with inertia as core |
CN116642501B (en) * | 2023-07-25 | 2023-09-29 | 齐鲁空天信息研究院 | Multi-source fusion method for auxiliary positioning of lane lines with inertia as core |
Also Published As
Publication number | Publication date |
---|---|
CN110411462B (en) | 2021-05-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110411462A (en) | A kind of GNSS/ inertia/lane line constraint/odometer multi-source fusion method | |
Wan et al. | Robust and precise vehicle localization based on multi-sensor fusion in diverse city scenes | |
CN110412635B (en) | GNSS/SINS/visual tight combination method under environment beacon support | |
CN108983781B (en) | Environment detection method in unmanned vehicle target search system | |
CN111551958B (en) | Mining area unmanned high-precision map manufacturing method | |
CN106908775B (en) | A kind of unmanned vehicle real-time location method based on laser reflection intensity | |
CN111457902B (en) | Water area measuring method and system based on laser SLAM positioning | |
Sun et al. | Robust IMU/GPS/VO integration for vehicle navigation in GNSS degraded urban areas | |
JP5162849B2 (en) | Fixed point position recorder | |
CN114526745B (en) | Drawing construction method and system for tightly coupled laser radar and inertial odometer | |
CN110361027A (en) | Robot path planning method based on single line laser radar Yu binocular camera data fusion | |
CN107229063A (en) | A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry | |
Engel et al. | Deeplocalization: Landmark-based self-localization with deep neural networks | |
Pfaff et al. | Towards mapping of cities | |
Mueller et al. | GIS-based topological robot localization through LIDAR crossroad detection | |
CN110412596A (en) | A kind of robot localization method based on image information and laser point cloud | |
CN114019552A (en) | Bayesian multi-sensor error constraint-based location reliability optimization method | |
CN114325634A (en) | Method for extracting passable area in high-robustness field environment based on laser radar | |
Tang et al. | OdoNet: Untethered speed aiding for vehicle navigation without hardware wheeled odometer | |
CN114565674B (en) | Method and device for purely visually positioning urban structured scene of automatic driving vehicle | |
Zhu et al. | Fusing GNSS/INS/vision with a priori feature map for high-precision and continuous navigation | |
Zinoune et al. | Detection of missing roundabouts in maps for driving assistance systems | |
Srinara et al. | Performance analysis of 3D NDT scan matching for autonomous vehicles using INS/GNSS/3D LiDAR-SLAM integration scheme | |
CN113971438A (en) | Multi-sensor fusion positioning and mapping method in desert environment | |
Gao et al. | Vido: A robust and consistent monocular visual-inertial-depth odometry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |