CN110411462B - GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method - Google Patents


Info

Publication number
CN110411462B
CN110411462B (application CN201910659041.4A)
Authority
CN
China
Prior art keywords
lane line
constraint
gnss
inertial navigation
observation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910659041.4A
Other languages
Chinese (zh)
Other versions
CN110411462A (en)
Inventor
张小红
沈燕芬
朱锋
贾俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201910659041.4A priority Critical patent/CN110411462B/en
Publication of CN110411462A publication Critical patent/CN110411462A/en
Application granted granted Critical
Publication of CN110411462B publication Critical patent/CN110411462B/en
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method. During real-time navigation and positioning of the carrier, a differential GNSS/INS tight combination serves as the basis. When GNSS signals are severely obstructed, the relative relation between the carrier and the lane lines is detected by a vision sensor mounted on the carrier, and, with the aid of a lane line map, lane line constraint observation equations are dynamically added to assist positioning, effectively suppressing position divergence in the lateral and elevation directions. A wheel odometer provides a velocity observation in the vehicle's forward direction, and observable satellites above the carrier and along the road direction are exploited, which significantly reduces the carrier's forward error. The method effectively controls position error drift in all three directions in typical complex dynamic urban environments and is an effective scheme for achieving centimeter-level positioning in such environments.

Description

GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method
Technical Field
The invention relates to the field of integrated navigation, in particular to a GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method.
Background
With the establishment and operation of the regional service of China's BeiDou Navigation Satellite System, the modernization of the American GPS and Russian GLONASS, and the construction of the European Union's Galileo, Japan's QZSS and India's IRNSS, GNSS (Global Navigation Satellite System) is developing toward multi-frequency, multi-system service. However, because GNSS suffers from three vulnerabilities, namely obstruction, interference and spoofing, it cannot provide reliable and continuous positioning service in complex environments. All-source positioning and navigation is therefore proposed: integrating various navigation systems and sensors, fusing all available signal sources in a plug-and-play manner, and realizing highly available, highly reliable seamless navigation on any platform and in any environment through advanced adaptive navigation algorithms, thereby eliminating excessive dependence on GNSS.
High-precision positioning and attitude determination technology is in great demand in surveying and mapping, mobile mapping, and mass location-based services. The conventional RTK/INS post-processing integrated navigation mode can basically meet the requirements of mobile mapping, but in recent years the rise of the intelligent industry represented by autonomous driving has placed higher demands on positioning and navigation. Currently, technology companies such as Google, Waymo, Tesla, Baidu and JD basically have driverless capability in open environments or restricted scenes. However, urban environments are complex, changeable and hard to predict; under such large uncertainty and incomplete knowledge, autonomous driving faces enormous risk. Accurate positioning in urban environments is the key to whether autonomous driving can be deployed at scale. Current autonomous-driving positioning technology relies on multiple sensors working cooperatively to achieve high-precision positioning, including a Global Navigation Satellite System (GNSS) receiver, inertial sensors, lidar, vision sensors, and odometers. However, GNSS signals are easily blocked by urban building clusters, overpasses and tunnels, losing positioning capability; even when satellite signals can be tracked, they often exhibit low signal-to-noise ratio, frequent cycle slips, numerous gross errors and ambiguity-fixing failures, making data processing extremely difficult. Traditional inertial navigation/odometer dead reckoning cannot maintain high-precision autonomous positioning for long. In addition, lidar and vision sensors are matching-based positioning technologies that suffer interference from dynamic objects such as vehicles and pedestrians and are seriously affected by environmental factors such as low illumination, weak texture and motion blur. Solving these difficulties and realizing high-precision positioning through multi-source heterogeneous data fusion is one of the bottleneck problems to be urgently solved for autonomous driving in urban environments.
To address these problems, the invention provides a GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method. Based on a differential GNSS/INS tight combination, it provides lane line lateral and elevation constraint observation equations with the aid of a lane line map, and provides a carrier forward constraint through a wheel odometer together with the observable satellites above the carrier and along the road direction, thereby suppressing inertial navigation error divergence in all three directions and providing reliable, continuous, high-precision pose.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method that addresses the defects of the prior art.
The technical scheme adopted by the invention for solving the technical problems is as follows:
the invention provides a GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method. In the method, images are collected by a vision sensor and a lane line map database is generated offline. When the carrier performs real-time navigation and positioning, a differential GNSS/INS tight combination serves as the basis; when GNSS is obstructed, the relative relation between the carrier and the lane lines is detected by the vision sensor mounted on the carrier, and lane line constraint observation equations are dynamically added, with the aid of the lane line map, to assist lateral and elevation positioning; differential observations formed from the observable satellites above the carrier and along the road direction participate in the tight combination, restraining forward error divergence of the carrier.
Further, the method of the present invention specifically comprises the steps of:
step 1, preprocess the images collected in real time to obtain a top view of the lane line image, perform color space conversion to separate the lane lines, determine the lane line coordinates from sliding-window gray-histogram statistics, and finally track the lane lines with a Kalman filter based on a constant-velocity assumption to obtain continuous, smooth distances between the camera center and the left and right lane lines;
step 2, quickly search the minimum voxel containing the inertial navigation predicted position using an octree data structure, find the nearest lane line node by Euclidean distance, perform heading/topology consistency checks on the node, and, after the checks pass, return the best matches P_L1, P_L2, P_R1, P_R2 according to curvature;
step 3, according to the inertial navigation predicted position, construct the lane line elevation constraint observation equation under the assumption that, after correction for the vehicle height and the camera extrinsic parameters, the vehicle lies on the lane surface;
step 4, lift the four points P_L1, P_L2, P_R1, P_R2 to the height of the camera center, construct the lateral constraint observation equation between the camera center and the lane lines under the assumption that the predicted camera-to-lane-line distance equals the observed distance, and convert it to the inertial navigation center using the chain rule;
step 5, construct the odometer forward constraint observation equation under the assumptions that the carrier barely slips during straight-line motion and turns at low speed;
and step 6, realize the multi-source fusion observation update from the lane line elevation and lateral constraint equations and the odometer forward constraint observation equation.
Further, the method for tracking the lane line in step 1 of the present invention specifically comprises:
step 1.1, preprocessing: calibrate the monocular camera with the Matlab calibration toolbox to obtain the camera intrinsic parameters, and use them to correct distortion in the lane line images acquired by the camera; define a region of interest according to the camera's mounting position and field of view to eliminate interference from irrelevant areas, and restore the lane lines to a top view using a homography transformation;
step 1.2, lane line extraction: input the lane line orthoimage and binarize it by color and by threshold segmentation of edge information; divide a pixel window at the bottom of the image, compute a histogram of the window's gray values from left to right, smooth it with a Gaussian kernel to obtain a gray-mean curve, and extract the peak positions as the lane line positions;
step 1.3, lane line tracking: track the lane lines with a Kalman filter, using a constant-velocity model to relate consecutive epochs; the state equation is:

$$\begin{bmatrix} x_{k+1} \\ v_{k+1} \end{bmatrix} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_k \\ v_k \end{bmatrix} + \begin{bmatrix} q_x \\ q_v \end{bmatrix}$$

where q_x and q_v are the process noise of the position x and the velocity v, and k is the epoch index;

the observation equation is:

$$y_{k+1} = x_{k+1} + \varepsilon$$

where y_{k+1} is the currently extracted lane line position and ε is the observation noise of the extracted lane line position;

after filtering, the position $\hat{x}_i$ of each window i is obtained;
perform quadratic curve fitting on the consecutive window positions to obtain continuous, smooth lane line extraction over the whole period, realizing smooth transition across dashed lane lines and zebra crossings; after the lane lines are successfully extracted, the x coordinates of the left and right lane lines at the image bottom in the camera coordinate system, i.e. the distances between the camera and the left and right lane lines, are obtained.
Further, the specific method for octree-assisted lane line fast search in step 2 of the present invention is:
step 2.1, nearest lane line node search: according to the inertial navigation predicted position, search the child-node voxel containing the predicted point in the spatial octree storage structure of the lane lines; after the voxel is found, traverse all lane line coordinate points in it and find the nearest point by Euclidean distance;
step 2.2, consistency judgment: obtain the carrier's direction of travel from the heading in the attitude at the inertial navigation predicted position, and check its consistency with the heading of the searched lane line node; if the difference between the two headings exceeds a threshold, the searched lane line node is wrong;
check the topological relation between the lane line node successfully searched at the previous epoch and the currently searched node; if the two nodes are not topologically adjacent, the searched lane line node is wrong;
otherwise, the searched lane line node is correct.
Further, the lane line elevation constraint observation equation constructed in step 3 of the invention is:

$$\hat{d}_I - d_I = H_{lane} \left( \delta r_e \;\; \delta v_e \;\; \varphi \;\; a_b \;\; \varepsilon_b \right)^T + \varepsilon_d$$

where $\hat{d}_I$ is the predicted distance, $d_I$ is a constant, $H_{lane}$ is the elevation constraint coefficient matrix, and $\delta r_e$, $\delta v_e$, $\varphi$, $a_b$, $\varepsilon_b$ denote the position error, velocity error, attitude error, accelerometer bias and gyroscope bias respectively; the elevation constraint coefficient matrix is expressed as:

$$H_{lane} = \begin{bmatrix} \dfrac{\left( A \;\; B \;\; C \right)}{\sqrt{A^2 + B^2 + C^2}} & 0_{1 \times 12} \end{bmatrix}$$

where A, B and C are the plane coefficients.
Further, in step 4 of the invention, the lateral constraint observation equation between the camera center and the lane lines is constructed as:

$$\begin{bmatrix} f_L - \hat{f}_L \\ f_R - \hat{f}_R \end{bmatrix} = \begin{bmatrix} \partial f_L / \partial \delta r_e & 0_{1\times 3} & \partial f_L / \partial \varphi & 0_{1\times 6} \\ \partial f_R / \partial \delta r_e & 0_{1\times 3} & \partial f_R / \partial \varphi & 0_{1\times 6} \end{bmatrix} \delta X + \varepsilon_f$$

where $f_L$, $f_R$ are the observed squared distances to the left and right lane lines, $\hat{f}_L$, $\hat{f}_R$ are the corresponding distances between the predicted position and the searched left and right lane lines, and the matrix entries are the partial derivatives of the observations with respect to position and attitude.
Further, the odometer forward constraint observation equation constructed in step 5 of the invention is:

$$\delta v^b = \hat{C}^b_e\, \delta v^e - \hat{C}^b_e \left( \hat{v}^e \times \right) \varphi + \left( l^b \times \right) \delta\omega^b_{eb}, \qquad \omega^b_{eb} = \omega^b_{ib} - \hat{C}^b_e\, \omega^e_{ie}$$

where $\delta v^b$ is the difference between the odometer observation and the inertial navigation prediction of the velocity in the b frame, $\hat{C}^b_e$ is the predicted rotation matrix from the ECEF frame to the b frame, $\hat{v}^e$ is the predicted velocity in the ECEF frame, $l^b$ is the lever arm between the odometer and the inertial navigation, and $\omega^e_{ie}$ is the Earth rotation angular velocity.
Further, after the GNSS/lane line constraint/odometer fusion in step 6 of the invention, the obtained observation equation is:

$$\begin{bmatrix} v_P \\ v_L \\ f - \hat{f} \\ \hat{d}_I - d_I \\ \delta z_{odo} \end{bmatrix} = \begin{bmatrix} H_P & 0 \\ H_L & \lambda \\ H_{lat} & 0 \\ H_{lane} & 0 \\ H_{odo} & 0 \end{bmatrix} \begin{bmatrix} \delta X_{SINS} \\ \delta N \end{bmatrix} + \begin{bmatrix} \varepsilon_P \\ \varepsilon_L \\ \varepsilon_f \\ \varepsilon_d \\ \varepsilon_v \end{bmatrix}$$

where $v_P$ and $v_L$ are the pseudorange and carrier-phase observation residuals; $H_P$ and $H_L$ contain the direction cosine vectors of the between-station, between-satellite double-difference observations together with the lever-arm term $\hat{C}^e_b l^b$ between the antenna and the inertial navigation in the ECEF frame; $\delta N$ is the double-difference ambiguity state parameter with carrier wavelength $\lambda$; and $\varepsilon_P$, $\varepsilon_L$, $\varepsilon_f$, $\varepsilon_d$, $\varepsilon_v$ denote the observation noise of the pseudorange, the carrier phase, the lane line lateral observation, the lane line elevation observation and the odometer observation respectively.
Further, the method for performing multi-source fusion in step 6 of the present invention specifically comprises:
dynamically select whether to add the lane line and odometer constraints: when GNSS observation conditions are satisfactory, position with GNSS/INS only; when GNSS is obstructed or interrupted, add the odometer constraint, and decide whether to add the lane line constraint according to the lane line detection status.
The invention has the following beneficial effects. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method disclosed by the invention: 1) makes full use of the complementary positioning characteristics of each sensor, realizing constraints in all three directions and greatly suppressing inertial navigation error divergence in each direction; 2) keeps different sensors and different data independent of one another while allowing mutual checking and correction through their mutual constraints, providing double-constraint assurance and high positioning reliability; 3) can dynamically select whether to add each observation constraint according to the characteristics of typical urban environments, making the positioning means flexible; 4) uses low-cost odometers and vision sensors, which most vehicle platforms already carry, so the fusion positioning method is easy to apply at essentially no added cost; 5) through the constraints in three directions, extends the tolerable GNSS interruption to about five minutes, during which positioning accuracy can still be maintained at the decimeter to centimeter level.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
FIG. 1 is a general flow chart of a GNSS/inertial navigation/lane line constraint/odometry multi-source fusion algorithm of an embodiment of the present invention;
FIG. 2 is a block diagram of an exemplary GNSS/inertial navigation combination of the present invention;
FIG. 3 is a flow chart of lane line extraction according to an embodiment of the present invention;
FIG. 4 is a flowchart of a nearest neighbor lane line search according to an embodiment of the present invention;
FIG. 5 is a lane line constraint diagram of an embodiment of the present invention;
FIG. 6 is a flow chart of multi-source fusion positioning according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The overall filtering framework of the invention is a GNSS/INS tightly combined extended Kalman filter; the tight combination structure is shown in Fig. 2. The navigation coordinate system is chosen as the ECEF frame, and the corresponding SINS mechanization is also carried out in the ECEF frame. In the tight combination, the raw observations of the GNSS and the SINS are jointly input into a Kalman filter; the navigation parameters (position, velocity and attitude), the SINS systematic errors and the GNSS-related parameters (ambiguities) are jointly estimated, and the SINS systematic errors are corrected by feedback using a closed-loop correction technique. The GNSS/SINS tightly combined state model and observation model are respectively:
$$\delta \dot{X} = F\, \delta X + w \tag{1}$$

$$\delta z = H\, \delta X + \eta \tag{2}$$

In equation (1), $\delta X_{SINS} = \left( \delta r_e \;\; \delta v_e \;\; \varphi \;\; a_b \;\; \varepsilon_b \right)^T$ and $\delta X_{GNSS} = \delta N$ are the state parameters of the SINS and the GNSS respectively; because of the double-difference positioning mode, the GNSS receiver clock offset is eliminated, leaving only the between-station differential ambiguities. F is the coefficient matrix of the state differential equation and w is the process noise. In equation (2), δz is the observation residual, H is the design matrix, and η is the observation noise.
Estimation uses Kalman filtering on the above tightly combined mathematical model; the general solution consists of state prediction and observation update, as follows.

One-step state prediction:

$$X_{k,k-1} = \Phi_{k,k-1} X_{k-1}, \qquad P_{k,k-1} = \Phi_{k,k-1} P_{k-1} \Phi_{k,k-1}^T + Q_{k-1}, \qquad \Phi_{k,k-1} \approx I + F\, \Delta t_k \tag{3}$$

Observation update:

$$K_k = P_{k,k-1} H^T \left( H P_{k,k-1} H^T + R_k \right)^{-1}, \qquad X_k = X_{k,k-1} + K_k \left( \delta z_k - H X_{k,k-1} \right), \qquad P_k = \left( I - K_k H \right) P_{k,k-1} \tag{4}$$

In equation (3), X_{k,k-1} is the predicted state, Φ_{k,k-1} is the state transition matrix, Q_{k-1} is the process noise, Δt_k is the time interval between two epochs, and P_{k,k-1} is the predicted variance. In equation (4), K_k is the gain matrix, R_k is the observation noise, and X_k and P_k are the filtered state and its variance.
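To make the prediction and update steps of equations (3) and (4) concrete, a minimal Python sketch follows. It is an illustration only: the first-order discretization Φ ≈ I + FΔt and the Q·Δt scaling are assumptions made for the sketch, not necessarily the patent's exact implementation.

```python
import numpy as np

def predict(x, P, F, Q, dt):
    # One-step state prediction, eq. (3); Phi ~ I + F*dt is a first-order
    # discretization of the continuous error model (1).
    Phi = np.eye(F.shape[0]) + F * dt
    x_pred = Phi @ x
    P_pred = Phi @ P @ Phi.T + Q * dt
    return x_pred, P_pred

def update(x_pred, P_pred, dz, H, R):
    # Observation update, eq. (4): Kalman gain, state correction and
    # covariance update against the observation residual dz.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (dz - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P
```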
The above tightly combined mathematical model and the general Kalman filter solution steps are the basis of the algorithm of the invention. The key technologies of each module and their implementation are described in detail below with reference to the technical route shown in Fig. 1.
1. Lane line extraction based on Kalman filtering
Lane line extraction methods fall into two categories: those based on road features and road models, and those based on deep learning. Deep learning achieves good detection results but relies on large amounts of sample data for model training. Here the traditional approach is used: the lane lines are separated using information that distinguishes them from the road background, such as lane line features, color and structure, improved with a sliding-window Kalman filtering method. The flow is shown in Fig. 3 and comprises three modules: preprocessing, lane line extraction and lane line tracking.
1) Preprocessing
Before data acquisition, the monocular camera is calibrated with the Matlab calibration toolbox to obtain the camera intrinsic parameters, including the focal lengths f_x, f_y, the principal point offsets c_x, c_y, and the radial and tangential distortion coefficients of the camera. During preprocessing, these parameters are used to correct distortion in the lane line images acquired by the camera. A region of interest is defined according to the camera's mounting position and field of view to eliminate interference from irrelevant areas, and the lane lines are restored to the top view using the homography transformation of equation (5), where H is the corresponding homography matrix, u and v are coordinates in the pixel coordinate system, and [X, Y, Z] are the homogeneous coordinates of the corresponding point in the top view, which facilitates the subsequent lane line extraction:

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = H \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \tag{5}$$
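As a small illustration of the homography warp in equation (5), the OpenCV sketch below maps the region of interest to a bird's-eye view; the four source/destination point pairs are hypothetical values, which in practice come from the camera mounting geometry.

```python
import cv2
import numpy as np

# Hypothetical pixel corners of a straight lane section and their top-view
# targets; four point pairs determine the homography H of eq. (5).
src = np.float32([[560, 470], [720, 470], [1100, 720], [200, 720]])
dst = np.float32([[300, 0], [980, 0], [980, 720], [300, 720]])
H = cv2.getPerspectiveTransform(src, dst)

def to_top_view(img):
    # Warp the region of interest into the overlook (bird's-eye) view.
    return cv2.warpPerspective(img, H, (1280, 720))
```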
2) Lane line extraction
The lane line orthoimage is input and binarized by color and by threshold segmentation of edge information. Since lane lines are generally white or yellow, highly reflective and brightly colored to guide drivers, the B channel of the LAB color space, the lightness channel of the HLS color space and the value channel of the HSV color space are used together to detect the lane lines and separate them from the background. After binarization, a pixel window is divided at the bottom of the image, a left-to-right histogram of the window's gray values is computed, a gray-mean curve is obtained by Gaussian kernel smoothing, and the peak positions are extracted as the lane line positions.
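A minimal sketch of this binarization and peak-extraction step is given below; the patent names the channels (LAB B, HLS lightness, HSV value) but not the thresholds or window size, which are assumed values here.

```python
import cv2
import numpy as np

def binarize_lanes(bgr):
    # Combine the LAB B channel (yellow), HLS lightness and HSV value
    # channels (bright/white) to separate lane markings from the background.
    b = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)[:, :, 2]
    l = cv2.cvtColor(bgr, cv2.COLOR_BGR2HLS)[:, :, 1]
    v = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)[:, :, 2]
    return (((b > 150) | (l > 200)) & (v > 180)).astype(np.uint8)

def lane_peaks(binary, win_height=80):
    # Gray-value histogram of the bottom window, smoothed with a Gaussian
    # kernel; the left/right peaks are taken as the lane line positions.
    hist = binary[-win_height:, :].sum(axis=0).astype(float)
    kernel = cv2.getGaussianKernel(31, 8).ravel()
    smooth = np.convolve(hist, kernel, mode="same")
    mid = len(smooth) // 2
    return int(np.argmax(smooth[:mid])), int(mid + np.argmax(smooth[mid:]))
```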
3) Lane line tracking
Because the lane lines change continuously and slowly while the vehicle is driving, Kalman filtering can be used to track them dynamically, continuously and smoothly.
The relation between consecutive epochs is established with a constant-velocity model; the state equation is as follows, where q_x and q_v are the process noise of the position x and the velocity v:

$$\begin{bmatrix} x_{k+1} \\ v_{k+1} \end{bmatrix} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_k \\ v_k \end{bmatrix} + \begin{bmatrix} q_x \\ q_v \end{bmatrix} \tag{6}$$
The observation equation is as follows, where y_{k+1} is the currently extracted lane line position:

$$y_{k+1} = x_{k+1} + \varepsilon \tag{7}$$
After filtering, the position $\hat{x}_i$ of each window i is obtained. Finally, quadratic curve fitting is performed on the consecutive window positions to obtain continuous, smooth lane line extraction over the whole period, realizing smooth transition across dashed lane lines and zebra crossings.
After the lane lines are successfully extracted, the x coordinates of the left and right lane lines at the image bottom in the camera coordinate system, i.e. the distances l_L and l_R between the camera and the left and right lane lines, are obtained and fed back to the GNSS/INS filter.
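As an illustration of the constant-velocity tracking of equations (6) and (7), here is a minimal per-window tracker in Python; the sampling interval and the noise settings are assumed values, and the quadratic fit over the filtered window positions is indicated at the end.

```python
import numpy as np

class LanePointTracker:
    # Constant-velocity Kalman filter for one sliding window's lane
    # x-position, eqs. (6)-(7). dt and the noise levels are illustrative.
    def __init__(self, x0, dt=0.05, qx=1.0, qv=4.0, r=9.0):
        self.x = np.array([x0, 0.0])               # state: [position, velocity]
        self.P = np.diag([25.0, 25.0])
        self.F = np.array([[1.0, dt], [0.0, 1.0]])
        self.Q = np.diag([qx, qv])
        self.R = r

    def step(self, y):
        # Predict with the velocity model, then correct with the extracted
        # lane position y (observation equation (7)).
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        H = np.array([1.0, 0.0])
        k = self.P @ H / (H @ self.P @ H + self.R)
        self.x = self.x + k * (y - H @ self.x)
        self.P = (np.eye(2) - np.outer(k, H)) @ self.P
        return self.x[0]

# Quadratic fit over the filtered window positions gives the smooth line:
# coeffs = np.polyfit(window_rows, filtered_xs, 2)
```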
2. Octree-assisted fast lane line search
The lane line map contains all lane information, including coordinates, heading, curvature, color and topological relations. To improve lane line search efficiency, the lane line map data are stored in an octree structure, organized and managed in the coordinate domain; a spatial index is built for the point cloud coordinates, realizing spatial clustering and partitioning of the lane line coordinates. Each octree node represents a cubic volume element, and a parent node can be divided into eight child nodes, recursively until no further division is possible. When the lane line map is stored, the minimum voxel size of the spatial octree is set to 10 m, so for a 40 km stretch of lane at most 12 searches are needed to locate the voxel containing a given lane line coordinate.
1) Nearest lane node search
According to the inertial navigation predicted position (the predicted point shown in Fig. 5), search the child-node voxel containing the predicted point in the spatial octree storage structure of the lane lines. After the voxel is found, traverse all lane line coordinate points in it and find the point closest to the predicted point by Euclidean distance.
2) Consistency determination
The carrier's direction of travel is obtained from the heading in the attitude at the inertial navigation predicted position and checked for consistency against the heading of the searched lane line node; if the difference between the two headings exceeds a threshold, the searched lane line node is wrong.
The topological relation between the lane line node successfully searched at the previous epoch and the currently searched node is also checked; if the two nodes are not topologically adjacent, the searched lane line node is wrong.
If the above consistency checks pass, the searched lane line node is correct. Because the lane the carrier actually occupies is a continuous curve on both sides, and only the two nearest lane line nodes have been found so far, n nodes are extrapolated forward and backward according to the curvature information of the nodes, where n depends on the curvature and can be enlarged appropriately when the curvature is large. After extrapolation, two nodes P_L1, P_L2 of the left lane line and two nodes P_R1, P_R2 of the right lane line are obtained, as shown in Fig. 5.
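A sketch of the nearest-node query and heading check follows; a flat voxel hash stands in for the octree here, and the node record layout (coordinates, heading, node id) and the 0.5 rad threshold are assumptions. The topology check against the previous epoch's node would follow the same pattern.

```python
import numpy as np

VOXEL = 10.0  # minimum voxel size from the text (10 m)

def voxel_key(p):
    return tuple((np.asarray(p[:2]) // VOXEL).astype(int))

def nearest_lane_node(pred_pos, pred_heading, voxel_map, max_diff=0.5):
    # Fetch candidate nodes in the predicted position's voxel, take the
    # Euclidean-nearest one, then apply the heading consistency check.
    nodes = voxel_map.get(voxel_key(pred_pos), [])
    if not nodes:
        return None
    xyz = np.array([n[0] for n in nodes])
    node = nodes[int(np.argmin(np.linalg.norm(xyz - pred_pos, axis=1)))]
    diff = abs((node[1] - pred_heading + np.pi) % (2 * np.pi) - np.pi)
    return None if diff > max_diff else node  # reject heading-inconsistent hits
```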
3. Lane line-assisted GNSS/INS elevation positioning constraint
The nearest left lane line nodes P_L1, P_L2 and right lane line nodes P_R1, P_R2 have been obtained. In theory, if the inertial navigation predicted position had no error, its center, after correction for the vehicle height and the camera extrinsic parameters, should lie on the lane surface, so an elevation constraint can be established.
Assume the distance from the inertial navigation center to the ground, denoted d_I, is known in advance, and the coordinates of the four points of the left and right lane lines (P_L1, P_L2, P_R1, P_R2) are also known. The ground plane can be expressed as A·X + B·Y + C·Z + 1 = 0, with the plane coefficients (A, B, C) obtained by substituting the four lane line points; the distance from the inertial navigation center to the ground plane follows from the point-to-plane distance formula:

$$d = \frac{\left| A X + B Y + C Z + 1 \right|}{\sqrt{A^2 + B^2 + C^2}} \tag{8}$$

Since the vehicle must be above the road surface, the absolute value can be removed, and the result is finally expressed as an observation equation of the inertial navigation position error:

$$\hat{d}_I - d_I = H_{lane}\, \delta X + \varepsilon_d \tag{9}$$

where $\hat{d}_I$ is the predicted distance, obtained by substituting the inertial navigation predicted position into equation (8), and $H_{lane}$ is the elevation constraint coefficient matrix:

$$H_{lane} = \begin{bmatrix} \dfrac{\left( A \;\; B \;\; C \right)}{\sqrt{A^2 + B^2 + C^2}} & 0_{1 \times 12} \end{bmatrix} \tag{10}$$

The resulting observation equation is:

$$\hat{d}_I - d_I = H_{lane} \left( \delta r_e \;\; \delta v_e \;\; \varphi \;\; a_b \;\; \varepsilon_b \right)^T + \varepsilon_d \tag{11}$$
If d_I is constant but has not been measured in advance, it can be computed in real time: under good GNSS conditions the GNSS/SINS combination achieves a centimeter-level positioning result, and the vertical distance from the inertial navigation center to the ground can be computed directly and stored for subsequent elevation constraints.
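A minimal sketch of the elevation constraint of equations (8)-(11) follows: fit the ground plane through the four matched nodes, then form the misclosure and the position block of H_lane. The function names are illustrative.

```python
import numpy as np

def plane_coeffs(points):
    # Fit A, B, C of A*X + B*Y + C*Z + 1 = 0 through the four lane nodes
    # (least squares; exact when the nodes are coplanar).
    P = np.asarray(points)                    # (4, 3): P_L1, P_L2, P_R1, P_R2
    abc, *_ = np.linalg.lstsq(P, -np.ones(len(P)), rcond=None)
    return abc

def elevation_observation(r_pred, abc, d_I):
    # Predicted point-to-plane distance (8), its misclosure against the
    # known height d_I (9), and the 1x3 position block of H_lane (10).
    norm = np.linalg.norm(abc)
    d_pred = (abc @ r_pred + 1.0) / norm      # signed: vehicle above the road
    return d_pred - d_I, abc / norm
```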
4. Lane line-assisted GNSS/INS lateral positioning constraint
The lateral constraint equation takes the camera center as the intermediary: the horizontal distances from the camera center to the lane lines are obtained through lane line extraction and homography transformation, and the distances to the left and right lane lines are denoted d_L and d_R respectively. Let the position of the camera center be P_c; it sits at some height above the road surface, so to compute the horizontal distance from the camera to the lane lines, the camera center must lie in the same plane as the road surface points.
For unified computation, the road surface is lifted to the camera center before the observation equation is constructed, as follows. 1) Obtain the distance d_c from the camera center to the road surface: the distance from the inertial navigation center to the road surface was obtained above, and the offset between the inertial navigation and the camera is calibrated in advance before data acquisition, so d_c = d_I − Δh, where Δh is the vertical distance between the two. 2) Raise the road surface coordinates (P_L1, P_L2, P_R1, P_R2): since the unit normal vector of the ground plane is n = (A, B, C)/√(A² + B² + C²), adding the same vector increment d_c · n to each road point yields the new road surface coordinates (P′_L1, P′_L2, P′_R1, P′_R2), i.e. the lane lines lifted to the camera center.
Two vectors are now constructed; their cross product gives the area of the parallelogram they enclose, and dividing by the base length yields the height, i.e. the horizontal distance from the camera center to the lane line:

$$d_L = \frac{\left\| \left( P_c - P'_{L1} \right) \times m_L \right\|}{\left\| m_L \right\|}, \qquad m_L = P'_{L2} - P'_{L1} \tag{12}$$
The vector cross product $(P_c - P'_{L1}) \times m_L$ expands as:

$$\left( P_c - P'_{L1} \right) \times m_L = \begin{vmatrix} \mathbf{i} & \mathbf{j} & \mathbf{k} \\ x_c - x'_{L1} & y_c - y'_{L1} & z_c - z'_{L1} \\ m_1 & m_2 & m_3 \end{vmatrix} \tag{13}$$
Squaring both sides of equation (12) gives a new observed quantity; using the Matlab symbolic toolbox, the partial derivative of the squared distance $f_L = (d_L \cdot \| m_L \|)^2$ with respect to P_c is:

$$\frac{\partial f_L}{\partial P_c} = \begin{bmatrix} M_1 & M_2 & M_3 \end{bmatrix} = 2 \left( m_L \times c \right)^T, \qquad c = \left( P_c - P'_{L1} \right) \times m_L \tag{14}$$

where, with $m_L = (m_1, m_2, m_3)$ and $c = (c_1, c_2, c_3)$, the components M_1, M_2 and M_3 are:

$$M_1 = 2\left( m_2 c_3 - m_3 c_2 \right), \quad M_2 = 2\left( m_3 c_1 - m_1 c_3 \right), \quad M_3 = 2\left( m_1 c_2 - m_2 c_1 \right) \tag{15}$$
The camera center coordinate P_c and the inertial navigation center coordinate P_SINS are related by:

$$P_c = P_{SINS} + C^e_b\, l^b \tag{16}$$
where $C^e_b$ is the rotation matrix from the b frame to the ECEF frame and $l^b$ is the lever-arm component of the camera center in the inertial navigation coordinate system. The partial derivatives of P_c with respect to the inertial navigation pose errors are thus:

$$\frac{\partial P_c}{\partial \left( \delta r_e,\ \varphi \right)} = \begin{bmatrix} I_3 & \left( \hat{C}^e_b\, l^b \right) \times \end{bmatrix} \tag{17}$$
where r_e = P_SINS, to remain consistent with the state equation. By the chain rule, equations (14) and (17) give the partial derivative of the squared distance f with respect to the inertial navigation pose errors:

$$\frac{\partial f}{\partial \left( \delta r_e,\ \varphi \right)} = \frac{\partial f}{\partial P_c} \cdot \frac{\partial P_c}{\partial \left( \delta r_e,\ \varphi \right)} \tag{18}$$
Finally, the observation equation is obtained as:

$$\begin{bmatrix} f_L - \hat{f}_L \\ f_R - \hat{f}_R \end{bmatrix} = \begin{bmatrix} \partial f_L / \partial \delta r_e & 0_{1\times 3} & \partial f_L / \partial \varphi & 0_{1\times 6} \\ \partial f_R / \partial \delta r_e & 0_{1\times 3} & \partial f_R / \partial \varphi & 0_{1\times 6} \end{bmatrix} \delta X + \varepsilon_f \tag{19}$$
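The squared-distance observation and its gradient, equations (12)-(14), reduce to two cross products. The sketch below uses the identity df/dP_c = 2(m_L × c) with c = (P_c − P′_L1) × m_L, which follows from f = c·c; the chain-rule assembly at the end assumes a skew() helper and an available body-to-ECEF rotation.

```python
import numpy as np

def lateral_obs(P_c, P1, P2):
    # f = ||(P_c - P1) x m||^2 with m = P2 - P1, and its gradient
    # df/dP_c = 2 (m x c), c = (P_c - P1) x m   (eqs. (12)-(14)).
    m = P2 - P1
    c = np.cross(P_c - P1, m)
    f = c @ c                      # (d * ||m||)^2
    grad = 2.0 * np.cross(m, c)    # [M1, M2, M3]
    return f, grad

# Chain rule (18): blocks for position and attitude error, with camera
# lever arm l_b and body-to-ECEF rotation C_eb (assumed available):
#   dPc_dx = np.hstack([np.eye(3), skew(C_eb @ l_b)])
#   H_row  = grad @ dPc_dx
```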
5. Odometer-assisted GNSS/INS forward positioning constraint
Using a wheel odometer as a forward constraint rests on two assumptions: 1) the moving carrier does not sideslip and moves along the forward direction of the carrier coordinate system, i.e. the lateral velocity is zero; 2) the carrier stays on the ground surface without vertical undulation, i.e. the vertical velocity is zero.
The forward velocity obtained from the odometer is:

$$v^b_{SINS} = v^b_{DMI} - \omega^b_{eb} \times l^b \tag{20}$$

where $v^b_{SINS}$ is the odometer velocity converted to the inertial navigation center in the b frame, $v^b_{DMI}$ is the odometer velocity, which has a component only in the forward direction with the other two axes zero, $\omega^b_{eb}$ is the angular rate of the b frame relative to the Earth-fixed frame, and $l^b$ is the position vector of the odometer center in the inertial navigation coordinate system, i.e. the odometer lever arm.
From this formula the odometer constraint observation model is derived, with the position and velocity states referred to the SINS center. Applying the perturbation method gives:

$$\delta v^b = \hat{C}^b_e\, \delta v^e - \hat{C}^b_e \left( \hat{v}^e \times \right) \varphi + \left( l^b \times \right) \delta\omega^b_{eb} \tag{21}$$

and the derived observation equation is:

$$\hat{v}^b_{SINS} - v^b_{DMI} = \hat{C}^b_e\, \delta v^e - \hat{C}^b_e \left( \hat{v}^e \times \right) \varphi + \left( l^b \times \right) \varepsilon^b + \varepsilon_v \tag{22}$$
the above is a strictly derived observation model, but the model can be simplified according to actual situations.
When the vehicle drives straight, the inertial navigation system does not rotate relative to the ground, $\omega^b_{eb} \approx 0$, and the odometer velocity equals the forward velocity of the inertial navigation system. When the vehicle turns, $\omega^b_{eb} \times l^b \neq 0$: because the vehicle turns about the rear axle as a fixed axis, points off the rear-axle plane acquire a turning velocity, and a velocity component appears in the lateral direction (X axis), inconsistent with the non-holonomic constraint assumption. In practice, however, ground vehicles turn slowly, so this component is small and can be absorbed by slightly inflating the observation noise, and it can be assumed directly that $v_{SINS} = v_{DMI}$. The observation equation can then be simplified to:

$$\delta z_{odo} = \left( \hat{C}^b_e\, \hat{v}^e \right)_{(2)} - v_{DMI} = \left( \hat{C}^b_e \right)_{(2)} \delta v^e - \left( \hat{C}^b_e \left( \hat{v}^e \times \right) \right)_{(2)} \varphi + \varepsilon_v \tag{23}$$

where $\hat{v}^e$ and $\hat{C}^b_e$ are the values obtained from the mechanization at the current epoch, and the subscript (2) denotes taking the second row (the forward axis of the b frame).
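A sketch of the simplified odometer update of equation (23) follows; the right-forward-up body frame (forward = second axis) matches the text's "take the second row", while the sign of the attitude block depends on the attitude-error convention and is an assumption here.

```python
import numpy as np

def skew(v):
    # Cross-product matrix: skew(a) @ b == np.cross(a, b).
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def odometer_row(C_be, v_e_pred, v_dmi, fwd=1):
    # Innovation between the predicted body-frame forward velocity and the
    # wheel odometer, plus the velocity and attitude blocks of H (eq. (23)).
    dz = (C_be @ v_e_pred)[fwd] - v_dmi
    H_vel = C_be[fwd, :]                          # d(dz)/d(delta v_e)
    H_att = -(C_be @ skew(v_e_pred))[fwd, :]      # d(dz)/d(phi), sign per (23)
    return dz, H_vel, H_att
```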
6. Multi-source fusion observation update
The preceding steps yielded the lane line elevation and lateral constraint equations and the odometer forward constraint observation equation. The complete fusion positioning system dynamically adds the lane line constraint observation equations and the odometer observation equation on top of the differential GNSS/SINS tight combination, as shown in Fig. 6.
When the number of GNSS satellites is large and the PDOP value is small, the GNSS positioning geometry is satisfactory; the lane line and odometer observations can then be omitted, and the state equation remains unchanged.
When the number of GNSS satellites is small and lane line detection succeeds, the lane line lateral and elevation constraint equations are added to the observation equation.
In urban scenes with high-rise buildings lining both sides of the road, GNSS signals are severely obstructed, but obstruction along the road direction is lighter and GNSS satellites remain visible there. In this case the satellite above the carrier, i.e. the one with the highest elevation angle, is selected as the reference satellite and forms double-difference observations with the visible satellites along the road direction to participate in the positioning solution, while the odometer forward constraint is added. When the vehicle is under an overpass, in a tunnel or in similar scenes where GNSS satellites are completely invisible, the forward direction is constrained by the odometer alone.
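The scene-dependent switching just described can be summarized as a small decision routine; the satellite-count and PDOP thresholds below are illustrative assumptions, not values from the patent.

```python
def select_observations(n_sats, pdop, lane_detected):
    # Dynamic measurement selection mirroring Fig. 6.
    obs = []
    if n_sats >= 5 and pdop < 4.0:
        obs.append("gnss")                # geometry good: GNSS/INS only
    else:
        if n_sats > 0:
            obs.append("gnss_partial")    # highest-elevation + along-road sats
        obs.append("odometer")            # forward constraint
        if lane_detected:
            obs += ["lane_lateral", "lane_elevation"]
    return obs
```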
The complete observation equation for the GNSS/lane line constraint/odometer fusion is:

$$\begin{bmatrix} v_P \\ v_L \\ f - \hat{f} \\ \hat{d}_I - d_I \\ \delta z_{odo} \end{bmatrix} = \begin{bmatrix} H_P & 0 \\ H_L & \lambda \\ H_{lat} & 0 \\ H_{lane} & 0 \\ H_{odo} & 0 \end{bmatrix} \begin{bmatrix} \delta X_{SINS} \\ \delta N \end{bmatrix} + \begin{bmatrix} \varepsilon_P \\ \varepsilon_L \\ \varepsilon_f \\ \varepsilon_d \\ \varepsilon_v \end{bmatrix} \tag{24}$$

where $v_P$ and $v_L$ are the pseudorange and carrier-phase observation residuals, $H_P$ and $H_L$ contain the double-difference direction cosine vectors and the antenna lever-arm term, $H_{lat}$, $H_{lane}$ and $H_{odo}$ are the design matrix rows of equations (19), (11) and (23), λ is the carrier wavelength, δN the double-difference ambiguities, and ε the noise of each observation.
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.

Claims (8)

1. A GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method, characterized in that in the method images are collected by a vision sensor and a lane line map database is generated offline; when the carrier performs real-time navigation and positioning, a differential GNSS/INS tight combination serves as the basis; when GNSS is obstructed, the relative relation between the carrier and the lane lines is detected by the vision sensor mounted on the carrier, and lane line constraint observation equations are dynamically added, with the aid of the lane line map, to assist lateral and elevation positioning; differential observations formed from the observable satellites above the carrier and along the road direction participate in the tight combination, restraining forward error divergence of the carrier; the method specifically comprises the following steps:
step 1, preprocess the images collected in real time to obtain a top view of the lane line image, perform color space conversion to separate the lane lines, determine the lane line coordinates from sliding-window gray-histogram statistics, and finally track the lane lines with a Kalman filter based on a constant-velocity assumption to obtain continuous, smooth distances between the camera center and the left and right lane lines;
step 2, quickly search the minimum voxel containing the inertial navigation predicted position using an octree data structure, find the nearest lane line node by Euclidean distance, perform heading/topology consistency checks on the node, and, after the checks pass, return the best matches P_L1, P_L2, P_R1, P_R2 according to curvature;
step 3, according to the inertial navigation predicted position, construct the lane line elevation constraint observation equation under the assumption that, after correction for the vehicle height and the camera extrinsic parameters, the vehicle lies on the lane surface;
step 4, lift the four points P_L1, P_L2, P_R1, P_R2 to the height of the camera center, construct the lateral constraint observation equation between the camera center and the lane lines under the assumption that the predicted camera-to-lane-line distance equals the observed distance, and convert it to the inertial navigation center using the chain rule;
step 5, construct the odometer forward constraint observation equation under the assumptions that the carrier barely slips during straight-line motion and turns at low speed;
and step 6, realize the multi-source fusion observation update from the lane line elevation and lateral constraint equations and the odometer forward constraint observation equation.
2. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 1, wherein the lane line tracking in step 1 specifically comprises:
step 1.1, preprocessing: calibrate the monocular camera with the Matlab calibration toolbox to obtain the camera intrinsic parameters, and use them to correct distortion in the lane line images acquired by the camera; define a region of interest according to the camera's mounting position and field of view to eliminate interference from irrelevant areas, and restore the lane lines to a top view using a homography transformation;
step 1.2, lane line extraction: input the lane line orthoimage and binarize it by color and by threshold segmentation of edge information; divide a pixel window at the bottom of the image, compute a histogram of the window's gray values from left to right, smooth it with a Gaussian kernel to obtain a gray-mean curve, and extract the peak positions as the lane line positions;
step 1.3, lane line tracking: track the lane lines with a Kalman filter, using a constant-velocity model to relate consecutive epochs; the state equation is:

$$\begin{bmatrix} x_{k+1} \\ v_{k+1} \end{bmatrix} = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_k \\ v_k \end{bmatrix} + \begin{bmatrix} q_x \\ q_v \end{bmatrix}$$

where q_x and q_v are the process noise of the position x and the velocity v, and k is the epoch index;

the observation equation is:

$$y_{k+1} = x_{k+1} + \varepsilon$$

where y_{k+1} is the currently extracted lane line position and ε is the observation noise of the extracted lane line position;

after filtering, the position $\hat{x}_i$ of each window i is obtained;
perform quadratic curve fitting on the consecutive window positions to obtain continuous, smooth lane line extraction over the whole period, realizing smooth transition across dashed lane lines and zebra crossings; after the lane lines are successfully extracted, the x coordinates of the left and right lane lines at the image bottom in the camera coordinate system, i.e. the distances between the camera and the left and right lane lines, are obtained.
3. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 2, wherein the octree-assisted fast lane line search in step 2 specifically comprises:
step 2.1, nearest lane line node search: according to the inertial navigation predicted position, search the child-node voxel containing the predicted point in the spatial octree storage structure of the lane lines; after the voxel is found, traverse all lane line coordinate points in it and find the nearest point by Euclidean distance;
step 2.2, consistency judgment: obtain the carrier's direction of travel from the heading in the attitude at the inertial navigation predicted position, and check its consistency with the heading of the searched lane line node; if the difference between the two headings exceeds a threshold, the searched lane line node is wrong;
check the topological relation between the lane line node successfully searched at the previous epoch and the currently searched node; if the two nodes are not topologically adjacent, the searched lane line node is wrong;
otherwise, the searched lane line node is correct.
4. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 1, wherein the lane line elevation constraint observation equation constructed in step 3 is:

$$\hat{d}_I - d_I = H_{lane} \left( \delta r_e \;\; \delta v_e \;\; \varphi \;\; a_b \;\; \varepsilon_b \right)^T + \varepsilon_d$$

where $\hat{d}_I$ is the predicted distance, $d_I$ is a constant, $H_{lane}$ is the elevation constraint coefficient matrix, and $\delta r_e$, $\delta v_e$, $\varphi$, $a_b$, $\varepsilon_b$ denote the position error, velocity error, attitude error, accelerometer bias and gyroscope bias respectively; the elevation constraint coefficient matrix is expressed as:

$$H_{lane} = \begin{bmatrix} \dfrac{\left( A \;\; B \;\; C \right)}{\sqrt{A^2 + B^2 + C^2}} & 0_{1 \times 12} \end{bmatrix}$$

where A, B and C are the plane coefficients.
5. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 2, wherein the lateral constraint observation equation between the camera center and the lane lines constructed in step 4 is:

$$\begin{bmatrix} f_L - \hat{f}_L \\ f_R - \hat{f}_R \end{bmatrix} = \begin{bmatrix} \partial f_L / \partial \delta r_e & 0_{1\times 3} & \partial f_L / \partial \varphi & 0_{1\times 6} \\ \partial f_R / \partial \delta r_e & 0_{1\times 3} & \partial f_R / \partial \varphi & 0_{1\times 6} \end{bmatrix} \delta X + \varepsilon_f$$

where $f_L$, $f_R$ are the observed squared distances to the left and right lane lines, $\hat{f}_L$, $\hat{f}_R$ are the corresponding distances between the predicted position and the searched left and right lane lines, and the matrix entries are the partial derivatives of the observations with respect to position and attitude.
6. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 5, wherein the odometer forward constraint observation equation constructed in step 5 is:

$$\delta v^b = \hat{C}^b_e\, \delta v^e - \hat{C}^b_e \left( \hat{v}^e \times \right) \varphi + \left( l^b \times \right) \delta\omega^b_{eb}, \qquad \omega^b_{eb} = \omega^b_{ib} - \hat{C}^b_e\, \omega^e_{ie}$$

where $\delta v^b$ is the difference between the odometer observation and the inertial navigation prediction of the velocity in the b frame, $\hat{C}^b_e$ is the predicted rotation matrix from the ECEF frame to the b frame, $\hat{v}^e$ is the predicted velocity in the ECEF frame, $l^b$ is the lever arm between the odometer and the inertial navigation, and $\omega^e_{ie}$ is the Earth rotation angular velocity.
7. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 6, wherein in step 6, after the GNSS/lane line constraint/odometer fusion, the obtained observation equation is:

$$\begin{bmatrix} v_P \\ v_L \\ f - \hat{f} \\ \hat{d}_I - d_I \\ \delta z_{odo} \end{bmatrix} = \begin{bmatrix} H_P & 0 \\ H_L & \lambda \\ H_{lat} & 0 \\ H_{lane} & 0 \\ H_{odo} & 0 \end{bmatrix} \begin{bmatrix} \delta X_{SINS} \\ \delta N \end{bmatrix} + \begin{bmatrix} \varepsilon_P \\ \varepsilon_L \\ \varepsilon_f \\ \varepsilon_d \\ \varepsilon_v \end{bmatrix}$$

where $v_P$ and $v_L$ are the pseudorange and carrier-phase observation residuals, $H_P$ and $H_L$ contain the direction cosine vectors of the between-station, between-satellite double-difference observations together with the lever-arm term between the antenna and the inertial navigation in the ECEF frame, $\delta N$ is the double-difference ambiguity state parameter with carrier wavelength $\lambda$, and $\varepsilon_P$, $\varepsilon_L$, $\varepsilon_f$, $\varepsilon_d$, $\varepsilon_v$ respectively denote the observation noise of the pseudorange, the carrier phase, the lane line lateral observation, the lane line elevation observation and the odometer observation.
8. The GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method according to claim 1, wherein the multi-source fusion in step 6 specifically comprises:
dynamically selecting whether to add the lane line and odometer constraints: when GNSS observation conditions are satisfactory, positioning with GNSS/INS only; when GNSS is obstructed or interrupted, adding the odometer constraint, and deciding whether to add the lane line constraint according to the lane line detection status.
CN201910659041.4A 2019-07-22 2019-07-22 GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method Active CN110411462B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910659041.4A CN110411462B (en) 2019-07-22 2019-07-22 GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910659041.4A CN110411462B (en) 2019-07-22 2019-07-22 GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method

Publications (2)

Publication Number Publication Date
CN110411462A CN110411462A (en) 2019-11-05
CN110411462B true CN110411462B (en) 2021-05-18

Family

ID=68362292

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910659041.4A Active CN110411462B (en) 2019-07-22 2019-07-22 GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method

Country Status (1)

Country Link
CN (1) CN110411462B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110866482B (en) * 2019-11-08 2022-09-16 广东工业大学 Dynamic selection method, device and equipment for odometer data source
CN110888440A (en) * 2019-11-28 2020-03-17 山东三木环保工程有限公司 Rail vehicle door alignment system and method combining GNSS satellite positioning and shielding plate
CN111026081B (en) * 2019-12-10 2021-05-11 苏州智加科技有限公司 Error calculation method, device, equipment and storage medium
CN112945266A (en) * 2019-12-10 2021-06-11 炬星科技(深圳)有限公司 Laser navigation robot and odometer calibration method thereof
CN111274343B (en) * 2020-01-20 2023-11-24 阿波罗智能技术(北京)有限公司 Vehicle positioning method and device, electronic equipment and storage medium
CN111272165B (en) * 2020-02-27 2020-10-30 清华大学 Intelligent vehicle positioning method based on characteristic point calibration
CN111380516B (en) * 2020-02-27 2022-04-08 上海交通大学 Inertial navigation/odometer vehicle combined navigation method and system based on odometer measurement information
CN112581795B (en) * 2020-12-16 2022-04-29 东南大学 Video-based real-time early warning method and system for ship bridge and ship-to-ship collision
CN112596089B (en) * 2021-03-02 2021-05-18 腾讯科技(深圳)有限公司 Fusion positioning method and device, electronic equipment and storage medium
CN114136315B (en) * 2021-11-30 2024-04-16 山东天星北斗信息科技有限公司 Monocular vision-based auxiliary inertial integrated navigation method and system
CN114646992A (en) * 2022-03-21 2022-06-21 腾讯科技(深圳)有限公司 Positioning method, positioning device, computer equipment, storage medium and computer program product
TWI814480B (en) * 2022-07-11 2023-09-01 新馳科技股份有限公司 Vehicle positioning system and vehicle positioning method
CN116129389B (en) * 2023-03-27 2023-07-21 浙江零跑科技股份有限公司 Lane line acquisition method, computer equipment, readable storage medium and motor vehicle
CN116540286B (en) * 2023-07-06 2023-08-29 中国科学院空天信息创新研究院 Multi-source robust fusion positioning method based on non-integrity constraint
CN116642501B (en) * 2023-07-25 2023-09-29 齐鲁空天信息研究院 Multi-source fusion method for auxiliary positioning of lane lines with inertia as core

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107229063A (en) * 2017-06-26 2017-10-03 奇瑞汽车股份有限公司 A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry
CN107728175A (en) * 2017-09-26 2018-02-23 南京航空航天大学 The automatic driving vehicle navigation and positioning accuracy antidote merged based on GNSS and VO
US10151588B1 (en) * 2016-09-28 2018-12-11 Near Earth Autonomy, Inc. Determining position and orientation for aerial vehicle in GNSS-denied situations
CN109405824A (en) * 2018-09-05 2019-03-01 武汉契友科技股份有限公司 A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10151588B1 (en) * 2016-09-28 2018-12-11 Near Earth Autonomy, Inc. Determining position and orientation for aerial vehicle in GNSS-denied situations
CN107229063A (en) * 2017-06-26 2017-10-03 奇瑞汽车股份有限公司 A kind of pilotless automobile navigation and positioning accuracy antidote merged based on GNSS and visual odometry
CN107728175A (en) * 2017-09-26 2018-02-23 南京航空航天大学 The automatic driving vehicle navigation and positioning accuracy antidote merged based on GNSS and VO
CN109405824A (en) * 2018-09-05 2019-03-01 武汉契友科技股份有限公司 A kind of multi-source perceptual positioning system suitable for intelligent network connection automobile

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Analysis of vision-fused integrated navigation technology for intelligent vehicles; Zeng Qingxi et al.; Journal of Navigation and Positioning; 2017-06-30; sections 2.2-2.3 *

Also Published As

Publication number Publication date
CN110411462A (en) 2019-11-05

Similar Documents

Publication Publication Date Title
CN110411462B (en) GNSS/inertial navigation/lane line constraint/odometer multi-source fusion method
US11002859B1 (en) Intelligent vehicle positioning method based on feature point calibration
CN111551958B (en) Mining area unmanned high-precision map manufacturing method
CN110412635B (en) GNSS/SINS/visual tight combination method under environment beacon support
Ding et al. Lidar inertial odometry aided robust lidar localization system in changing city scenes
CN107782321B (en) Combined navigation method based on vision and high-precision map lane line constraint
Rose et al. An integrated vehicle navigation system utilizing lane-detection and lateral position estimation systems in difficult environments for GPS
Tao et al. Lane marking aided vehicle localization
JP2022019642A (en) Positioning method and device based upon multi-sensor combination
JP5162849B2 (en) Fixed point position recorder
CN107015238A (en) Unmanned vehicle autonomic positioning method based on three-dimensional laser radar
CN114199259B (en) Multi-source fusion navigation positioning method based on motion state and environment perception
CN106908775A (en) A kind of unmanned vehicle real-time location method based on laser reflection intensity
CN111426320B (en) Vehicle autonomous navigation method based on image matching/inertial navigation/odometer
CN114199240B (en) Two-dimensional code, laser radar and IMU fusion positioning system and method without GPS signal
CN114526745B (en) Drawing construction method and system for tightly coupled laser radar and inertial odometer
CN113654555A (en) Automatic driving vehicle high-precision positioning method based on multi-sensor data fusion
Guo et al. Automatic lane-level map generation for advanced driver assistance systems using low-cost sensors
CN114019552A (en) Bayesian multi-sensor error constraint-based location reliability optimization method
Xiao et al. LIO-vehicle: A tightly-coupled vehicle dynamics extension of LiDAR inertial odometry
CN115930977A (en) Method and system for positioning characteristic degradation scene, electronic equipment and readable storage medium
He et al. Research on GNSS INS & GNSS/INS Integrated Navigation Method for Autonomous Vehicles: A Survey
CN116929363A (en) Mining vehicle autonomous navigation method based on passable map
Deusch et al. Improving localization in digital maps with grid maps
CN113777589B (en) LIDAR and GPS/IMU combined calibration method based on point characteristics

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant