CN114889606B - Low-cost high-precision positioning method based on multi-sensor fusion - Google Patents


Info

Publication number
CN114889606B
Authority
CN
China
Prior art keywords
lane
vehicle
line
road
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210471531.3A
Other languages
Chinese (zh)
Other versions
CN114889606A (en)
Inventor
何科
丁海涛
许男
郭孔辉
张建伟
李玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jilin University
Original Assignee
Jilin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jilin University
Priority to CN202210471531.3A
Publication of CN114889606A
Application granted
Publication of CN114889606B
Legal status: Active
Anticipated expiration

Classifications

    • B60W30/18163 Lane change; Overtaking manoeuvres
    • B60W40/02 Estimation of driving parameters related to ambient conditions
    • B60W40/06 Road conditions
    • B60W40/10 Estimation of driving parameters related to vehicle motion
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit
    • B60W60/0015 Planning or execution of driving tasks specially adapted for safety
    • B60W2050/0005 Processor details or data handling, e.g. memory registers or chip architecture
    • G01C21/28 Navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/343 Calculating itineraries using a global route restraint
    • G01C21/3446 Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • Y02T10/40 Engine management systems

Abstract

The invention provides a low-cost high-precision positioning method based on multi-sensor fusion, comprising the steps of map model creation; fusion of GPS, wheel odometry and chassis signals; lane-change identification; map matching; and camera-based auxiliary positioning. The invention first proposes a robust lane-change identification algorithm and, combined with it, designs an improved multi-index weighted-evaluation map-matching algorithm and a method for determining the left and right lane boundary points. On the basis of the accurately matched lane and boundary points, a camera-based auxiliary positioning method is proposed, which improves the lateral accuracy of the vehicle relative to the lane. The invention needs no high-cost equipment such as lidar; compared with existing low-cost fusion positioning technology, its positioning accuracy is higher and its stability stronger, providing a low-cost, high-precision solution for autonomous-driving positioning.

Description

Low-cost high-precision positioning method based on multi-sensor fusion
Technical Field
The invention relates to a positioning method in the field of automatic driving, in particular to a low-cost high-precision positioning method based on multi-sensor fusion.
Background
The architecture of autonomous driving can be described in terms of five systems: positioning, sensing, planning, control, and system management. Modules such as perception, planning and control require accurate knowledge of the vehicle's location in order to make correct driving decisions and take correct actions. An error of even a few decimeters can place the vehicle in the wrong lane, leading to wrong driving behavior and potentially causing a traffic accident. Automatic driving therefore requires a robust and accurate positioning system, with accuracy at the decimeter or even centimeter level.
The Global Positioning System (GPS) provides a convenient solution for global positioning and is the most commonly used positioning system in vehicle applications. However, standalone GPS offers only low positioning accuracy (about 10 m), is affected by signal blocking, multipath and other factors, and has poor reliability, so it cannot meet the requirements of autonomous-vehicle positioning on its own. GPS is therefore typically integrated with an Inertial Measurement Unit (IMU), Real-Time Kinematic (RTK) corrections, and the like to achieve higher accuracy; however, high-precision positioning then requires expensive IMU or RTK hardware. Some researchers use lidar to create a feature map of the environment and locate the vehicle within it. While lidar can provide accurate and stable positioning, its drawbacks are high power consumption, high computational cost and high implementation cost, which hinder the mass production and marketing of autonomous vehicles.
With the progress of computer vision, the camera, a low-cost environmental perception sensor, has become an indispensable perception device for automatic driving. Researchers have proposed a camera-only low-cost positioning method that first estimates an approximate position by dividing the image into a grid and extracting a direction histogram for each cell, and then positions the vehicle against a map of environmental landmarks, but its accuracy is low. Another camera-based method is visual odometry, which incrementally computes the vehicle pose by tracking feature points across time frames in the left and right cameras and comparing their relative motion and direction to perform relative positioning; however, visual odometry suffers from severe drift error, especially in open road scenes.
To further improve camera-based positioning, researchers fuse the relative lateral distance between a camera-identified lane line and the camera (hereinafter the line-side distance) with other sensors. As the vehicle travels, both the lane it occupies and the boundary points on its left and right lane lines change, so the camera measurements can only be used effectively for positioning if the current lane and boundary points are accurately known.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a low-cost high-precision positioning method based on multi-sensor fusion, comprising the following steps:
step 1, establishing a map model:
because a high-precision map contains a large amount of complex detail and redundant information, building a wide-area high-precision map is time-consuming, labor-intensive and expensive; to save cost, a lightweight lane-level map model is established instead;
step 2, fusion of GPS, wheel odometry and chassis signals:
(1) Fuse GPS, wheel odometry and chassis signals with an extended Kalman filter (EKF), and correct the state variables using the wheel-odometry measurements;
(2) Mine the StandStill signal of the vehicle chassis to apply kinematic constraints, including zero-velocity update and zero-yaw-rate constraint;
step 3, lane-change identification:
A robust lane-change identification method is adopted, identifying lane changes through multiple criteria: when lane lines can be identified with good quality, lane changes are judged directly by the C_0 criterion; when lane lines can be identified but their quality is poor, or lane lines cannot be identified, a comprehensive judgment is made using the yaw-rate, heading-angle and lane-change-duration criteria;
step 4, map matching:
step 4.1, road-intersection matching:
in the running process of the vehicle, three position states are distinguished: inside an intersection, on a road, and leaving an intersection to enter a road; when the vehicle is on a road, the in-road lane matching method determines the lane; when the vehicle leaves an intersection and enters a road, the camera-based auxiliary positioning method is started promptly;
step 4.2, lane matching in the road:
when the vehicle is identified as driving within a road, candidate lanes are first searched within a certain range centered on the current vehicle position, and the candidate lanes are then comprehensively evaluated with a plurality of evaluation functions; the comprehensive score of each candidate lane is calculated, and the lane with the highest score is selected as the matching result;
step 4.3, determining left and right boundary points:
when the vehicle leaves an intersection and enters a road, the start point of the current lane is taken as the driving start point of the vehicle, and the initial left and right boundary points are determined from it; lane-change identification runs while the vehicle drives: if no lane change occurs, the driving distance is calculated from the wheel revolutions of the wheel odometer, compared with the spacing of the discrete lane-line points, and the boundary points are updated from the initial boundary points; when a lane change occurs, the lane is re-matched with the in-road lane matching method, the driving start point and the initial left and right boundary points are re-determined on the new lane, and the left and right boundary points are then updated, as the vehicle continues along the lane, in the same way as when no lane change occurs;
step 5, camera-based auxiliary positioning:
performing camera-based auxiliary positioning on the basis of obtaining the lane where the vehicle is located and the left and right boundary points;
in past research, the line-side distance is often merged into the EKF for positioning, but the EKF is a linear approximation of a nonlinear system and incurs a loss of accuracy. Addressing this problem, the invention provides a new camera-based auxiliary positioning method: compute the intersection points of the straight line that passes through the vehicle rear-axle center perpendicular to the heading angle θ with the left and right boundary lines, together with the distances from the vehicle to the two boundary lines; then determine the final positioning coordinate from the rear-axle center coordinate obtained from the left-boundary-line fix and the rear-axle center coordinate obtained from the right-boundary-line fix.
Further, in step 1, the invention establishes a lightweight lane-level map model, and a single lane l is represented as:
l={id,pre,suc,p c ,p l ,p r ,q} (1)
wherein id represents the serial number of the lane l, the serial number on the left side is sequentially increased from 1 and sequentially decreased from-1 from the center line of the road; pre represents an intersection or a lane connected to a lane start point; suc is defined as an intersection or a lane connected with the end of the lane; p is a radical of c ,p l And p r Respectively representing a point set of a lane central line, a left lane line and a right lane line; q contains attributes corresponding to lanes, including width, speed limit, and types of left and right lane lines;
wherein p is l And p r Expressed as:
Figure BDA0003622655330000041
Figure BDA0003622655330000042
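For illustration, the lane element of formula (1) maps naturally onto a small record type. The following Python sketch shows one possible encoding of the lightweight map model; all field names and types here are illustrative assumptions, not prescribed by the invention:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]          # (x, y) coordinates in the map frame

@dataclass
class Lane:
    """One lane l = {id, pre, suc, p_c, p_l, p_r, q} of the lightweight map model."""
    id: int                          # signed index: 1, 2, ... left of the road center line; -1, -2, ... right
    pre: List[str]                   # intersections/lanes connected to the lane start point
    suc: List[str]                   # intersections/lanes connected to the lane end point
    p_c: List[Point]                 # discretized center-line points
    p_l: List[Point]                 # discretized left lane-line points
    p_r: List[Point]                 # discretized right lane-line points
    q: dict = field(default_factory=dict)  # attributes: width, speed limit, line types
```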
Further, in step 2 (1), GPS, wheel odometry and chassis signals are fused based on the extended Kalman filter (EKF), and the state variables are corrected using the wheel-odometry measurements; the state vector is expressed as:

x = [p_x, p_y, θ, ω]^T   (4)

where p_x and p_y are the horizontal and vertical coordinates of the vehicle in the ENU coordinate system, θ is the heading angle of the vehicle in the ENU coordinate system, and ω is the yaw rate of the vehicle;

the state transition equation is:

x' = Fx + Bu + λ   (5)

where the control input Bu is 0 and λ is the process noise;

the invention uses a two-wheel odometer model; with the vehicle sideslip angle denoted β and the sampling period T, the state transition equations are:

p_{x,k} = p_{x,k-1} + v_{k-1} T cos(θ_{k-1} + β)   (6)

p_{y,k} = p_{y,k-1} + v_{k-1} T sin(θ_{k-1} + β)   (7)

θ_k = θ_{k-1} + ω_{k-1} T   (8)

ω_k = ω_{k-1}   (9)

under normal driving and good weather conditions, wheel slip is negligible and the speed is calculated as:

v = (n_RL·c_RL + n_RR·c_RR)/2   (10)

where n_RL and n_RR are the revolutions per second of the left and right rear wheels, and c_RL and c_RR are the actual left and right wheel circumferences; the yaw rate ω is provided by the chassis ESC;

the state transition matrix F (the Jacobian of the model above) is expressed as:

F = [ 1  0  -v_{k-1} T sin(θ_{k-1} + β)  0
      0  1   v_{k-1} T cos(θ_{k-1} + β)  0
      0  0   1                           T
      0  0   0                           1 ]   (11)

when the GPS signal is good, the observation matrix established for the GPS measurement is:

H_GPS = [ 1  0  0  0
          0  1  0  0 ]   (12)

the yaw-rate observation established from the wheel-odometry signals is:

ω_odo = (n_RR·c_RR - n_RL·c_RL)/t_R   (13)

where t_R is the rear track width (the distance between the rear wheels); the corresponding observation matrix is:

H = (0 0 0 1)   (14);
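A minimal Python sketch of this fusion step follows, using the state vector (4), the odometer model (6)-(9) and the observation models (12)-(14); the noise covariances Q and R, and all function names, are assumptions made for illustration:

```python
import numpy as np

def wheel_speed(n_RL, n_RR, c_RL, c_RR):
    """Vehicle speed from rear-wheel revolutions per second, eq. (10)."""
    return (n_RL * c_RL + n_RR * c_RR) / 2.0

def predict(x, P, v, beta, T, Q):
    """EKF prediction with the two-wheel odometer model, eqs. (6)-(9) and (11)."""
    px, py, th, om = x
    x_pred = np.array([px + v * T * np.cos(th + beta),
                       py + v * T * np.sin(th + beta),
                       th + om * T,
                       om])
    F = np.array([[1, 0, -v * T * np.sin(th + beta), 0],
                  [0, 1,  v * T * np.cos(th + beta), 0],
                  [0, 0, 1, T],
                  [0, 0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, H, R):
    """Generic EKF measurement update, applied for GPS (12) and odometry yaw rate (13)-(14)."""
    y = z - H @ x                                     # innovation
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

H_gps = np.array([[1.0, 0.0, 0.0, 0.0],               # GPS observes p_x, p_y, eq. (12)
                  [0.0, 1.0, 0.0, 0.0]])
H_odo = np.array([[0.0, 0.0, 0.0, 1.0]])              # odometry yaw-rate observation, eq. (14)
```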
In step 2 (2), the StandStill signal of the vehicle chassis is mined to apply kinematic constraints, including zero-velocity update and zero-yaw-rate constraint; StandStill = 0 indicates that the vehicle is moving, and StandStill = 1 indicates that it is stationary.

When StandStill = 1 the vehicle is stationary, so the vehicle speed and yaw rate are both 0, which gives:

p_{x,k} = p_{x,k-1}   (15)

p_{y,k} = p_{y,k-1}   (16)

θ_k = θ_{k-1}   (17)

ω_k = 0   (18)

in this case, F is:

F = [ 1  0  0  0
      0  1  0  0
      0  0  1  0
      0  0  0  0 ]   (19)
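The standstill constraint folds into the same filter. A sketch reusing the predict step above; whether process noise is injected while at rest is a design choice assumed here:

```python
import numpy as np

def standstill_predict(x, P):
    """Propagation while StandStill == 1: pose held, yaw rate forced to 0, eqs. (15)-(19)."""
    F = np.diag([1.0, 1.0, 1.0, 0.0])       # eq. (19)
    return F @ x, F @ P @ F.T               # no process noise injected at rest (assumed)

def step(x, P, standstill, v, beta, T, Q):
    """One filter step: apply the zero-velocity / zero-yaw-rate constraint when stationary."""
    if standstill:                          # StandStill == 1: vehicle stationary
        return standstill_predict(x, P)
    return predict(x, P, v, beta, T, Q)     # normal odometer-model prediction (sketched above)
```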
Further, in step 3 the invention provides a robust lane-change identification method: lane changes are identified by establishing multiple criteria. When lane lines can be identified with good quality, lane changes are judged directly by the C_0 criterion; when lane lines can be identified but their quality is poor, or lane lines cannot be identified, a comprehensive judgment is made using the yaw-rate, heading-angle and lane-change-duration criteria, so that lane changes can still be identified when the lane lines are occluded or worn.
The C_0 criterion is as follows:

Assuming the origin of the camera coordinate system is calibrated at the center of the vehicle rear axle, C_0 represents the line-side distance; the distances from the actual rear-axle center of the vehicle to the left and right lane lines are C_0^l and C_0^r respectively, where the left lane-line value C_0^l is greater than 0 and the right lane-line value C_0^r is less than 0.

When the vehicle changes lanes across a lane line to the left or to the right, C_0^l gradually decreases toward 0 and C_0^r gradually increases toward 0. Let the left and right lane-line values at time t be C_0^l(t) and C_0^r(t), and let ξ be the judgment threshold; the lane-change judgment condition is:

|C_0^l(t)| < ξ  or  |C_0^r(t)| < ξ   (20)

After the lane-change state is recognized, detection of the lane-change end state starts; a threshold ΔC_0 is set on the change of C_0, and when condition (21) is satisfied the lane change is judged to be finished (the line-side distances jump as the camera switches to the lane lines of the new lane):

|C_0^l(t) - C_0^l(t-1)| > ΔC_0  or  |C_0^r(t) - C_0^r(t-1)| > ΔC_0   (21)
When the C_0 criterion cannot be applied, a comprehensive judgment is made using the yaw-rate, heading-angle and lane-change-duration criteria;

the yaw-rate criterion is as follows:

the change in yaw rate is used to identify whether the vehicle is changing lanes, by observing the yaw rate over a predefined window of length M: during a lane change, the yaw-rate curve passes through one peak and one trough; for a left lane change the peak comes first and then the trough, and for a right lane change the trough comes first and then the peak;

the heading-angle criterion is as follows:

if the yaw-rate criterion is satisfied, the vehicle may be changing lanes, but may also be driving on a curve; using a sliding window slightly longer than the window length M, the start-to-end heading change over the detection period is δ = |θ_end - θ_start|; if δ ≈ 0, the vehicle changed lanes on a straight road; otherwise it is driving along a curve;

the lane-change-duration criterion is as follows:

let the duration of a complete lane change be T_LC; if the time from the yaw-rate peak to the trough exceeds T_LC, the vehicle is judged to be driving along a curve rather than changing lanes.
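A sketch of how these criteria combine: the C_0 test implements eq. (20), while the fallback combines the yaw-rate shape, heading-change and duration tests. The heading tolerance and the exact window handling are illustrative assumptions:

```python
import numpy as np

def lane_change_by_c0(c0_left, c0_right, xi=0.3):
    """C_0 criterion, eq. (20): the vehicle is about to cross a lane line."""
    return abs(c0_left) < xi or abs(c0_right) < xi

def lane_change_by_motion(yaw_rate, heading, M, T_LC, dt, delta_tol=np.deg2rad(3.0)):
    """Fallback criteria when lane lines are poor: yaw-rate shape, heading change, duration.
    Returns 'left', 'right', or None; delta_tol is an assumed tolerance for delta ~ 0."""
    w = np.asarray(yaw_rate[-M:])
    i_peak, i_trough = int(np.argmax(w)), int(np.argmin(w))
    has_peak_and_trough = w[i_peak] > 0.0 > w[i_trough]
    delta = abs(heading[-1] - heading[-(M + 1)])         # start-to-end heading change
    duration_ok = abs(i_peak - i_trough) * dt <= T_LC    # longer swings indicate a curve
    if not (has_peak_and_trough and duration_ok and delta < delta_tol):
        return None
    return "left" if i_peak < i_trough else "right"      # peak first: left; trough first: right
```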
Further, in step 4.1, the road-intersection matching process is as follows:
S01, start;
S02, perform point-to-point matching and take the closest map point as the matching point;
S03, judge whether the matching point lies inside an intersection; if so, go to S04, otherwise go to S05;
S04, if the current camera identifies no lane line, judge that the vehicle is inside the intersection and go to S08; otherwise go to S05;
S05, judge whether the vehicle was at an intersection at the previous moment; if so, go to S06; otherwise perform in-road lane matching and go to S08;
S06, judge whether the road of the current matching point is a successor road of the intersection; if so, go to S07, otherwise go to S08;
S07, judge whether the current camera can identify lane lines; if so, judge that the vehicle is leaving the intersection and entering the road; in either case go to S08;
S08, matching is finished.
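Steps S01-S08 amount to a small state machine. A sketch under the assumption of hypothetical map helpers closest_point, in_intersection and is_successor (none of these names come from the patent):

```python
def match_road_intersection(pos, map_, prev_state, prev_intersection, lanes_visible):
    """Road-intersection matching, steps S01-S08; the map helpers are illustrative names."""
    point = map_.closest_point(pos)                           # S02: closest map point
    if map_.in_intersection(point) and not lanes_visible:     # S03-S04
        return "in_intersection"
    if prev_state == "in_intersection":                       # S05: was at an intersection before
        if map_.is_successor(point.road, prev_intersection) and lanes_visible:  # S06-S07
            return "leaving_intersection"                     # start camera-aided positioning
        return "in_intersection"                              # no decision yet, keep previous state
    return "on_road"                                          # proceed to in-road lane matching
```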
Further, in the step 4.2 of matching the lanes in the road, when the vehicle runs in the road, the result of matching the lane and the adjacent lane back and forth occurs due to positioning errors when the lane is changed to be close to the lane line. The invention provides an improved multi-index weighted evaluation map matching method by combining the proposed lane change identification method, and the accuracy and the robustness of lane matching in a road are improved.
The plurality of evaluation functions include:
(1) For distance-based evaluation, the evaluation function is set as:
Figure BDA0003622655330000081
wherein d is the distance from the positioning coordinate to the center line of the candidate lane;
(2) Evaluation based on topological relation: setting an evaluation function as F (link), if the current matching lane and the last matching lane have a topological connection relation or are the same lane, F (link) =1, otherwise F (link) =0;
(3) Evaluation based on heading: setting the heading angle of the center line of the candidate lane as theta 1 The heading angle of the vehicle is theta 2 Considering that the heading angle of the vehicle and the direction of the center line of the curve are changed and the vehicle is not necessarily level with the lane when running, the constructed evaluation function is as follows:
Figure BDA0003622655330000082
(4) Evaluation based on lane line type: setting the total number of the lane lines identified by the camera to be M, comparing the lane line type identified by the camera with the lane line type at the corresponding position on the map, recording the number of the lane lines with the same type as N, weakening the evaluation based on the lane line type by using a coefficient in consideration of the positioning uncertainty and the possible change of the lane line type of the same lane, and taking the evaluation function as follows:
Figure BDA0003622655330000091
(5) Evaluation based on lane change recognition: setting the evaluation function as F (lc), and setting the lane l at the above one moment cur And left and right adjacent lanes l left And l right Three lanes are taken as examples, and when lane change is detected:
Figure BDA0003622655330000092
the comprehensive score is as follows:
F sum =F(link)gF(θ)g(F(d)+F(lanetype)+F(lc)) (26)。
and calculating the comprehensive score of each candidate lane, and selecting the lane with the highest comprehensive score as a matching result.
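A sketch of the composite scoring of formula (26); because equations (22)-(25) are images in the source, the individual evaluation functions below are plausible stand-ins (for example, an exponential decay for F(d)), not the patented forms, and the lane helpers are hypothetical:

```python
import math

def score_lane(cand, prev_lane, vehicle_heading, d, types_cam, types_map, lc_direction):
    """Composite score F_sum = F(link)·F(θ)·(F(d) + F(lanetype) + F(lc)), formula (26)."""
    f_link = 1.0 if (cand is prev_lane or cand.connected_to(prev_lane)) else 0.0
    f_theta = max(0.0, math.cos(cand.heading - vehicle_heading))   # stand-in for eq. (23)
    f_d = math.exp(-d)                                             # stand-in for eq. (22)
    n_same = sum(a == b for a, b in zip(types_cam, types_map))
    f_type = 0.5 * n_same / max(len(types_cam), 1)                 # stand-in for eq. (24)
    f_lc = 1.0 if (lc_direction and cand is prev_lane.neighbor(lc_direction)) else 0.0  # eq. (25) stand-in
    return f_link * f_theta * (f_d + f_type + f_lc)
```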
Further, in the step 5 of camera-based auxiliary positioning, the lane line model is:
y=C 3 x 3 +C 2 x 2 +C 1 x+C 0 (27)
wherein C 0 、C 1 、C 2 And C 3 Respectively representing line side distance, gradient, curvature and curvature derivative;
let the rear axle center of the vehicle be (x) M ,y M ) Taking the coordinate system as the origin of a vehicle coordinate system, wherein the horizontal axis points to the front of the vehicle, and the longitudinal axis points to the left side of the vehicle; camera coordinate system R C Is point C (x) C ,y C ) The horizontal and vertical axis directions are the same as the vehicle coordinate system; camera coordinate system R C Is point C (x) C ,y C ) Calibrated at the rear axle center p of the vehicle M (x M ,y M ) The left boundary point of the vehicle is
Figure BDA0003622655330000093
And &>
Figure BDA0003622655330000094
Right boundary point is->
Figure BDA0003622655330000095
And &>
Figure BDA0003622655330000096
Passing rear axle center p M (x M ,y M ) The intersection points of the straight line perpendicular to the course angle theta direction and the left boundary line and the right boundary line are respectively p L (x L ,y L )、p R (x R ,y R ) (ii) a The distances from the vehicle positioning coordinate to the left and right boundary lines are d l And d r Setting the actual rear axle center position of the vehicle as p real The line side distances of the vehicle relative to the left and right boundary lines measured by the camera are respectively
Figure BDA0003622655330000097
And &>
Figure BDA0003622655330000098
From the rear axle centre p M Starting, the equation of a straight line with the direction perpendicular to the heading angle theta direction is expressed as:
Figure BDA0003622655330000101
finding the coordinates p of the intersection L And p R After d, d l And d r Expressed as:
Figure BDA0003622655330000102
/>
Figure BDA0003622655330000103
the center coordinate of the rear axle of the vehicle after being positioned by the left boundary line is (x) M L ,y M L ):
Figure BDA0003622655330000104
The center coordinate of the rear axle of the vehicle after being positioned by the right boundary line is (x) M R ,y M R ):
Figure BDA0003622655330000105
Using the average of the left and right lane line fixes as the final fix coordinate (x) M ',y M ');
Considering that the line side distance has a certain error, the error is close to a constant value and is set as delta d cam Final positioning coordinates are shown in formula (33):
Figure BDA0003622655330000106
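The geometry of eqs. (31)-(33) condenses to a few lines. The following sketch assumes p_L and p_R have already been found, and subtracting Δd_cam from the measured line-side distances is an assumed form of the bias compensation, since equation (33) is an image in the source:

```python
import numpy as np

def camera_fix(theta, p_L, p_R, c0_left, c0_right, dd_cam=0.2):
    """Camera-aided fix: reposition the rear-axle center from each boundary line
    (eqs. 31-32) and average the two fixes (eq. 33)."""
    d_left = abs(c0_left) - dd_cam                       # assumed bias compensation
    d_right = abs(c0_right) - dd_cam
    n_left = np.array([-np.sin(theta), np.cos(theta)])   # unit vector to the vehicle's left
    fix_from_left = np.asarray(p_L) - d_left * n_left    # eq. (31)
    fix_from_right = np.asarray(p_R) + d_right * n_left  # eq. (32)
    return (fix_from_left + fix_from_right) / 2.0        # eq. (33): averaged final coordinate
```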
The invention has the following beneficial effects:

Existing research that uses the line-side distance to position the vehicle laterally relative to the lane lines suffers positioning errors when the lane, or the left and right boundary points, are matched incorrectly. Addressing this, the invention first proposes a robust lane-change identification algorithm and, combined with it, designs an improved multi-index weighted-evaluation map-matching algorithm and a method for determining the left and right lane boundary points. On the basis of the accurately matched lane and boundary points, a camera-based auxiliary positioning method is proposed, which improves the lateral accuracy of the vehicle relative to the lane. The invention needs no high-cost equipment such as lidar; compared with existing low-cost fusion positioning technology, its positioning accuracy is higher and its stability stronger, providing a low-cost, high-precision solution for autonomous-driving positioning.
Drawings
FIG. 1 is a schematic view of the lane occupied by the vehicle and of the left and right lane-line boundary points while the vehicle is traveling; the projection of the vehicle rear-axle center onto the left lane line lies between the points p_l^i and p_l^{i+1}, referred to as the left boundary points for short, whose connecting line is the left boundary line; similarly, the right boundary points are p_r^j and p_r^{j+1}, whose connecting line is the right boundary line;
FIG. 2 is a schematic overall framework of the present invention;
FIG. 3 is a schematic view of a two-wheel odometer model of the invention;
FIG. 4 is a schematic view of a lane change identification process according to the present invention;
FIG. 5 is a schematic diagram of the yaw rate and of the lateral displacement relative to the lane during the experimental lane changes; (a) left lane change, (b) right lane change;
FIG. 6 is a schematic flow chart of a road-intersection matching method of the present invention;
FIG. 7 is a schematic flow chart of a lane left and right boundary point determination method according to the present invention;
FIG. 8 is a schematic view of camera-based assisted positioning according to the present invention;
FIG. 9 is a comparison of the lateral error during road driving in the experimental verification; (a) lateral error of the method of the invention, (b) lateral error of the method in the literature.
Detailed Description
The lane occupied by the vehicle and the left and right lane-line boundary points while the vehicle is traveling are shown in fig. 1. The projection of the vehicle rear-axle center onto the left lane line lies between the points p_l^i and p_l^{i+1}, referred to as the left boundary points for short; their connecting line is the left boundary line. Similarly, the right boundary points are p_r^j and p_r^{j+1}, and their connecting line is the right boundary line.
As shown in FIG. 2, the invention provides a low-cost high-precision positioning method based on multi-sensor fusion, which comprises the following steps:
step 1, creating a map model:
The invention establishes a lightweight lane-level map model; a single lane l is expressed as:

l = {id, pre, suc, p_c, p_l, p_r, q}   (1)

where id represents the serial number of lane l: starting from the road center line, the numbers on the left increase from 1 and those on the right decrease from -1; pre represents the intersection or lane connected to the lane start point; suc is defined as the intersection or lane connected to the lane end point; p_c, p_l and p_r respectively represent the point sets of the lane center line, the left lane line and the right lane line; q contains the attributes of the lane, including the width, the speed limit, and the types of the left and right lane lines;

the point sets p_l and p_r are sequences of discretized boundary points:

p_l = {p_l^1, p_l^2, …, p_l^n}   (2)

p_r = {p_r^1, p_r^2, …, p_r^m}   (3)
step 2, fusion of GPS, wheel odometry and chassis signals:

(1) GPS, wheel odometry and chassis signals are fused based on the extended Kalman filter (EKF), and the state variables are corrected using the wheel-odometry measurements; the state vector is expressed as:

x = [p_x, p_y, θ, ω]^T   (4)

where p_x and p_y are the horizontal and vertical coordinates of the vehicle in the ENU coordinate system, θ is the heading angle of the vehicle in the ENU coordinate system, and ω is the yaw rate of the vehicle;

the state transition equation is:

x' = Fx + Bu + λ   (5)

where the control input Bu is 0 and λ is the process noise;

the invention uses a two-wheel odometer model, as shown in fig. 3; with the vehicle sideslip angle denoted β and the sampling period T, the state transition equations are:

p_{x,k} = p_{x,k-1} + v_{k-1} T cos(θ_{k-1} + β)   (6)

p_{y,k} = p_{y,k-1} + v_{k-1} T sin(θ_{k-1} + β)   (7)

θ_k = θ_{k-1} + ω_{k-1} T   (8)

ω_k = ω_{k-1}   (9)

under normal driving and good weather conditions, wheel slip is negligible and the speed is calculated as:

v = (n_RL·c_RL + n_RR·c_RR)/2   (10)

where n_RL and n_RR are the revolutions per second of the left and right rear wheels, and c_RL and c_RR are the actual left and right wheel circumferences; the yaw rate ω is provided by the chassis ESC;

the state transition matrix F (the Jacobian of the model above) is expressed as:

F = [ 1  0  -v_{k-1} T sin(θ_{k-1} + β)  0
      0  1   v_{k-1} T cos(θ_{k-1} + β)  0
      0  0   1                           T
      0  0   0                           1 ]   (11)

when the GPS signal is good, the observation matrix established for the GPS measurement is:

H_GPS = [ 1  0  0  0
          0  1  0  0 ]   (12)

the yaw-rate observation established from the wheel-odometry signals is:

ω_odo = (n_RR·c_RR - n_RL·c_RL)/t_R   (13)

where t_R is the rear track width (the distance between the rear wheels); the corresponding observation matrix is:

H = (0 0 0 1)   (14);
(2) The StandStill signal of the vehicle chassis is mined to apply kinematic constraints, including zero-velocity update and zero-yaw-rate constraint; StandStill = 0 indicates that the vehicle is moving, and StandStill = 1 indicates that it is stationary.

When StandStill = 1 the vehicle is stationary, so the vehicle speed and yaw rate are both 0, which gives:

p_{x,k} = p_{x,k-1}   (15)

p_{y,k} = p_{y,k-1}   (16)

θ_k = θ_{k-1}   (17)

ω_k = 0   (18)

in this case, F is:

F = [ 1  0  0  0
      0  1  0  0
      0  0  1  0
      0  0  0  0 ]   (19)
step 3, lane-change identification:
Typically, traffic regulations only allow a vehicle to move to an adjacent lane in a single lane change, so the invention does not consider crossing multiple lanes at once.
The invention provides a robust lane-change identification method. As shown in fig. 4, lane changes are identified by establishing multiple criteria: when lane lines can be identified with good quality, lane changes are judged directly by the C_0 criterion; when lane lines can be identified but their quality is poor, or lane lines cannot be identified, a comprehensive judgment is made using the yaw-rate, heading-angle and lane-change-duration criteria, ensuring that lane changes can still be identified when the lane lines are occluded or worn;

The C_0 criterion is as follows:

Assuming the origin of the camera coordinate system is calibrated at the center of the vehicle rear axle, C_0 represents the line-side distance; the distances from the actual rear-axle center of the vehicle to the left and right lane lines are C_0^l and C_0^r respectively, where the left lane-line value C_0^l is greater than 0 and the right lane-line value C_0^r is less than 0.

Observation and analysis of real-vehicle experiments show that, whether the vehicle crosses the lane line to the left or to the right, C_0^l gradually decreases toward 0 and C_0^r gradually increases toward 0. Let the left and right lane-line values at time t be C_0^l(t) and C_0^r(t); the judgment threshold ξ takes the value 0.3 m, and the lane-change judgment condition is:

|C_0^l(t)| < ξ  or  |C_0^r(t)| < ξ   (20)

After the lane-change state is recognized, detection of the lane-change end state starts; the threshold ΔC_0 on the change of C_0 takes the value 0.1, and when condition (21) is satisfied the lane change is judged to be finished:

|C_0^l(t) - C_0^l(t-1)| > ΔC_0  or  |C_0^r(t) - C_0^r(t-1)| > ΔC_0   (21)
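Under the reconstruction of condition (21) above (the original equation is an image in the source), the end-of-lane-change test reduces to a jump detector on the line-side distances:

```python
def lane_change_finished(c0_prev, c0_now, d_c0=0.1):
    """End-of-lane-change test per the reconstruction of condition (21):
    c0_prev and c0_now are (left, right) line-side distances at t-1 and t;
    a jump larger than d_c0 means the camera has re-anchored to the new lane's lines."""
    return (abs(c0_now[0] - c0_prev[0]) > d_c0) or (abs(c0_now[1] - c0_prev[1]) > d_c0)
```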
When the C_0 criterion cannot be applied, a comprehensive judgment is made using the yaw-rate, heading-angle and lane-change-duration criteria;

The yaw-rate criterion is as follows:

This criterion uses the change in yaw rate to identify whether the vehicle is changing lanes. A real-vehicle lane-change experiment was carried out on a lane running east, so that the X coordinate lies along the lane direction and the Y coordinate along the lateral direction; the change of the Y coordinate therefore shows the lateral displacement of the vehicle relative to the lane. FIG. 5 depicts the Y-axis trajectory and the yaw rate during the left and right lane-change processes: for a left lane change the yaw rate shows a peak first and then a trough; for a right lane change, a trough first and then a peak. A lane change can thus be identified by observing the change in yaw rate over a predefined window of length M.

The heading-angle criterion is as follows:

If the yaw-rate criterion is satisfied, the vehicle may be changing lanes, but may also be driving on a curve; using a sliding window slightly longer than the window length M, the start-to-end heading change over the detection period is δ = |θ_end - θ_start|; if δ ≈ 0, the vehicle changed lanes on a straight road; otherwise it is driving along a curve;

The lane-change-duration criterion is as follows:

Let the duration of a complete lane change be T_LC; if the time from the yaw-rate peak to the trough exceeds T_LC, the vehicle is judged to be driving along a curve rather than changing lanes.
Step 4, map matching:
step 4.1, road-intersection matching:
During the running process of the vehicle, three position states are distinguished: inside an intersection, on a road, and leaving an intersection to enter a road. When the vehicle is on a road, the in-road lane matching method determines the lane; when the vehicle leaves an intersection and enters a road, the camera-based auxiliary positioning method is started promptly;
as shown in fig. 6, the flow of road-intersection matching is as follows:
S01, start;
S02, perform point-to-point matching and take the closest map point as the matching point;
S03, judge whether the matching point lies inside an intersection; if so, go to S04, otherwise go to S05;
S04, if the current camera identifies no lane line, judge that the vehicle is inside the intersection and go to S08; otherwise go to S05;
S05, judge whether the vehicle was at an intersection at the previous moment; if so, go to S06; otherwise perform in-road lane matching and go to S08;
S06, judge whether the road of the current matching point is a successor road of the intersection; if so, go to S07, otherwise go to S08;
S07, judge whether the current camera can identify lane lines; if so, judge that the vehicle is leaving the intersection and entering the road; in either case go to S08;
S08, matching is finished.
Step 4.2, lane matching in the road:
When the vehicle drives within a road and changes lanes close to a lane line, positioning errors can make the matching result jump back and forth between the lane and the adjacent lane. Combining the proposed lane-change identification method, the invention provides an improved multi-index weighted-evaluation map-matching method that improves the accuracy and robustness of in-road lane matching.
When the vehicle is identified to run in the road, firstly, taking the current position of the vehicle as a center, searching candidate lanes in a certain range, and then carrying out comprehensive evaluation on the candidate lanes by utilizing a plurality of evaluation functions; calculating the comprehensive score of each candidate lane, and selecting the lane with the highest comprehensive score as a matching result;
the plurality of evaluation functions include:
(1) For distance-based evaluation, the evaluation function is set as:
Figure BDA0003622655330000171
wherein d is the distance from the positioning coordinate to the center line of the candidate lane;
(2) Evaluation based on topological relation: setting an evaluation function as F (link), if the current matching lane and the last matching lane have a topological connection relation or are the same lane, then F (link) =1, otherwise F (link) =0;
(3) Evaluation based on heading: setting the course angle of the driving direction of the center line of the current lane as theta 1 The heading angle of the vehicle is theta 2 Considering that the course angle of the vehicle and the direction of the center line of the curve are changed and the vehicle is not necessarily level with the lane when running, the method is constructedThe evaluation function of (a) is:
Figure BDA0003622655330000172
(4) Evaluation based on lane line type: setting the total number of the lane lines identified by the camera as M, comparing the lane line type identified by the camera with the lane line type at the corresponding position on the map, recording the number of the lane lines with the same type as N, and weakening the evaluation based on the lane line type by using a coefficient in consideration of the positioning uncertainty and the possible change of the lane line type in the same lane, wherein the evaluation function is as follows:
Figure BDA0003622655330000173
(5) Evaluation based on lane change recognition: setting the evaluation function as F (lc), the lane l of the above time cur And left and right adjacent lanes left And l right Three lanes are taken as examples, and when lane change is detected:
Figure BDA0003622655330000174
the comprehensive evaluation is as follows:
F sum =F(link)gF(θ)g(F(d)+F(lanetype)+F(lc)) (26)。
and calculating the comprehensive score of each candidate lane, and selecting the lane with the highest comprehensive score as a matching result.
Step 4.3, determining left and right boundary points:
In addition to matching the vehicle position to the correct lane on the map, the left and right boundary points must be determined so that the lane lines can be used effectively for auxiliary positioning. As shown in fig. 7, the method for determining the left and right lane boundary points is as follows:

When the vehicle leaves an intersection and enters a road, the start point of the current lane is taken as the driving start point of the vehicle, and the initial left and right boundary points are determined from it. Lane-change identification runs while the vehicle drives: if no lane change occurs, the driving distance is calculated from the wheel revolutions of the wheel odometer, compared with the spacing of the discrete lane-line points, and the boundary points are updated from the initial boundary points; when a lane change occurs, the lane is re-matched with the in-road lane matching method, the driving start point and the initial left and right boundary points are re-determined on the new lane, and the left and right boundary points are then updated, as the vehicle continues along the lane, in the same way as when no lane change occurs;
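A sketch of the boundary-point update of step 4.3, assuming the lane-line point sets of formulas (2)-(3) and a traveled distance obtained from the wheel odometer (function and argument names are illustrative):

```python
import math

def update_boundary_index(points, idx, traveled):
    """Advance the boundary-point index along one lane line (step 4.3).

    points: discretized lane-line points p^1..p^n from formulas (2)-(3);
    idx: current lower boundary-point index; traveled: distance driven since
    idx was last set, from the wheel odometer. Returns i such that the vehicle
    lies between points[i] and points[i+1]."""
    def seg_len(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    while idx < len(points) - 2 and traveled > seg_len(points[idx], points[idx + 1]):
        traveled -= seg_len(points[idx], points[idx + 1])   # this segment has been passed
        idx += 1
    return idx
```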
step 5, camera-based auxiliary positioning:
as shown in fig. 8, the camera-based auxiliary positioning is performed on the basis of obtaining the lane where the vehicle is located and the left and right boundary points;
The lane line model is established as:

y = C_3·x³ + C_2·x² + C_1·x + C_0   (27)

where C_0, C_1, C_2 and C_3 represent the line-side distance, slope, curvature and curvature derivative respectively;

Let the vehicle rear-axle center p_M(x_M, y_M) be the origin of the vehicle coordinate system, with the horizontal axis pointing to the front of the vehicle and the vertical axis pointing to its left; the origin of the camera coordinate system R_C, point C(x_C, y_C), has the same axis directions as the vehicle coordinate system and is calibrated at the rear-axle center p_M(x_M, y_M). The left boundary points of the vehicle are p_l^i and p_l^{i+1}, and the right boundary points are p_r^j and p_r^{j+1}. The straight line through the rear-axle center p_M(x_M, y_M) perpendicular to the heading direction θ intersects the left and right boundary lines at p_L(x_L, y_L) and p_R(x_R, y_R) respectively; the distances from the vehicle positioning coordinate to the left and right boundary lines are d_l and d_r. Let the actual rear-axle center position be p_real; the line-side distances of the vehicle relative to the left and right boundary lines measured by the camera are C_0^l and C_0^r respectively.

Starting from the rear-axle center p_M, the equation of the straight line perpendicular to the heading direction θ is:

cos θ·(x - x_M) + sin θ·(y - y_M) = 0   (28)

After the intersection coordinates p_L and p_R are found, d_l and d_r are expressed as:

d_l = √((x_L - x_M)² + (y_L - y_M)²)   (29)

d_r = √((x_R - x_M)² + (y_R - y_M)²)   (30)

The rear-axle center coordinate positioned by the left boundary line is (x_M^L, y_M^L):

(x_M^L, y_M^L) = (x_L + |C_0^l|·sin θ, y_L - |C_0^l|·cos θ)   (31)

The rear-axle center coordinate positioned by the right boundary line is (x_M^R, y_M^R):

(x_M^R, y_M^R) = (x_R - |C_0^r|·sin θ, y_R + |C_0^r|·cos θ)   (32)

The average of the left and right boundary-line fixes is used as the final positioning coordinate (x_M', y_M'); considering that the line-side distance carries a certain error that is close to a constant value Δd_cam, the measured line-side distances are compensated by Δd_cam before the fixes are averaged, giving the final positioning coordinate of formula (33):

(x_M', y_M') = ((x_M^L + x_M^R)/2, (y_M^L + y_M^R)/2)   (33)
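Finding p_L (or p_R) in eq. (28) reduces to intersecting the perpendicular through p_M with the segment between the current boundary points; a sketch, assuming the boundary is locally straight between its discrete points (function and argument names are illustrative):

```python
import numpy as np

def perpendicular_intersection(p_M, theta, a, b):
    """Intersect the line through p_M perpendicular to the heading θ (eq. 28)
    with the boundary segment a -> b (e.g. the left boundary points p_l^i, p_l^{i+1})."""
    d = np.array([-np.sin(theta), np.cos(theta)])      # direction of the perpendicular line
    e = np.asarray(b, float) - np.asarray(a, float)    # boundary-segment direction
    # solve p_M + t*d = a + s*e for (t, s); |t| is the distance d_l or d_r of eqs. (29)-(30)
    t, s = np.linalg.solve(np.column_stack([d, -e]), np.asarray(a, float) - np.asarray(p_M, float))
    return np.asarray(p_M, float) + t * d
```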
Experimental verification:

A realistic virtual driving scene is constructed with SCANeR studio. Since the deviation of the camera C_0 observations is stable and small in practice, the mean deviation is set to 0.2 m and its standard deviation to 0.05 m, and Δd_cam is set to 0.2 m. A driving scene in which four roads meet at a crossroad is set up and, with reference to the sensors fitted to an actual vehicle, the simulation parameters are set according to Table 1; the subsequent simulation experiments use the same settings.
TABLE 1 parameter settings in simulation
On each of the two roads the vehicle changes to the adjacent lane and then back to the original lane, and between them it turns right through the intersection; the first road runs east-west and the second north-south. We simultaneously run the method of the literature "Road-Centered Map-Aided Localization for Driverless Cars Using Single-Frequency GNSS Receivers", ensure that the sensor inputs and vehicle state signals received by the two methods are exactly the same, and then compare the positioning results output by the two methods.
Fig. 9 compares the lateral error relative to the lane when the two methods are driven over the two roads. The lateral error of the literature method fluctuates greatly and exceeds 1 m during lane changes; by contrast, the error of the method of the invention is small and close to 0 most of the time, with occasional fluctuations whose maximum lateral error approaches 1 m. Compared with the literature, the method of the invention therefore achieves higher lateral positioning accuracy.
Table 2 compares the positioning results of the method proposed in the literature and the method of the invention.
As can be seen from Table 2, compared with the method proposed in the literature, the method of the invention has a smaller mean and standard deviation in all three indices (lateral accuracy relative to the lane, overall accuracy, and heading-angle accuracy); that is, the positioning accuracy improves in all three respects and is more stable, better meeting the requirements of unmanned driving on a positioning system.
TABLE 2 comparison of positioning results

Claims (4)

1. A low-cost high-precision positioning method based on multi-sensor fusion, characterized in that the method comprises the following steps:
step 1, establishing a map model:
establishing a lightweight lane-level map model;
the single lane l of the lightweight lane-level map model is represented as:
l = {id, pre, suc, p_c, p_l, p_r, q}   (1)
where id represents the serial number of lane l: starting from the road center line, the numbers on the left increase from 1 and those on the right decrease from -1; pre represents the intersection or lane connected to the lane start point; suc is defined as the intersection or lane connected to the lane end point; p_c, p_l and p_r respectively represent the point sets of the lane center line, the left lane line and the right lane line; q contains the attributes of the lane, including the width, the speed limit, and the types of the left and right lane lines;
the point sets p_l and p_r are sequences of discretized boundary points:
p_l = {p_l^1, p_l^2, …, p_l^n}   (2)
p_r = {p_r^1, p_r^2, …, p_r^m}   (3)
step 2, fusion of GPS, wheel odometry and chassis signals:
(1) Fuse GPS, wheel odometry and chassis signals with an extended Kalman filter (EKF), and correct the state variables using the wheel-odometry measurements;
(2) Mine the StandStill signal of the vehicle chassis to apply kinematic constraints, including zero-velocity update and zero-yaw-rate constraint;
step 3, lane change identification:
a robust lane change identification method is adopted, in which lane changes are identified by establishing multiple criteria; when a lane line can be identified and its quality is good, whether a lane change occurs is judged directly according to the C_0 judgment criterion; when the lane line can be identified but its quality is poor, or when the lane line cannot be identified, a comprehensive judgment is made using multiple criteria of yaw rate, heading angle and lane change duration; the C_0 judgment criterion is as follows:
assuming that the origin of the camera coordinate system is calibrated at the center of the rear axle of the vehicle, C_0 represents the lateral distance from the actual rear-axle center position of the vehicle to the left or right lane line, denoted C_0^left and C_0^right respectively; the C_0 value of the left lane line, C_0^left, is greater than 0, and the C_0 value of the right lane line, C_0^right, is less than 0;
when changing lanes across a lane line to the left or to the right, C_0^left gradually decreases toward 0 or C_0^right gradually increases toward 0; the C_0 values of the left and right lane lines at time t are C_0^left(t) and C_0^right(t), and ξ is the judgment variable; the lane change judgment condition is given by formula (20);
after the lane change state is identified, detection of the lane change end state is started; a threshold on the amount of change of C_0 is set, and when condition (21) is met, the lane change is judged to be finished;
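As a minimal sketch of the C_0 criterion, the following assumes that crossing a lane line shows up as a sign change or near-zero value of C_0 between consecutive camera frames, and that the end of the lane change is declared once C_0 stops changing rapidly; both tests and both thresholds are illustrative stand-ins for formulas (20) and (21), which are not reproduced in the text.

```python
def detect_lane_change(c0_left_prev, c0_left, c0_right_prev, c0_right,
                       eps=0.15):
    """Judgment variable xi: True when either lane line is being crossed.

    Assumption: a crossing shows up as a sign change or a near-zero
    value of C0 between consecutive frames (stand-in for formula (20)).
    """
    crossed_left = c0_left_prev * c0_left <= 0.0 or abs(c0_left) < eps
    crossed_right = c0_right_prev * c0_right <= 0.0 or abs(c0_right) < eps
    return crossed_left or crossed_right

def lane_change_finished(c0_left_prev, c0_left, delta_th=0.05):
    """Stand-in for condition (21): the frame-to-frame change of C0
    has settled below the change-amount threshold."""
    return abs(c0_left - c0_left_prev) < delta_th
```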
When not satisfying C 0 When judging the criterion, comprehensively judging by using the judging criteria of the yaw angular speed, the course angle and the lane-changing duration;
the criterion of the yaw rate is as follows:
identifying whether the vehicle is changing lanes using the change in yaw rate, identifying a lane change by observing the change in yaw rate over a predefined window length M: in the left lane changing process and the right lane changing process, the change curve of the yaw angular velocity goes through a wave crest and a wave trough, and for the left lane changing, the wave crest has the wave trough; the right lane change is that the wave trough is firstly provided with the wave crest;
the course angle judgment criterion is as follows:
if the yaw rate judgment criterion is met, lane change may occur, and driving on a curve may also occur; the change δ = | θ = course angle start and end in the detection period using a sliding window slightly longer than the window length M endstart If delta is approximately equal to 0, the vehicle changes the lane on the straight lane; otherwise, driving along a curve;
the criterion for judging lane-changing duration is as follows:
setting the complete lane-changing duration as T LC If the time of the yaw angular velocity from the peak to the trough is greater than T LC Judging that the vehicle runs along the curve instead of changing the road;
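The three criteria above can be combined as in the following sketch; the window handling, the threshold values and the return convention are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def classify_maneuver(yaw_rates, headings, dt, peak_th=0.03,
                      heading_eps=0.02, T_LC=6.0):
    """Combine the yaw-rate, heading-angle and duration criteria.

    yaw_rates / headings: samples over a sliding window slightly longer
    than the detection window M; dt: sample period in seconds.
    Returns 'left', 'right', 'curve' or None (no lane change detected).
    """
    w = np.asarray(yaw_rates, dtype=float)
    i_peak, i_trough = int(np.argmax(w)), int(np.argmin(w))
    # Yaw-rate criterion: need a clear peak-and-trough pattern
    if w[i_peak] < peak_th or w[i_trough] > -peak_th:
        return None
    # Duration criterion: peak-to-trough longer than T_LC means a curve
    if abs(i_peak - i_trough) * dt > T_LC:
        return 'curve'
    # Heading criterion: delta = |theta_end - theta_start| stays near 0
    # for a lane change on a straight road
    if abs(headings[-1] - headings[0]) > heading_eps:
        return 'curve'
    # Order of the extrema: peak first -> left, trough first -> right
    return 'left' if i_peak < i_trough else 'right'
```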
step 4, map matching:
step 4.1, road-intersection matching:
during driving, three position states of the vehicle are distinguished: at an intersection, on a road, and leaving an intersection to enter a road; when the vehicle is on a road, the lane is determined using the in-road lane matching method; when the vehicle leaves an intersection and enters a road, the camera-assisted positioning method is started promptly;
step 4.2, in-road lane matching:
when the vehicle is identified as driving on a road, candidate lanes are first searched within a certain range centered on the current vehicle position, and the candidate lanes are then evaluated comprehensively using several evaluation functions; the comprehensive score of each candidate lane is calculated, and the lane with the highest comprehensive score is selected as the matching result; the evaluation functions include:
(1) distance-based evaluation: the evaluation function F(d) is given by formula (22), where d is the distance from the positioning coordinate to the center line of the candidate lane;
(2) evaluation based on topological relation: the evaluation function is F(link); if the currently matched lane and the previously matched lane have a topological connection relation or are the same lane, F(link) = 1; otherwise F(link) = 0;
(3) heading-based evaluation: the heading angle of the center line of the candidate lane is θ_1 and the heading angle of the vehicle is θ_2; the evaluation function F(θ) constructed from them is given by formula (23);
(4) evaluation based on lane line type: the total number of lane lines identified by the camera is M; the lane line types identified by the camera are compared with the lane line types at the corresponding positions on the map, and the number of lane lines of the same type is recorded as N; considering the positioning uncertainty and possible changes of the lane line type along the same lane, the lane-line-type evaluation is weakened by a coefficient, and the evaluation function F(lanetype) is given by formula (24);
(5) evaluation based on lane change recognition: the evaluation function is F(lc); taking the lane l_cur matched at the previous moment and its left and right adjacent lanes l_left and l_right as an example, when a lane change is detected, F(lc) is assigned according to formula (25);
the comprehensive evaluation is:
F_sum = F(link) · F(θ) · (F(d) + F(lanetype) + F(lc))   (26);
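A sketch of the comprehensive evaluation of formula (26) follows; since formulas (22) to (25) are given as figures in the original, the concrete forms of F(d), F(θ) and F(lanetype) below, and the treatment of F(lc) as a pre-computed bonus term, are assumptions.

```python
import math

def f_link(cur_id, prev_id, connected):
    # F(link): 1 if the same lane or topologically connected, else 0
    return 1.0 if cur_id == prev_id or connected(prev_id, cur_id) else 0.0

def f_dist(d, scale=1.0):
    # Assumed form of F(d): decays with distance to the candidate center line
    return 1.0 / (1.0 + d / scale)

def f_heading(theta_lane, theta_vehicle):
    # Assumed form of F(theta): 1 when aligned, 0 at/beyond perpendicular
    return max(0.0, math.cos(theta_lane - theta_vehicle))

def f_lane_type(n_same, m_total, k=0.5):
    # Assumed form of F(lanetype): matched fraction N/M, weakened by k
    return k * n_same / m_total if m_total else 0.0

def comprehensive_score(cand, prev_id, connected, vehicle_theta, f_lc=0.0):
    # F_sum = F(link) * F(theta) * (F(d) + F(lanetype) + F(lc))  -- formula (26)
    return (f_link(cand['id'], prev_id, connected)
            * f_heading(cand['theta'], vehicle_theta)
            * (f_dist(cand['d']) + f_lane_type(cand['n'], cand['m']) + f_lc))
```

Note how the multiplicative factors F(link) and F(θ) act as hard gates: a candidate lane that is topologically unreachable or points the wrong way scores zero regardless of its distance score.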
step 4.3, determining the left and right boundary points:
when the vehicle leaves an intersection and enters a road, the start point of the current lane is taken as the driving start point of the vehicle, and the initial left and right boundary points are determined from this start point; lane change identification is performed while the vehicle is driving; if no lane change occurs, the driving distance is calculated from the wheel revolutions of the wheel odometer and compared with the distances between the discrete points of the lane line, and the boundary points are updated starting from the initial boundary points; when a lane change occurs, the lane is re-matched according to the in-road lane matching method, the driving start point and the initial left and right boundary points are then re-determined on the new lane, and as the vehicle continues along the lane the left and right boundary points are updated in the same way as when no lane change occurs;
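The boundary-point bookkeeping of step 4.3 can be sketched as advancing an index along the discrete lane-line points by the odometer travel distance; the index representation and function name below are assumptions for illustration.

```python
import math

def update_boundary_index(points, idx, traveled):
    """Advance a boundary-point index along one lane line.

    points: the discrete points of p_l or p_r from the map model;
    idx: index of the current boundary point; traveled: distance from
    the wheel odometer since the last update. A simplified sketch of
    the step 4.3 bookkeeping.
    """
    while idx + 1 < len(points) and traveled > 0.0:
        (x0, y0), (x1, y1) = points[idx], points[idx + 1]
        seg = math.hypot(x1 - x0, y1 - y0)
        if traveled < seg:
            break  # the vehicle is still between points idx and idx + 1
        traveled -= seg
        idx += 1
    return idx
```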
step 5, camera-based auxiliary positioning:
camera-based auxiliary positioning is performed on the basis of the obtained lane and left and right boundary points; the intersection points of the line passing through the center of the vehicle's rear axle and perpendicular to the heading angle θ with the left and right boundary lines are calculated, together with the distances from the vehicle to the left and right boundary lines; the final positioning coordinate is determined from the rear-axle center coordinate obtained by positioning with the left boundary line and the rear-axle center coordinate obtained by positioning with the right boundary line.
2. The low-cost high-precision positioning method based on multi-sensor fusion according to claim 1, characterized in that: in the process of correcting the state variable in step 2 (1), the state variable is expressed as:
x = [p_x, p_y, θ, ω]^T   (4)
wherein p_x and p_y are the horizontal and vertical coordinates of the vehicle in the ENU coordinate system, θ is the heading angle of the vehicle in the ENU coordinate system, and ω is the yaw rate of the vehicle;
the state transition equation is:
x' = Fx + Bu + λ   (5)
where the control input Bu is 0 and λ is the process noise;
with the mass-center slip angle of the vehicle set as β, the state transition equations are:
p_x,k = p_x,k-1 + v T cos(θ_k-1 + β)   (6)
p_y,k = p_y,k-1 + v T sin(θ_k-1 + β)   (7)
θ_k = θ_k-1 + ω_k-1 T   (8)
ω_k = ω_k-1   (9)
under normal driving and good weather conditions, wheel slip is negligible, and the speed is calculated as follows:
v = (n_RL c_RL + n_RR c_RR) / 2   (10)
wherein n_RL and n_RR are the revolutions per second of the rear-left and rear-right wheels, and c_RL and c_RR are the actual circumferences of the left and right wheels; the yaw rate ω is provided by the chassis ESC;
the state transition matrix F, taken as the Jacobian of equations (6)-(9), is represented as:
F = [ 1  0  -v T sin(θ+β)  0
      0  1   v T cos(θ+β)  0
      0  0   1             T
      0  0   0             1 ]   (11)
when the GPS signal is good, the observation matrix established for the GPS measurements is given by formula (12);
the yaw rate observation established from the wheel odometer signals is:
ω_odo = (n_RR c_RR − n_RL c_RL) / t_R   (13)
wherein t_R is the track width of the rear axle; the corresponding observation matrix is:
H = (0 0 0 1)   (14);
in step 2 (2), when StandStill is 0, the vehicle is in a driving state;
when StandStill is 1, the vehicle is in a stationary state and the vehicle speed and the yaw rate are both 0, so that:
p_x,k = p_x,k-1   (15)
p_y,k = p_y,k-1   (16)
θ_k = θ_k-1   (17)
ω_k = 0   (18)
in this case, F is:
F = [ 1 0 0 0
      0 1 0 0
      0 0 1 0
      0 0 0 0 ]   (19)
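A minimal sketch of the prediction and wheel-odometer correction described in this claim follows; the noise values, the slip-angle input and the exact form of the Jacobian are assumptions consistent with formulas (4) to (19), not a definitive implementation.

```python
import numpy as np

def wheel_speed(n_rl, n_rr, c_rl, c_rr):
    # Formula (10): speed from rear-wheel revolutions per second
    # and actual wheel circumferences
    return 0.5 * (n_rl * c_rl + n_rr * c_rr)

def predict(x, P, v, beta, T, Q, standstill=False):
    """EKF prediction for x = [px, py, theta, omega]^T.

    When StandStill == 1, formulas (15)-(18) freeze the pose and zero
    the yaw rate, which corresponds to F = diag(1, 1, 1, 0), formula (19).
    """
    px, py, th, om = x
    if standstill:
        F = np.diag([1.0, 1.0, 1.0, 0.0])
        x_new = np.array([px, py, th, 0.0])
    else:
        # Formulas (6)-(9) with mass-center slip angle beta (assumed form)
        x_new = np.array([px + v * T * np.cos(th + beta),
                          py + v * T * np.sin(th + beta),
                          th + om * T,
                          om])
        # Jacobian of the transition with respect to the state
        F = np.array([[1.0, 0.0, -v * T * np.sin(th + beta), 0.0],
                      [0.0, 1.0,  v * T * np.cos(th + beta), 0.0],
                      [0.0, 0.0, 1.0, T],
                      [0.0, 0.0, 0.0, 1.0]])
    return x_new, F @ P @ F.T + Q

def update_yaw_rate(x, P, n_rl, n_rr, c_rl, c_rr, t_r, r_var=1e-3):
    """Yaw-rate correction from the wheel odometer, H = (0 0 0 1)."""
    z = (n_rr * c_rr - n_rl * c_rl) / t_r  # assumed form of formula (13)
    H = np.array([[0.0, 0.0, 0.0, 1.0]])
    y = z - x[3]                  # innovation
    S = H @ P @ H.T + r_var       # innovation covariance
    K = (P @ H.T) / S             # Kalman gain
    x_new = x + (K * y).ravel()
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new
```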
3. The low-cost high-precision positioning method based on multi-sensor fusion according to claim 1, characterized in that: in step 4.1, the road-intersection matching process is as follows:
S01, start;
S02, point-to-point matching is performed, and the closest point is found as the matching point;
S03, it is judged whether the matching point is inside an intersection at this moment; if so, proceed to S04, otherwise proceed to S05;
S04, if the current camera does not identify a lane line, it is judged that the vehicle is inside the intersection at this moment, and the process proceeds to S08; otherwise it proceeds to S05;
S05, it is judged whether the vehicle was at an intersection at the previous moment; if so, proceed to S06; otherwise, perform in-road lane matching and proceed to S08;
S06, it is judged whether the road where the current matching point lies is a successor road of the intersection; if so, proceed to S07, otherwise proceed to S08;
S07, it is judged whether the current camera can identify a lane line; if so, it is judged that the vehicle is leaving the intersection and entering the road; in either case the process proceeds to S08;
S08, the matching ends.
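The S01-S08 flow condenses to a small classifier over the three position states; in the sketch below, the argument names and the returned state labels are illustrative, not taken from the patent.

```python
def road_intersection_match(match_point_in_intersection,
                            lane_line_visible,
                            was_in_intersection,
                            is_successor_road):
    """Classify the vehicle position state following the S01-S08 flow.

    Returns 'in_intersection', 'entering_road' (leaving an intersection,
    which triggers camera-assisted positioning) or 'on_road' (which
    triggers in-road lane matching).
    """
    # S03/S04: matching point inside an intersection and no lane line seen
    if match_point_in_intersection and not lane_line_visible:
        return 'in_intersection'
    # S05-S07: previously at an intersection, now on a successor road
    # with a lane line visible
    if was_in_intersection and is_successor_road and lane_line_visible:
        return 'entering_road'
    # Otherwise: on the road, handled by in-road lane matching
    return 'on_road'
```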
4. The low-cost high-precision positioning method based on multi-sensor fusion according to claim 1, characterized in that: in the camera-based auxiliary positioning, the lane line model is:
y = C_3 x^3 + C_2 x^2 + C_1 x + C_0   (27)
wherein C_0, C_1, C_2 and C_3 respectively represent the line side distance, the slope, the curvature and the curvature derivative;
the rear axle center of the vehicle is p_M(x_M, y_M), taken as the origin of the vehicle coordinate system, with the horizontal axis pointing to the front of the vehicle and the vertical axis pointing to the left side of the vehicle; the origin of the camera coordinate system R_C is the point C(x_C, y_C), with the same horizontal- and vertical-axis directions as the vehicle coordinate system, calibrated at the rear axle center p_M(x_M, y_M); the left boundary points of the vehicle are the two adjacent discrete points of the left boundary line between which the vehicle lies, and the right boundary points are defined likewise on the right boundary line; the intersection points of the line passing through the rear axle center p_M(x_M, y_M) and perpendicular to the direction of the heading angle θ with the left and right boundary lines are p_L(x_L, y_L) and p_R(x_R, y_R) respectively; the distances from the vehicle positioning coordinate to the left and right boundary lines are d_l and d_r; the actual rear-axle center position of the vehicle is p_real, and the line side distances of the vehicle relative to the left and right boundaries measured by the camera are C_0^left and C_0^right;
starting from the rear axle center p_M, the equation of the line perpendicular to the direction of the heading angle θ is expressed as:
cos θ · (x − x_M) + sin θ · (y − y_M) = 0   (28)
after the intersection point coordinates p_L and p_R are found, d_l and d_r are expressed as:
d_l = √((x_L − x_M)^2 + (y_L − y_M)^2)   (29)
d_r = √((x_R − x_M)^2 + (y_R − y_M)^2)   (30)
the rear-axle center coordinate of the vehicle after positioning by the left boundary line, (x_M^L, y_M^L), is given by formula (31);
the rear-axle center coordinate of the vehicle after positioning by the right boundary line, (x_M^R, y_M^R), is given by formula (32);
the average of the positioning results from the left and right lane lines is used as the final positioning coordinate (x_M', y_M');
considering that the line side distance has a certain error, which is close to a constant value and is set as Δd_cam, the final positioning coordinate is as shown in formula (33).
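The geometry of this claim can be sketched as follows: intersect the lateral line of formula (28) with a boundary segment, then shift each intersection point back toward the vehicle by the camera-measured line side distance and average the two results. Since formulas (31)-(33) are given as figures, the sign convention and the use of Δd_cam below are assumptions.

```python
import math

def perp_intersection(p_m, theta, a, b):
    """Intersect the line through p_m perpendicular to heading theta
    with the boundary segment a-b; returns the point or None."""
    dx, dy = -math.sin(theta), math.cos(theta)   # lateral (leftward) direction
    ex, ey = b[0] - a[0], b[1] - a[1]
    den = dx * ey - dy * ex
    if abs(den) < 1e-9:
        return None                               # parallel: no intersection
    t = ((a[0] - p_m[0]) * ey - (a[1] - p_m[1]) * ex) / den
    return (p_m[0] + t * dx, p_m[1] + t * dy)

def locate(p_m, theta, p_L, p_R, c0_left, c0_right, delta_d_cam=0.0):
    """Relocate the rear-axle center from both boundary lines.

    Shifts each intersection point back toward the vehicle by the
    camera-measured line side distance (c0_left > 0, c0_right < 0),
    then averages the two results; the handling of delta_d_cam is an
    assumed reading of formula (33).
    """
    dx, dy = -math.sin(theta), math.cos(theta)
    # From the left boundary: move rightward by c0_left plus the bias
    xl = p_L[0] - (c0_left + delta_d_cam) * dx
    yl = p_L[1] - (c0_left + delta_d_cam) * dy
    # From the right boundary: move leftward by |c0_right| minus the bias
    xr = p_R[0] - (c0_right - delta_d_cam) * dx
    yr = p_R[1] - (c0_right - delta_d_cam) * dy
    return ((xl + xr) / 2.0, (yl + yr) / 2.0)
```

In use, p_L and p_R would come from perp_intersection applied to the segments between the current left and right boundary points maintained in step 4.3.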
CN202210471531.3A 2022-04-28 2022-04-28 Low-cost high-precision positioning method based on multi-sensor fusion Active CN114889606B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210471531.3A CN114889606B (en) 2022-04-28 2022-04-28 Low-cost high-precision positioning method based on multi-sensor fusion


Publications (2)

Publication Number Publication Date
CN114889606A CN114889606A (en) 2022-08-12
CN114889606B true CN114889606B (en) 2023-04-07

Family

ID=82720181

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210471531.3A Active CN114889606B (en) 2022-04-28 2022-04-28 Low-cost high-precision positioning method based on multi-sensor fusion

Country Status (1)

Country Link
CN (1) CN114889606B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116559899B (en) * 2023-07-12 2023-10-03 蘑菇车联信息科技有限公司 Fusion positioning method and device for automatic driving vehicle and electronic equipment

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010191661A (en) * 2009-02-18 2010-09-02 Nissan Motor Co Ltd Traveling path recognition device, automobile, and traveling path recognition method
KR101225626B1 (en) * 2010-07-19 2013-01-24 포항공과대학교 산학협력단 Vehicle Line Recognition System and Method
KR102016549B1 (en) * 2014-01-13 2019-08-30 한화디펜스 주식회사 System and methof of detecting vehicle and lane position
CN108345019A (en) * 2018-04-20 2018-07-31 长安大学 The positioning device and method in a kind of vehicle place track
CN111089598B (en) * 2019-11-25 2021-08-06 首都师范大学 Vehicle-mounted lane-level real-time map matching method based on ICCIU
CN114396958B (en) * 2022-02-28 2023-08-18 重庆长安汽车股份有限公司 Lane positioning method and system based on multiple lanes and multiple sensors and vehicle

Also Published As

Publication number Publication date
CN114889606A (en) 2022-08-12

Similar Documents

Publication Publication Date Title
CN111033176B (en) Map information providing system
CN105937912B (en) The map data processing device of vehicle
CN110160542B (en) Method and device for positioning lane line, storage medium and electronic device
RU2735567C1 (en) Method for storing movement backgrounds, method for generating motion path model, method for estimating local position and storage device for storing movement backgrounds
CN101819042B (en) Navigation device and navigation program
CN102529975B (en) Systems and methods for precise sub-lane vehicle positioning
CN108628324B (en) Unmanned vehicle navigation method, device, equipment and storage medium based on vector map
CN101819043B (en) Navigation device and navigation method
CN108873038A (en) Autonomous parking localization method and positioning system
CN108645420B (en) Method for creating multipath map of automatic driving vehicle based on differential navigation
JP2001331787A (en) Road shape estimating device
CN112904395B (en) Mining vehicle positioning system and method
CN111076734B (en) High-precision map construction method for unstructured roads in closed area
Jung et al. Monocular visual-inertial-wheel odometry using low-grade IMU in urban areas
Shunsuke et al. GNSS/INS/on-board camera integration for vehicle self-localization in urban canyon
CN110431609A (en) Vehicle location estimating device
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
Asghar et al. Vehicle localization based on visual lane marking and topological map matching
CN114889606B (en) Low-cost high-precision positioning method based on multi-sensor fusion
CN114114367A (en) AGV outdoor positioning switching method, computer device and program product
JP2021092508A (en) Travel trajectory estimation method and travel trajectory estimation device
CN113034972A (en) Highway automatic lane changing method based on travelable area
Dean et al. Terrain-based road vehicle localization on multi-lane highways
CN111857121A (en) Patrol robot walking obstacle avoidance method and system based on inertial navigation and laser radar
CN113405555B (en) Automatic driving positioning sensing method, system and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant