CN111516673A - Lane line fusion system and method based on intelligent camera and high-precision map positioning - Google Patents


Info

Publication number
CN111516673A
Authority
CN
China
Prior art keywords
lane line
precision map
intelligent camera
output
lane
Prior art date
Legal status
Granted
Application number
CN202010360355.7A
Other languages
Chinese (zh)
Other versions
CN111516673B (en)
Inventor
汤兆丰
熊周兵
Current Assignee
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd filed Critical Chongqing Changan Automobile Co Ltd
Priority to CN202010360355.7A
Publication of CN111516673A
Application granted
Publication of CN111516673B
Legal status: Active
Anticipated expiration

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W 30/02: Control of vehicle driving stability
    • B60W 30/18: Propelling the vehicle
    • B60W 30/18009: Propelling the vehicle related to particular drive situations
    • B60W 30/18163: Lane change; Overtaking manoeuvres
    • B60W 60/00: Drive control systems specially adapted for autonomous road vehicles
    • B60W 60/001: Planning or execution of driving tasks
    • B60W 60/0015: Planning or execution of driving tasks specially adapted for safety
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588: Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/09: Arrangements for giving variable traffic instructions
    • G08G 1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G 1/09626: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
    • G08G 1/0967: Systems involving transmission of highway information, e.g. weather, speed limits
    • G08G 1/096708: Systems involving transmission of highway information, where the received information might be used to generate an automatic action on the vehicle control
    • G08G 1/096725: Systems involving transmission of highway information, where the received information generates an automatic action on the vehicle control

Abstract

The invention discloses a lane line fusion method based on an intelligent camera and high-precision map positioning, which comprises the following steps: preprocessing the lane line information output by the intelligent camera and by the high-precision map positioning controller; judging whether the preprocessed intelligent camera lane line is abnormal; if it is abnormal and the lane line output by the high-precision map locator is available, selecting the lane line output by the high-precision map locator to assist automatic driving control; if the intelligent camera lane line recovers and becomes available again, selecting the lane line output by the intelligent camera to assist automatic driving control; and tracking, verifying and processing the fused lane line. The invention also discloses a lane line fusion system based on the intelligent camera and high-precision map positioning. With the system and method, under complex driving conditions the strengths of the intelligent camera and the high-precision map locator complement each other redundantly, and the safety and reliability of automatic driving are improved.

Description

Lane line fusion system and method based on intelligent camera and high-precision map positioning
Technical Field
The invention relates to the technical field of sensor information fusion, in particular to a lane line fusion system and method based on an intelligent camera and high-precision map positioning.
Background
Lane line information is among the most important environmental perception information for realizing automatic driving of a car. Using lane line information, an automatically driven car can keep driving safely within its current lane. Accurate and smooth lane lines also allow the vehicle to run more stably, improving both ride experience and safety.
At present, an automatically driven car obtains lane line information mainly through an intelligent camera. The intelligent camera detects and identifies lane lines in the video image and outputs them to the automatic driving controller in real time. The camera offers strong real-time perception and accurate detection. However, it is susceptible to weather, lighting and similar conditions, has a limited field of view, and is easily occluded by obstacles. The performance of current video detection and recognition technology cannot be guaranteed under special working conditions such as curves, road shadows and complex road markings. Because of these characteristics of current cameras, the driving safety of an automatically driven car cannot be guaranteed under complex driving conditions.
Disclosure of Invention
The invention aims to provide a lane line fusion system and method based on an intelligent camera and high-precision map positioning, which let the strengths of the intelligent camera and the high-precision map locator complement each other redundantly under complex driving conditions and improve the safety and reliability of automatic driving.
In order to achieve the purpose, the invention provides a lane line fusion method based on an intelligent camera and high-precision map positioning, which comprises the following steps:
(S1) preprocessing the lane line information output by the intelligent camera, including: smoothing the cubic lane line curves output by the intelligent camera; completing lost lane lines; and judging the rationality of the lane lines and rejecting abnormal curves;
(S2) preprocessing the lane line information output by the high-precision map positioning controller, including: converting the longitude/latitude coordinates of the map lane line points into relative coordinate points in the vehicle body coordinate system using the vehicle's longitude/latitude position and heading, then fitting a cubic curve and smoothing it; and judging the rationality of the lane lines and rejecting abnormal curves;
(S3) judging whether the lane line output by the preprocessed intelligent camera is abnormal; if the lane line output by the intelligent camera is not abnormal, selecting the lane line output by the intelligent camera to assist automatic driving control;
if the lane line output by the intelligent camera is abnormal, judging whether the lane line output by the high-precision map locator is available; if it is, selecting the lane line output by the high-precision map locator to assist automatic driving control and smoothly transitioning from the lane line output by the intelligent camera to the lane line output by the high-precision map locator to realize fusion;
if the lane line output by the intelligent camera recovers and is judged available again, selecting the lane line output by the intelligent camera to assist automatic driving control and smoothly transitioning from the lane line output by the high-precision map locator back to the lane line output by the intelligent camera to realize fusion;
(S4) performing tracking verification and processing on the fused lane line to output an accurate, stable and reliable lane line to assist automatic driving control.
Further, in steps (S1) and (S2), the cubic curve is smoothed by filtering its coefficients with a variable-weight low-pass filter.
Further, in step (S1), lost lane lines are completed as follows: a lane line lost on one side is completed by translating the remaining line according to the historical lane width; when both the left and right lane lines are lost, they are predicted and completed from historical data.
Further, in step (S3), smoothly transitioning from the lane line output by the intelligent camera to the lane line output by the high-precision map locator to realize fusion specifically comprises: if the lane line output by the high-precision map locator is available, predicting the lane line parameters A0, A1, A2 and A3 of the cubic curve to be fitted for the next frame from the vehicle driving state and the lane lines historically output by the intelligent camera; then, using empirical limits, stepping the output lane line parameters toward the map values with a maximum change per frame of 0.05 for A0, 0.001 for A1, 0.00001 for A2 and 0.000001 for A3, until the output has transitioned to the lane line currently output by the high-precision map locator, which is then taken as the fused output lane line.
Further, judging whether the lane line of the high-precision map locator is available comprises: judging whether the precision and confidence of the high-precision map locator meet the use requirements; and judging whether the average lateral error between the lane line of the high-precision map locator and the camera's historical and predicted lane lines stays below 20 cm without abnormal jumps. If the precision and confidence meet the use requirements and the average lateral error stays below 20 cm without abnormal jumps, the lane line of the high-precision map locator is usable; otherwise it is unusable.
Further, when no marker is available to correct the positioning longitudinally, the longitudinal positioning can have a meter-level error. On a straight lane with no special changes, the longitudinal error has little influence on lane line accuracy, and the lane line output by the high-precision map locator has high confidence; on a curve, the longitudinal error has a large influence, the lane line output by the high-precision map locator has low confidence, and the larger the curvature of the curve, the lower the confidence. If the map lane line is not available, its confidence is 0.
Further, when the lane line confidence is low or 0, the track of the preceding vehicle is acquired for car-following, or automatic driving control is assisted by a lane line inferred from historical data.
Further, in the steps (S1) and (S2), the lane line rationality determination includes: judging the rationality of the ranges of cubic curve coefficients A0, A1, A2 and A3 of the lane lines and the range of inter-frame jump; judging the rationality of the lane width; and judging the reasonability of the length of the lane line.
The invention also provides a lane line fusion system based on the intelligent camera and the high-precision map positioning, which comprises:
the intelligent camera is used for outputting lane line information; the intelligent camera is connected with a vehicle body CAN bus;
a GPS receiver for receiving GPS satellite signals and determining a ground spatial location;
an inertial sensor for outputting an IMU signal;
the high-precision map positioning controller is used for finishing hardware system initialization, positioning algorithm initialization and outputting lane line information; the high-precision map positioning controller is connected with a CAN bus of the vehicle body; the high-precision map positioning controller is also connected with the intelligent camera, the GPS receiver and the inertial sensor;
an automatic driving controller for performing the steps (S3) and (S4) of the present invention; and the intelligent camera and the high-precision map positioning controller are both connected with the automatic driving controller.
Furthermore, the intelligent camera is a forward-looking intelligent camera with ASIL-B functional safety.
Compared with the prior art, the invention has the following advantages:
in the lane line fusion system and method based on the intelligent camera and high-precision map positioning according to the invention, the intelligent camera performs lane line detection while the high-precision map locator outputs the pre-stored lane lines; the target-level lane line information output by the two is fused, and the lane lines are smoothed and transitioned steadily to generate a more stable and reliable lane line for the lateral control of automatic driving. The strengths of the two sensors thus complement each other redundantly, improving safety and user experience. Smoothing the cubic curves prevents abnormal jumps, and guiding the vehicle through a smooth transition when the fusion algorithm switches its lane line source further prevents lane line jumps, again improving safety and experience.
Drawings
FIG. 1 is a flow chart of a lane line fusion method based on an intelligent camera and high-precision map positioning according to the present invention;
fig. 2 is a schematic structural diagram of the lane line fusion system based on the intelligent camera and the high-precision map positioning.
In the figure:
1-an intelligent camera; 2-high precision map positioning controller; 3-an automatic driving controller; 4-a GPS receiver; 5-inertial sensor.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings.
A high-precision map is a map made specifically for automatic driving; it has higher precision and richer information, storing lane-level road information and fixed markers such as signs and poles. The positioning algorithm fuses input signals from sensors such as GPS, IMU, wheel speed and camera to achieve lane-level accurate positioning of the vehicle. By matching this precise position against the high-precision map, lane-level environment information for the position is obtained.
Referring to fig. 1, the embodiment discloses a lane line fusion method based on an intelligent camera and high-precision map positioning, which includes the following steps:
(S1) Preprocessing the lane line information output by the intelligent camera 1 includes: smoothing the cubic lane line curves output by the intelligent camera 1; completing lost lane lines; and judging the rationality of the lane lines and rejecting abnormal curves.
(S2) Preprocessing the lane line information output by the high-precision map positioning controller 2 includes: converting the longitude/latitude coordinates of the map lane line points into relative coordinate points in the vehicle body coordinate system using the vehicle's longitude/latitude position and heading, then fitting a cubic curve and smoothing it; and judging the rationality of the lane lines and rejecting abnormal curves.
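As an illustration only, the coordinate conversion and cubic fit can be sketched as follows (a minimal sketch, not the patent's implementation: it assumes an equirectangular flat-earth approximation around the ego position, a heading measured counter-clockwise from east, a body frame with x forward and y to the left, and uses numpy's polyfit in place of whatever fitting routine the controller actually employs):

```python
import numpy as np

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius, used for a local flat-earth approximation

def map_points_to_body_frame(lane_lat, lane_lon, ego_lat, ego_lon, ego_heading_rad):
    """Convert map lane-line latitude/longitude points into the vehicle body frame.

    ego_heading_rad is assumed to be measured counter-clockwise from east;
    x points forward along the vehicle, y to the left.
    """
    # Equirectangular projection around the ego position (valid over short ranges).
    d_lat = np.radians(np.asarray(lane_lat) - ego_lat)
    d_lon = np.radians(np.asarray(lane_lon) - ego_lon)
    east = d_lon * np.cos(np.radians(ego_lat)) * EARTH_RADIUS_M
    north = d_lat * EARTH_RADIUS_M
    # Rotate the east/north offsets by the vehicle heading into the body frame.
    c, s = np.cos(ego_heading_rad), np.sin(ego_heading_rad)
    x_body = c * east + s * north    # longitudinal distance ahead of the vehicle
    y_body = -s * east + c * north   # lateral offset (left positive)
    return x_body, y_body

def fit_cubic_lane_line(x_body, y_body):
    """Fit y = A0*x^3 + A1*x^2 + A2*x + A3 to the converted points."""
    a0, a1, a2, a3 = np.polyfit(x_body, y_body, 3)  # highest-order coefficient first
    return a0, a1, a2, a3
```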
(S3) judging whether the lane line output by the preprocessed intelligent camera 1 is abnormal or not; if the lane line output by the intelligent camera 1 is not abnormal, the lane line output by the intelligent camera 1 is selected to assist automatic driving control.
If the lane line output by the intelligent camera 1 is abnormal, it must be judged whether the lane line output by the high-precision map locator 2 is available; if it is, the lane line output by the high-precision map locator 2 is selected to assist automatic driving control, and the output smoothly transitions from the lane line of the intelligent camera 1 to the lane line of the high-precision map locator 2 to realize fusion.
If the lane line output by the intelligent camera 1 recovers, it is judged available again, the lane line output by the intelligent camera is selected to assist automatic driving control, and the output smoothly transitions from the lane line of the high-precision map locator 2 back to the lane line of the intelligent camera 1 to realize fusion. Because the high-precision map stores prior road information and therefore lags behind reality, current high-precision positioning equipment is not functionally safe; its positioning error is also significant and unavoidable. The fusion scheme therefore trusts the camera output whenever the lane line output by the camera is not abnormal. Guiding the vehicle through a smooth transition when the fusion algorithm switches its lane line source prevents lane line jumps and improves safety and experience.
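The source-selection logic of step (S3) can be summarized in a small sketch (hypothetical helper names; boolean flags stand in for the abnormality and availability judgments described above, and the smooth transition itself is handled separately):

```python
from enum import Enum

class LaneSource(Enum):
    CAMERA = "camera"
    HD_MAP = "hd_map"

def select_lane_source(camera_abnormal: bool, map_available: bool,
                       current_source: LaneSource) -> LaneSource:
    """Source selection of step (S3): trust the camera whenever its lane line
    is normal; fall back to the HD-map lane line only while the camera output
    is abnormal and the map lane line is judged available."""
    if not camera_abnormal:
        # Camera output is normal (or has recovered): use / return to the camera.
        return LaneSource.CAMERA
    if map_available:
        return LaneSource.HD_MAP
    # Neither source is trustworthy; keep the previous source so the
    # downstream confidence / degradation logic can react.
    return current_source
```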
(S4) performing tracking verification and processing on the fused lane line to output an accurate, stable and reliable lane line to assist automatic driving control.
In this embodiment, in step (S1), lost lane lines are completed as follows: a lane line lost on one side is completed by translating the remaining line according to the historical lane width; when both the left and right lane lines are lost, they are predicted and completed from historical data.
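A minimal sketch of the single-side completion by lateral translation (illustrative only; it assumes the cubic convention y = A0·x³ + A1·x² + A2·x + A3 with a left-positive lateral axis, so a pure translation only changes the constant term A3):

```python
def complete_missing_side(left_coeffs, right_coeffs, lane_width_history):
    """Complete a lane line lost on one side by translating the other side
    laterally by the averaged historical lane width.

    Coefficients follow y = A0*x^3 + A1*x^2 + A2*x + A3, so a pure lateral
    shift only changes the constant term A3.
    """
    lane_width = sum(lane_width_history) / len(lane_width_history)
    if left_coeffs is None and right_coeffs is not None:
        a0, a1, a2, a3 = right_coeffs
        left_coeffs = (a0, a1, a2, a3 + lane_width)   # left line sits one lane width to the left
    elif right_coeffs is None and left_coeffs is not None:
        a0, a1, a2, a3 = left_coeffs
        right_coeffs = (a0, a1, a2, a3 - lane_width)  # right line sits one lane width to the right
    # If both sides are lost, the description predicts them from historical data instead.
    return left_coeffs, right_coeffs
```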
In this embodiment, in step (S1), preprocessing the lane line information output by the intelligent camera 1 also includes handling abnormal jumps, abnormal widths, an excessively short effective range, and misrecognition.
In this embodiment, in step (S2), preprocessing the lane line information output by the high-precision map positioning controller 2 also includes handling abnormal jumps and abnormal states. Abnormal jumps of the lane line output by the high-precision map positioning controller 2 are mainly caused by jumps in the vehicle's positioning and by curve-fitting errors. Abnormal states occur, for example, when a large positioning error or a large heading angle error makes the map disagree with the actual road; in such an abnormal state the map lane line is not usable.
In this embodiment, in steps (S1) and (S2), the lane line rationality judgment includes: judging the rationality of the ranges of the cubic curve coefficients A0, A1, A2 and A3 and of their inter-frame jumps; judging the rationality of the lane width; and judging the rationality of the lane line length. The aim is to first screen out obviously abnormal lane lines and then reject them.
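Such a rationality screen might look like the following sketch (the function name and all threshold values, such as the lane-width window and minimum length, are illustrative placeholders rather than values from the patent):

```python
def lane_line_is_reasonable(coeffs, prev_coeffs, lane_width, line_length,
                            coeff_ranges, jump_limits,
                            width_range=(2.5, 4.5), min_length=20.0):
    """Plausibility screen: coefficient ranges, inter-frame jumps, lane width
    and visible lane line length. All limits here are illustrative."""
    # Each coefficient must lie inside its allowed (lo, hi) range.
    for value, (lo, hi) in zip(coeffs, coeff_ranges):
        if not (lo <= value <= hi):
            return False
    # Each coefficient may only change by a bounded amount between frames.
    if prev_coeffs is not None:
        for value, prev, limit in zip(coeffs, prev_coeffs, jump_limits):
            if abs(value - prev) > limit:
                return False
    # Lane width and visible length must be plausible.
    if not (width_range[0] <= lane_width <= width_range[1]):
        return False
    return line_length >= min_length
```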
In this embodiment, tracking verification covers lane line loss, abnormal jumps and width abnormalities.
In this embodiment, in steps (S1) and (S2), the cubic curve is smoothed by filtering its coefficients with a variable-weight low-pass filter; smoothing the cubic curve prevents abnormal jumps.
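One possible form of such a variable-weight low-pass filter is sketched below (an assumption for illustration; how the per-frame weight is chosen is not specified here):

```python
def low_pass_coeffs(new_coeffs, filtered_coeffs, weight):
    """First-order low-pass filter over the cubic-curve coefficients.

    'weight' (between 0 and 1) is chosen per frame, e.g. smaller when the raw
    measurement looks noisy, so the filtered output changes more slowly.
    """
    if filtered_coeffs is None:
        return tuple(new_coeffs)  # first frame: initialize with the raw coefficients
    return tuple(weight * n + (1.0 - weight) * f
                 for n, f in zip(new_coeffs, filtered_coeffs))
```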
In this embodiment, in step (S3), smoothly transitioning from the lane line output by the intelligent camera to the lane line output by the high-precision map locator to realize fusion specifically comprises: predicting the lane line parameters A0, A1, A2 and A3 of the cubic curve to be fitted for the next frame from the vehicle driving state and the lane lines historically output by the intelligent camera; then, using empirical limits, stepping the output lane line parameters with a maximum change per frame of 0.05 for A0, 0.001 for A1, 0.00001 for A2 and 0.000001 for A3, until the output has transitioned to the lane line currently output by the high-precision map locator 2, which is then taken as the fused output lane line. If the parameters predicted from the intelligent camera differed too much from the lane line parameters A0 to A3 of the high-precision map locator 2, a large jump could otherwise occur. The lane line parameters A0, A1, A2 and A3 are the four coefficients of the cubic function y = A0·x³ + A1·x² + A2·x + A3.
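As an illustration only, the rate-limited transition could be sketched as follows (the helper name and calling convention are assumptions; the per-frame limits are the empirical values quoted above):

```python
MAX_STEP = (0.05, 0.001, 0.00001, 0.000001)  # per-frame limits for A0..A3 from the description

def step_towards_map_lane(predicted_coeffs, map_coeffs, max_step=MAX_STEP):
    """Move the output lane line from the camera-based prediction towards the
    HD-map lane line, limiting the per-frame change of each coefficient."""
    out = []
    for pred, target, limit in zip(predicted_coeffs, map_coeffs, max_step):
        delta = target - pred
        # Clamp the change of this coefficient to +/- its maximum allowed jump.
        delta = max(-limit, min(limit, delta))
        out.append(pred + delta)
    return tuple(out)
```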
In this embodiment, judging whether the lane line of the high-precision map locator 2 is available comprises: first, judging whether the precision and confidence of the high-precision map locator 2 meet the use requirements; second, judging whether the average lateral error between the lane line of the high-precision map locator 2 and the camera's historical and predicted lane lines stays below 20 cm without abnormal jumps. If both conditions hold, the lane line of the high-precision map locator 2 is usable; otherwise it is unusable. The positioning error of high-precision map positioning equipment is unavoidable, and positioning accuracy directly affects lane line accuracy, so the map lane line is only used when its average lateral error relative to the intelligent camera's historical and predicted lane lines stays below 20 cm without abnormal jumps.
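A minimal sketch of this availability check (only the 20 cm mean-lateral-error threshold comes from the description; the sampling range, the precision/confidence flags and the omission of the abnormal-jump test are simplifications):

```python
import numpy as np

def map_lane_available(map_coeffs, ref_coeffs, precision_ok, confidence_ok,
                       eval_range_m=50.0, max_mean_error_m=0.20):
    """Availability check for the HD-map lane line: the locator's precision and
    confidence must meet requirements, and the mean lateral error against the
    camera's historical/predicted lane line must stay below 20 cm."""
    if not (precision_ok and confidence_ok):
        return False
    # Sample both cubic curves over the evaluation range ahead of the vehicle.
    x = np.linspace(0.0, eval_range_m, 50)
    y_map = np.polyval(map_coeffs, x)  # coefficients ordered A0..A3, highest power first
    y_ref = np.polyval(ref_coeffs, x)
    return float(np.mean(np.abs(y_map - y_ref))) <= max_mean_error_m
```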
In this embodiment, when no marker is available to correct the positioning longitudinally, the longitudinal positioning can have a meter-level error. On a straight lane with no special changes, the longitudinal error has little influence on lane line accuracy, and the lane line output by the high-precision map locator 2 has high confidence; on a curve, the longitudinal error has a large influence, the lane line output by the high-precision map locator 2 has low confidence, and the larger the curvature of the curve, the lower the confidence. If the map lane line is not available, its confidence is 0. The smaller the longitudinal error, the higher the confidence. The confidence of the fused lane line is computed as the low-pass-filter-weighted confidence of the lane line parameters; once the output has fully switched to the map lane line, a curvature factor is added, and the larger the curvature, the lower the confidence.
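A sketch of such a confidence computation (the form of the curvature discount and its gain constant are illustrative assumptions; only the qualitative rule that larger curvature lowers the confidence comes from the description):

```python
def fused_lane_confidence(base_confidence, curvature,
                          switched_to_map: bool, curvature_gain=50.0):
    """Confidence of the fused lane line: start from the low-pass-weighted
    confidence of the lane line parameters and, once the output has fully
    switched to the map lane line, discount it by curvature (larger curvature
    means lower confidence). curvature_gain is an illustrative tuning constant."""
    confidence = base_confidence
    if switched_to_map:
        # Curvature can be taken e.g. from the quadratic term of the fitted cubic.
        confidence /= (1.0 + curvature_gain * abs(curvature))
    return max(0.0, min(1.0, confidence))
```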
In this embodiment, when the lane line confidence is low or 0, the track of the preceding vehicle is acquired for car-following, or automatic driving control is assisted by a lane line inferred from historical lane line data, to ensure the safety of automatic driving.
Referring to fig. 2, the present embodiment discloses a lane line fusion system based on intelligent camera and high-precision map positioning, including:
the intelligent camera 1 is used for outputting lane line information; the intelligent camera 1 is connected with a vehicle body CAN bus;
a GPS receiver 4 for receiving GPS satellite signals and determining a ground spatial location;
an inertial sensor 5 for outputting an IMU signal;
the high-precision map positioning controller 2 is used for finishing hardware system initialization, positioning algorithm initialization and outputting lane line information; the high-precision map positioning controller 2 is connected with a vehicle body CAN bus; the high-precision map positioning controller 2 is also connected with the intelligent camera 1, the GPS receiver 4 and the inertial sensor 5;
an automatic driving controller 3, configured to execute the above steps (S3) and (S4); the intelligent camera 1 and the high-precision map positioning controller 2 are both connected with the automatic driving controller 3. The vehicle is assumed to be inside a geofenced area covered by the high-precision map, so that positioning and map information can be output stably.
The intelligent camera 1 and the high-precision map positioning controller 2 can acquire information such as wheel speed from the vehicle body CAN bus.
In this embodiment, the intelligent camera 1 is a forward-looking intelligent camera with ASIL-B functional safety. Because the high-precision map stores prior road information and therefore lags behind reality, current high-precision positioning equipment is not functionally safe, and its positioning error is significant and unavoidable; the fusion scheme therefore trusts the camera output whenever the lane line output by the camera is not abnormal.
In summary, the lane line fusion system and method based on the intelligent camera and high-precision map positioning fuse the lane lines detected by the intelligent camera with the pre-stored lane lines output by the high-precision map locator, generate a smooth, stable and reliable fused lane line for the lateral control of automatic driving, and let the strengths of the two sensors complement each other redundantly; smoothing the cubic curves and transitioning smoothly when the lane line source is switched prevent lane line jumps and improve safety and experience.
The foregoing detailed description of the preferred embodiments of the invention has been presented. It should be understood that numerous modifications and variations could be devised by those skilled in the art in light of the present teachings without departing from the inventive concepts. Therefore, the technical solutions available to those skilled in the art through logic analysis, reasoning and limited experiments based on the prior art according to the concept of the present invention should be within the scope of protection defined by the claims.

Claims (10)

1. A lane line fusion method based on intelligent cameras and high-precision map positioning is characterized by comprising the following steps:
(S1) the preprocessing of the lane line information output by the smart camera (1) includes: the intelligent camera (1) outputs a cubic curve of the lane line and smoothes the cubic curve; completing the lost lane line; judging the rationality of the lane lines and rejecting abnormal curves;
(S2) preprocessing the lane line information output by the high-precision map positioning controller (2), comprising: converting the longitude/latitude coordinates of the map lane line points into relative coordinate points in the vehicle body coordinate system using the vehicle's longitude/latitude position and heading, then fitting a cubic curve and smoothing it; and judging the rationality of the lane lines and rejecting abnormal curves;
(S3) judging whether the lane line output by the preprocessed intelligent camera (1) is abnormal or not; if the lane line output by the intelligent camera (1) is not abnormal, selecting the lane line output by the intelligent camera (1) to assist automatic driving control;
if the lane line output by the intelligent camera (1) is abnormal, judging whether the lane line output by the high-precision map locator (2) is available; if it is, selecting the lane line output by the high-precision map locator (2) to assist automatic driving control and smoothly transitioning from the lane line output by the intelligent camera (1) to the lane line output by the high-precision map locator (2) to realize fusion;
if the lane line output by the intelligent camera (1) recovers and is judged available again, selecting the lane line output by the intelligent camera to assist automatic driving control and smoothly transitioning from the lane line output by the high-precision map locator (2) back to the lane line output by the intelligent camera (1) to realize fusion;
(S4) performing tracking verification and processing on the fused lane line to output an accurate, stable and reliable lane line to assist automatic driving control.
2. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 1, wherein in steps (S1) and (S2) the cubic curve is smoothed by filtering its coefficients with a variable-weight low-pass filter.
3. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 1, wherein in step (S1) lost lane lines are completed as follows: a lane line lost on one side is completed by translating the remaining line according to the historical lane width; when both the left and right lane lines are lost, they are predicted and completed from historical data.
4. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 1, wherein in step (S3) smoothly transitioning from the lane line output by the intelligent camera (1) to the lane line output by the high-precision map locator (2) to realize fusion specifically comprises: predicting the lane line parameters A0, A1, A2 and A3 of the cubic curve to be fitted for the next frame from the vehicle driving state and the lane lines historically output by the intelligent camera; then, using empirical limits, stepping the output lane line parameters with a maximum change per frame of 0.05 for A0, 0.001 for A1, 0.00001 for A2 and 0.000001 for A3, until the output has transitioned to the lane line currently output by the high-precision map locator (2), which is then taken as the fused output lane line.
5. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 1, wherein judging whether the lane line of the high-precision map locator (2) is available comprises: judging whether the precision and confidence of the high-precision map locator (2) meet the use requirements; and judging whether the average lateral error between the lane line of the high-precision map locator (2) and the camera's historical and predicted lane lines stays below 20 cm without abnormal jumps; if the precision and confidence of the high-precision map locator (2) meet the use requirements and the average lateral error stays below 20 cm without abnormal jumps, the lane line of the high-precision map locator (2) is usable; otherwise it is unusable.
6. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 5, wherein, when no marker corrects the positioning longitudinally, the longitudinal positioning has a meter-level error; on a straight lane with no special changes, the longitudinal error has little influence on lane line accuracy and the lane line output by the high-precision map locator (2) has high confidence; on a curve, the longitudinal error has a large influence, the lane line output by the high-precision map locator (2) has low confidence, and the higher the curvature of the curve, the lower the confidence; if the map lane line is not available, its confidence is 0.
7. The lane line fusion method based on the intelligent camera and high-precision map positioning according to claim 6, wherein, when the lane line confidence is low or 0, the track of the preceding vehicle is acquired for car-following, or automatic driving control is assisted by a lane line inferred from historical data.
8. The lane line fusion method based on smart camera and high-precision map positioning according to claim 1, wherein in steps (S1) and (S2), the lane line rationality judgment comprises: judging the rationality of the ranges of cubic curve coefficients A0, A1, A2 and A3 of the lane lines and the range of inter-frame jump; judging the rationality of the lane width; and judging the reasonability of the length of the lane line.
9. A lane line fusion system based on an intelligent camera and high-precision map positioning, characterized by comprising:
the intelligent camera (1) is used for outputting lane line information; the intelligent camera (1) is connected with a vehicle body CAN bus;
a GPS receiver (4) for receiving GPS satellite signals and determining a ground spatial location;
an inertial sensor (5) for outputting an IMU signal;
the high-precision map positioning controller (2) is used for finishing hardware system initialization, positioning algorithm initialization and outputting lane line information; the high-precision map positioning controller (2) is connected with a vehicle body CAN bus; the high-precision map positioning controller (2) is also connected with the intelligent camera (1), the GPS receiver (4) and the inertial sensor (5);
an automatic driving controller (3), the automatic driving controller (3) being configured to perform the steps (S3) and (S4) as claimed in claim 1; the intelligent camera (1) and the high-precision map positioning controller (2) are connected with the automatic driving controller (3).
10. The lane line fusion system based on the intelligent camera and high-precision map positioning according to claim 9, wherein the intelligent camera (1) is a forward-looking intelligent camera with ASIL-B functional safety.
CN202010360355.7A 2020-04-30 2020-04-30 Lane line fusion system and method based on intelligent camera and high-precision map positioning Active CN111516673B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010360355.7A CN111516673B (en) 2020-04-30 2020-04-30 Lane line fusion system and method based on intelligent camera and high-precision map positioning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010360355.7A CN111516673B (en) 2020-04-30 2020-04-30 Lane line fusion system and method based on intelligent camera and high-precision map positioning

Publications (2)

Publication Number Publication Date
CN111516673A true CN111516673A (en) 2020-08-11
CN111516673B CN111516673B (en) 2022-08-09

Family

ID=71911393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010360355.7A Active CN111516673B (en) 2020-04-30 2020-04-30 Lane line fusion system and method based on intelligent camera and high-precision map positioning

Country Status (1)

Country Link
CN (1) CN111516673B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112373474A (en) * 2020-11-23 2021-02-19 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN112721930A (en) * 2021-01-15 2021-04-30 重庆长安汽车股份有限公司 Vehicle cornering deceleration planning method, system, vehicle and storage medium
CN112818804A (en) * 2021-01-26 2021-05-18 重庆长安汽车股份有限公司 Parallel processing method and system for target level lane line, vehicle and storage medium
CN112896181A (en) * 2021-01-14 2021-06-04 重庆长安汽车股份有限公司 Electronic fence control method, system, vehicle and storage medium
CN113160454A (en) * 2021-05-31 2021-07-23 重庆长安汽车股份有限公司 Method and system for recharging historical sensor data of automatic driving vehicle
CN113436190A (en) * 2021-07-30 2021-09-24 重庆长安汽车股份有限公司 Lane line quality calculation method and device based on lane line curve coefficient and automobile
CN113487901A (en) * 2021-07-30 2021-10-08 重庆长安汽车股份有限公司 Lane width checking method and system based on camera perception
CN113591618A (en) * 2021-07-14 2021-11-02 重庆长安汽车股份有限公司 Method, system, vehicle and storage medium for estimating shape of road ahead
CN114023072A (en) * 2021-05-31 2022-02-08 合肥中科类脑智能技术有限公司 Vehicle violation monitoring system and method and computer readable storage medium
CN114419590A (en) * 2022-01-17 2022-04-29 北京百度网讯科技有限公司 High-precision map verification method, device, equipment and storage medium
EP4125051A1 (en) * 2021-07-30 2023-02-01 Nio Technology (Anhui) Co., Ltd Method and device for determining reliability of visual detection
EP4246466A1 (en) * 2022-03-15 2023-09-20 Beijing Tusen Zhitu Technology Co., Ltd. Control method, vehicle, device and storage medium

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1094298A2 (en) * 1999-10-09 2001-04-25 Volkswagen Aktiengesellschaft Method and device for driving using navigation support
JP2013045343A (en) * 2011-08-25 2013-03-04 Mitsubishi Electric Corp Navigation system, navigation device, and operation method of navigation system
US20150325127A1 (en) * 2014-05-06 2015-11-12 Toyota Motor Engineering & Manufacturing North America, Inc. Method and apparatus for determining lane identification in a roadway
CN105698812A (en) * 2016-01-15 2016-06-22 武汉光庭科技有限公司 Lane line detecting system and method based on safe driving map and cameras on two sides during automatic driving
CN105718865A (en) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 System and method for road safety detection based on binocular cameras for automatic driving
CN106627582A (en) * 2016-12-09 2017-05-10 重庆长安汽车股份有限公司 Path planning system and method for overtaking vehicle on adjacent lane in single-lane automatic drive mode
CN107139917A (en) * 2017-04-27 2017-09-08 江苏大学 It is a kind of based on mix theory pilotless automobile crosswise joint system and method
US20170329345A1 (en) * 2016-05-13 2017-11-16 Delphi Technologies, Inc. Lane-Keeping System For Automated Vehicles
US20170334452A1 (en) * 2016-05-19 2017-11-23 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and vehicle control program
US20180188060A1 (en) * 2016-12-30 2018-07-05 DeepMap Inc. Traffic Sign Feature Creation for High Definition Maps Used for Navigating Autonomous Vehicles
CN108253975A (en) * 2017-12-29 2018-07-06 驭势(上海)汽车科技有限公司 A kind of method and apparatus for establishing cartographic information and vehicle location
WO2018227980A1 (en) * 2017-06-13 2018-12-20 蔚来汽车有限公司 Camera sensor based lane line map construction method and construction system
CN109724615A (en) * 2019-02-28 2019-05-07 北京经纬恒润科技有限公司 A kind of method of calibration and system of Lane detection result
JP2019129417A (en) * 2018-01-25 2019-08-01 クラリオン株式会社 Display control apparatus and display system
CN110293970A (en) * 2019-05-22 2019-10-01 重庆长安汽车股份有限公司 A kind of travel control method of autonomous driving vehicle, device and automobile
WO2019207160A1 (en) * 2018-04-27 2019-10-31 Visteon Global Technologies, Inc. Map line interface for autonomous driving
US20190362633A1 (en) * 2018-05-24 2019-11-28 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium
US20200026282A1 (en) * 2018-07-23 2020-01-23 Baidu Usa Llc Lane/object detection and tracking perception system for autonomous vehicles
CN110956128A (en) * 2019-11-28 2020-04-03 重庆中星微人工智能芯片技术有限公司 Method, apparatus, electronic device, and medium for generating lane line image
CN110969059A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Lane line identification method and system
CN110969837A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Road information fusion system and method for automatic driving vehicle

Patent Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1094298A2 (en) * 1999-10-09 2001-04-25 Volkswagen Aktiengesellschaft Method and device for driving using navigation support
JP2013045343A (en) * 2011-08-25 2013-03-04 Mitsubishi Electric Corp Navigation system, navigation device, and operation method of navigation system
US20150325127A1 (en) * 2014-05-06 2015-11-12 Toyota Motor Engineering & Manufacturing North America, Inc. Method and apparatus for determining lane identification in a roadway
CN105698812A (en) * 2016-01-15 2016-06-22 武汉光庭科技有限公司 Lane line detecting system and method based on safe driving map and cameras on two sides during automatic driving
CN105718865A (en) * 2016-01-15 2016-06-29 武汉光庭科技有限公司 System and method for road safety detection based on binocular cameras for automatic driving
US20170329345A1 (en) * 2016-05-13 2017-11-16 Delphi Technologies, Inc. Lane-Keeping System For Automated Vehicles
US20170334452A1 (en) * 2016-05-19 2017-11-23 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and vehicle control program
CN106627582A (en) * 2016-12-09 2017-05-10 重庆长安汽车股份有限公司 Path planning system and method for overtaking vehicle on adjacent lane in single-lane automatic drive mode
US20180188060A1 (en) * 2016-12-30 2018-07-05 DeepMap Inc. Traffic Sign Feature Creation for High Definition Maps Used for Navigating Autonomous Vehicles
CN107139917A (en) * 2017-04-27 2017-09-08 江苏大学 It is a kind of based on mix theory pilotless automobile crosswise joint system and method
WO2018227980A1 (en) * 2017-06-13 2018-12-20 蔚来汽车有限公司 Camera sensor based lane line map construction method and construction system
CN108253975A (en) * 2017-12-29 2018-07-06 驭势(上海)汽车科技有限公司 A kind of method and apparatus for establishing cartographic information and vehicle location
JP2019129417A (en) * 2018-01-25 2019-08-01 クラリオン株式会社 Display control apparatus and display system
WO2019207160A1 (en) * 2018-04-27 2019-10-31 Visteon Global Technologies, Inc. Map line interface for autonomous driving
US20190362633A1 (en) * 2018-05-24 2019-11-28 Honda Motor Co., Ltd. Vehicle control device, vehicle control method, and storage medium
US20200026282A1 (en) * 2018-07-23 2020-01-23 Baidu Usa Llc Lane/object detection and tracking perception system for autonomous vehicles
CN110969059A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Lane line identification method and system
CN110969837A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Road information fusion system and method for automatic driving vehicle
CN109724615A (en) * 2019-02-28 2019-05-07 北京经纬恒润科技有限公司 A kind of method of calibration and system of Lane detection result
CN110293970A (en) * 2019-05-22 2019-10-01 重庆长安汽车股份有限公司 A kind of travel control method of autonomous driving vehicle, device and automobile
CN110956128A (en) * 2019-11-28 2020-04-03 重庆中星微人工智能芯片技术有限公司 Method, apparatus, electronic device, and medium for generating lane line image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Wu Yanwen et al.: "Research on lane line detection and tracking methods based on multi-sensor fusion", Application Research of Computers *
Zhao Xiang et al.: "Lane-level positioning method based on vision and millimeter-wave radar", Journal of Shanghai Jiao Tong University *
Yan Chunxiang et al.: "End-to-end autonomous driving assisted by lane line detection", Automobile Applied Technology *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112373474A (en) * 2020-11-23 2021-02-19 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN112373474B (en) * 2020-11-23 2022-05-17 重庆长安汽车股份有限公司 Lane line fusion and transverse control method, system, vehicle and storage medium
CN112896181A (en) * 2021-01-14 2021-06-04 重庆长安汽车股份有限公司 Electronic fence control method, system, vehicle and storage medium
CN112896181B (en) * 2021-01-14 2022-07-08 重庆长安汽车股份有限公司 Electronic fence control method, system, vehicle and storage medium
CN112721930A (en) * 2021-01-15 2021-04-30 重庆长安汽车股份有限公司 Vehicle cornering deceleration planning method, system, vehicle and storage medium
CN112818804A (en) * 2021-01-26 2021-05-18 重庆长安汽车股份有限公司 Parallel processing method and system for target level lane line, vehicle and storage medium
CN112818804B (en) * 2021-01-26 2024-02-20 重庆长安汽车股份有限公司 Parallel processing method, system, vehicle and storage medium for target-level lane lines
CN114023072A (en) * 2021-05-31 2022-02-08 合肥中科类脑智能技术有限公司 Vehicle violation monitoring system and method and computer readable storage medium
CN113160454A (en) * 2021-05-31 2021-07-23 重庆长安汽车股份有限公司 Method and system for recharging historical sensor data of automatic driving vehicle
CN113591618A (en) * 2021-07-14 2021-11-02 重庆长安汽车股份有限公司 Method, system, vehicle and storage medium for estimating shape of road ahead
CN113591618B (en) * 2021-07-14 2024-02-20 重庆长安汽车股份有限公司 Method, system, vehicle and storage medium for estimating shape of road ahead
CN113487901A (en) * 2021-07-30 2021-10-08 重庆长安汽车股份有限公司 Lane width checking method and system based on camera perception
CN113436190A (en) * 2021-07-30 2021-09-24 重庆长安汽车股份有限公司 Lane line quality calculation method and device based on lane line curve coefficient and automobile
EP4125051A1 (en) * 2021-07-30 2023-02-01 Nio Technology (Anhui) Co., Ltd Method and device for determining reliability of visual detection
CN113436190B (en) * 2021-07-30 2023-03-14 重庆长安汽车股份有限公司 Lane line quality calculation method and device based on lane line curve coefficient and automobile
CN114419590A (en) * 2022-01-17 2022-04-29 北京百度网讯科技有限公司 High-precision map verification method, device, equipment and storage medium
CN114419590B (en) * 2022-01-17 2024-03-19 北京百度网讯科技有限公司 Verification method, device, equipment and storage medium of high-precision map
EP4246466A1 (en) * 2022-03-15 2023-09-20 Beijing Tusen Zhitu Technology Co., Ltd. Control method, vehicle, device and storage medium

Also Published As

Publication number Publication date
CN111516673B (en) 2022-08-09

Similar Documents

Publication Publication Date Title
CN111516673B (en) Lane line fusion system and method based on intelligent camera and high-precision map positioning
US20200331495A1 (en) System for steering an autonomous vehicle
CN110530372B (en) Positioning method, path determining device, robot and storage medium
CN112537294B (en) Automatic parking control method and electronic equipment
CN109747645B (en) Driving assistance control system for vehicle
US20150055831A1 (en) Apparatus and method for recognizing a lane
US20220169280A1 (en) Method and Device for Multi-Sensor Data Fusion For Automated and Autonomous Vehicles
CN111409632A (en) Vehicle control method and device, computer equipment and storage medium
CN113942524B (en) Vehicle running control method, system and computer readable storage medium
CN112433531A (en) Trajectory tracking method and device for automatic driving vehicle and computer equipment
JP2018063476A (en) Apparatus, method and computer program for driving support
WO2022012316A1 (en) Control method, vehicle, and server
US10108866B2 (en) Method and system for robust curb and bump detection from front or rear monocular cameras
US20200193176A1 (en) Automatic driving controller and method
CN113009539A (en) Automatic lane changing processing method for vehicle, vehicle and equipment
CN115923839A (en) Vehicle path planning method
Yang et al. Autonomous lane keeping control system based on road lane model using deep convolutional neural networks
CN116625375A (en) Vehicle positioning method based on wheel parameter calibration and monocular lane line detection
US11267477B2 (en) Device and method for estimating the attention level of a driver of a vehicle
CN115373383A (en) Autonomous obstacle avoidance method and device for garbage recovery unmanned boat and related equipment
CN109631925B (en) Main and auxiliary road determining method and device, storage medium and electronic equipment
Image processing technology for rear view camera (1): Development of lane detection system
CN112505724A (en) Road negative obstacle detection method and system
JP2019206310A (en) Steering angle determination device and automatic operation vehicle
CN116481548B (en) Positioning method and device for automatic driving vehicle and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant