CN112373474A - Lane line fusion and transverse control method, system, vehicle and storage medium - Google Patents

Lane line fusion and transverse control method, system, vehicle and storage medium

Info

Publication number
CN112373474A
CN112373474A (application number CN202011323037.XA)
Authority
CN
China
Prior art keywords
lane
lane line
millimeter wave
fusion
guardrail
Prior art date
Legal status
Granted
Application number
CN202011323037.XA
Other languages
Chinese (zh)
Other versions
CN112373474B (en)
Inventor
卢斌
任传兵
Current Assignee
Chongqing Changan Automobile Co Ltd
Original Assignee
Chongqing Changan Automobile Co Ltd
Priority date
Filing date
Publication date
Application filed by Chongqing Changan Automobile Co Ltd
Priority to CN202011323037.XA
Publication of CN112373474A
Application granted
Publication of CN112373474B
Legal status: Active
Anticipated expiration

Classifications

    • B60W: Conjoint control of vehicle sub-units of different type or different function; control systems specially adapted for hybrid vehicles; road vehicle drive control systems for purposes not related to the control of a particular sub-unit
    • B60W 30/18: Propelling the vehicle (purposes of road vehicle drive control systems not related to the control of a particular sub-unit)
    • B60W 40/02: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions, e.g. by using mathematical models
    • B60W 40/06: Road conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a lane line fusion and lateral control method, system, vehicle and storage medium. The method comprises the following steps: acquiring the guardrail information, road edge information, lane lines and lane line confidence detected by a forward-looking camera; acquiring the road edge information and guardrail information detected by a forward millimeter wave radar; acquiring the road edge information and guardrail information detected by a lateral millimeter wave radar; acquiring the road curvature and road class output by the ADAS map; acquiring the lane lines and lane line confidence detected by a look-around camera; fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence; and laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence. When the camera loses the lane line, the method constructs a virtual lane line instead of exiting the driver-assistance system immediately, which ensures continuity of control and provides a friendly user experience.

Description

Lane line fusion and transverse control method, system, vehicle and storage medium
Technical Field
The invention belongs to the technical field of lane line fusion, and particularly relates to a lane line fusion and lateral control method, system, vehicle and storage medium.
Background
With the development of intelligent driving technology, more and more driver-assistance functions are entering mass production on passenger cars, and their level of integration keeps rising. Driver assistance is a safety technology that assists the driver and improves driving safety and comfort. As these functions become widespread, the requirements on their continuity of operation have become higher and higher.
At present, in mainstream driver-assistance systems, once the camera loses the lane line the system exits directly, which is very unfriendly to the user experience.
Therefore, it is necessary to develop a lane line fusion and lateral control method, system, vehicle, and storage medium.
Disclosure of Invention
The invention aims to provide a lane line fusion and lateral control method, system, vehicle and storage medium.
In a first aspect, the invention provides a lane line fusion and lateral control method, comprising the following steps:
acquiring the guardrail information, road edge information, lane lines and lane line confidence detected by a forward-looking camera;
acquiring the road edge information and guardrail information detected by a forward millimeter wave radar;
acquiring the road edge information and guardrail information detected by a lateral millimeter wave radar;
acquiring the road curvature and road class output by the ADAS map;
acquiring the lane lines and lane line confidence detected by a look-around camera;
fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence;
and laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence.
Further, fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence specifically comprises:
(1) when Condition 1a and Condition 1b are both met, directly outputting the lane lines detected by the forward-looking camera; the lane line type is detection mode and the fusion confidence is high;
Condition 1a: the road class output by the ADAS map is highway or urban expressway;
Condition 1b: the forward-looking camera detects the lane lines of the ego lane, and their confidence is greater than a first preset value;
(2) when Conditions 2a to 2c are all met, reconstructing the ego-lane lane lines by translating the lane lines of the left and right adjacent lanes, with the ego-lane width before loss as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is high;
Condition 2a: the road class output by the ADAS map is highway or urban expressway;
Condition 2b: the forward-looking camera detects the lane lines of the ego lane, but their confidence is smaller than the first preset value;
Condition 2c: the forward-looking camera detects the lane lines of the left and right adjacent lanes with confidence greater than the first preset value, and the error between the currently detected left and right lane widths and the left and right lane widths recorded when the ego-lane confidence was still high is smaller than a second preset value;
(3) when Conditions 3a to 3f are all met, translating the road edge or guardrail detected by the forward-looking camera, with the lane width detected by the look-around camera as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is medium;
Condition 3a: the road class output by the ADAS map is highway or urban expressway;
Condition 3b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 3c: the forward-looking camera detects a road edge or guardrail with confidence greater than a third preset value;
Condition 3d: the forward millimeter wave radar or the lateral millimeter wave radar detects a road edge or guardrail with confidence greater than the third preset value;
Condition 3e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value;
Condition 3f: the curvature error between the road edge or guardrail detected by the forward-looking camera and that detected by the forward or lateral millimeter wave radar is within a fourth preset value;
(4) when Conditions 4a to 4f are all met, translating the road edges or guardrails detected by the lateral and forward millimeter wave radars, with the lane width detected by the look-around camera as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is medium;
Condition 4a: the road class output by the ADAS map is highway or urban expressway;
Condition 4b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 4c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value;
Condition 4d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value;
Condition 4e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value;
Condition 4f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value;
(5) when Conditions 5a to 5g are all met, translating the road edges or guardrails detected by the lateral and forward millimeter wave radars, with the lane width before loss as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is low;
Condition 5a: the road class output by the ADAS map is highway or urban expressway;
Condition 5b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 5c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value;
Condition 5d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value;
Condition 5e: the look-around camera cannot detect the lane lines on both sides, or the confidence of the output lane lines is smaller than the third preset value;
Condition 5f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value;
Condition 5g: the error between the road curvature output by the ADAS map and the curvature output by the lateral and forward millimeter wave radars is smaller than a fifth preset value.
Further, laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence specifically comprises:
when the fused lane lines are in detection mode and the fusion confidence is high, performing lateral control of the vehicle based on the lane lines for a long period;
when the fused lane lines are in prediction mode and the fusion confidence is high, performing lateral control of the vehicle based on the lane lines for a long period;
when the fused lane lines are in prediction mode and the fusion confidence is medium, performing lateral control of the vehicle based on the lane lines for a medium period;
when the fused lane lines are in prediction mode and the fusion confidence is low, performing lateral control of the vehicle based on the lane lines only for a short period.
Further, the long period is within 2 km or 50 s of driving; the medium period is within 800 m or 20 s of driving; the short period is within 150 m or 10 m of driving.
Further, the first preset value is 90%; the second preset value is 5%; the third preset value is 95%; the fourth preset value is 10%; the fifth preset value is 20%.
In a second aspect, the present invention provides a lane line fusion and lateral control system, comprising:
a forward-looking camera for detecting guardrail information, road edge information, lane lines and lane line confidence;
a forward millimeter wave radar for detecting road edge information and guardrail information;
a lateral millimeter wave radar for detecting road edge information and guardrail information;
an ADAS map for outputting road curvature and road class;
a look-around camera for detecting lane lines and lane line confidence;
a memory for storing a computer-readable program; and
a controller for receiving the data output by the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, the controller being electrically connected to the memory, the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, respectively; when invoked by the controller, the computer-readable program is capable of performing the steps of the lane line fusion and lateral control method according to the first aspect.
In a third aspect, the invention provides a vehicle that adopts the lane line fusion and lateral control system according to the second aspect.
In a fourth aspect, the invention provides a storage medium in which a computer-readable program is stored; when invoked, the computer-readable program is capable of performing the steps of the lane line fusion and lateral control method according to the first aspect.
The invention has the following advantages. The lane line fusion scheme combines the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera: the lane lines, road edges and guardrails identified by the forward-looking camera, the road edges and guardrails identified by the forward and lateral millimeter wave radars, and the road information provided by the ADAS map are fused. When the lane line identified by the camera is lost, the related road edge and guardrail information is fused to construct a virtual lane line, so the system does not exit directly, the continuity of control is guaranteed, and the user experience is friendly.
Drawings
FIG. 1 is a block diagram of the system components;
FIG. 2 is a process flow diagram of scene one;
FIG. 3 is a process flow diagram of scene two;
FIG. 4 is a process flow diagram of scene three;
FIG. 5 is a process flow diagram of scene four;
FIG. 6 is a process flow diagram of scene five;
FIG. 7 is a flow diagram of the fusion decision.
Detailed Description
The invention will be further explained with reference to the drawings.
In this embodiment, a lane line fusion and lateral control method comprises the following steps:
acquiring the guardrail information, road edge information, lane lines and lane line confidence detected by a forward-looking camera;
acquiring the road edge information and guardrail information detected by a forward millimeter wave radar;
acquiring the road edge information and guardrail information detected by a lateral millimeter wave radar;
acquiring the road curvature and road class output by the ADAS map;
acquiring the lane lines and lane line confidence detected by a look-around camera;
fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence;
and laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence.
In this embodiment, the forward-looking camera includes a high-definition telephoto camera and an image processing chip. The forward-looking camera is mounted at the front windshield of the vehicle and collects the image directly ahead of the vehicle. The image processing chip identifies the lane markings, road edges and guardrails in the raw image and transmits the recognition results to the controller.
The method mainly compensates for situations in which the lane lines cannot be reliably identified, such as lane line loss, backlighting and shadow. The specific implementation comprises scene classification, scene processing and a fusion decision, as sketched below.
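As a non-authoritative illustration of the data flowing through these three stages, the Python sketch below models the sensor inputs and the fused output. The cubic-polynomial boundary representation (coefficients c0..c3), the class and field names, and the mode/confidence strings are assumptions made for illustration only; the patent does not prescribe any particular data format. The classification, translation and decision stages themselves are sketched after the corresponding passages below.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Boundary:
    """A lane line, road edge or guardrail, modelled here as a cubic lateral-offset
    polynomial y(x) = c0 + c1*x + c2*x**2 + c3*x**3 in the vehicle frame."""
    c0: float
    c1: float
    c2: float
    c3: float
    confidence: float  # 0.0 .. 1.0

@dataclass
class SensorInputs:
    # Forward-looking camera outputs
    cam_ego_left: Optional[Boundary] = None
    cam_ego_right: Optional[Boundary] = None
    cam_adjacent_left: Optional[Boundary] = None
    cam_adjacent_right: Optional[Boundary] = None
    cam_edge_or_guardrail: Optional[Boundary] = None
    # Forward / lateral millimeter wave radar outputs
    front_radar_edge: Optional[Boundary] = None
    side_radar_edge: Optional[Boundary] = None
    # Look-around camera outputs
    avm_left: Optional[Boundary] = None
    avm_right: Optional[Boundary] = None
    # ADAS map outputs
    map_curvature: float = 0.0
    map_is_highway_or_expressway: bool = False

@dataclass
class FusedLanes:
    left: Boundary
    right: Boundary
    mode: str        # "detection" or "prediction"
    confidence: str  # "high", "medium" or "low"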
(I) Scene classification
(1) The forward-looking camera detects the lane lines of the ego lane and their confidence is greater than the first preset value (90% recommended): scene one (the ego-lane lane lines are clear).
(2) The forward-looking camera detects the lane lines of the ego lane but only with low confidence, while it detects the lane lines of the left and right adjacent lanes with high confidence: scene two (the ego-lane lane lines are blurred, for example water on the ego lane, partial backlighting of the camera, or road-surface reflection).
(3) The forward-looking camera cannot detect the lane lines and only detects a road edge or guardrail, while the look-around camera can detect the lane lines: scene three (partial backlighting ahead of the camera, road-surface reflection).
(4) The forward-looking camera can detect neither the lane lines nor a road edge or guardrail, the look-around camera can detect the lane lines, and the lateral or forward millimeter wave radar detects a road edge or guardrail: scene four (the camera is backlit).
(5) Neither the forward-looking camera nor the look-around camera can detect the lane lines, while the forward and lateral millimeter wave radars detect a road edge or guardrail: scene five (the lane lines on the road are missing). A sketch of this classification follows.
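A minimal sketch of this classification, reusing the Boundary and SensorInputs types assumed in the earlier sketch; the threshold constants mirror the recommended preset values (90% and 95%), and the helper names are hypothetical.

FIRST_PRESET = 0.90   # recommended lane line confidence threshold
THIRD_PRESET = 0.95   # recommended road edge / guardrail / look-around threshold

def _detected(boundary, threshold):
    """True if the boundary exists and its confidence exceeds the threshold."""
    return boundary is not None and boundary.confidence > threshold

def classify_scene(inp):
    """Map a SensorInputs object (earlier sketch) to scene 1..5; 0 means no usable source."""
    ego_ok = (_detected(inp.cam_ego_left, FIRST_PRESET)
              and _detected(inp.cam_ego_right, FIRST_PRESET))
    adjacent_ok = (_detected(inp.cam_adjacent_left, FIRST_PRESET)
                   and _detected(inp.cam_adjacent_right, FIRST_PRESET))
    cam_edge_ok = _detected(inp.cam_edge_or_guardrail, THIRD_PRESET)
    any_radar_edge = (_detected(inp.front_radar_edge, THIRD_PRESET)
                      or _detected(inp.side_radar_edge, THIRD_PRESET))
    both_radar_edges = (_detected(inp.front_radar_edge, THIRD_PRESET)
                        and _detected(inp.side_radar_edge, THIRD_PRESET))
    avm_ok = _detected(inp.avm_left, THIRD_PRESET) and _detected(inp.avm_right, THIRD_PRESET)

    if ego_ok:
        return 1   # ego-lane lane lines are clear
    if adjacent_ok:
        return 2   # ego lines weak, adjacent-lane lines usable
    if cam_edge_ok and any_radar_edge and avm_ok:
        return 3   # camera sees only a road edge / guardrail
    if not cam_edge_ok and both_radar_edges and avm_ok:
        return 4   # camera backlit; radars and look-around camera usable
    if not avm_ok and both_radar_edges:
        return 5   # lane lines missing; radars (plus ADAS map curvature check) only
    return 0       # nothing usable: the assistance function should hand over to the driver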
(II) Scene processing
Fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence is handled per scene as follows.
Scene one: as shown in fig. 2, when Condition 1a and Condition 1b are both satisfied, the lane lines detected by the forward-looking camera are output directly; the lane line type is detection mode and the fusion confidence is high;
Condition 1a: the road class output by the ADAS map is highway or urban expressway;
Condition 1b: the forward-looking camera detects the lane lines of the ego lane, and their confidence is greater than the first preset value (for example, 90%);
Scene two: as shown in fig. 3, when Conditions 2a to 2c are all satisfied, the ego-lane lane lines are reconstructed by translating the lane lines of the left and right adjacent lanes, with the ego-lane width before loss as the reference, and the fused lane lines are output; the lane line type is prediction mode and the fusion confidence is high;
Condition 2a: the road class output by the ADAS map is highway or urban expressway;
Condition 2b: the forward-looking camera detects the lane lines of the ego lane, but their confidence is smaller than the first preset value (for example, 90%);
Condition 2c: the forward-looking camera detects the lane lines of the left and right adjacent lanes with confidence greater than the first preset value (for example, 90%), and the error between the currently detected left and right lane widths and the left and right lane widths recorded when the ego-lane confidence was still high is smaller than the second preset value (for example, 5%);
Scene three: as shown in fig. 4, when Conditions 3a to 3f are all satisfied, the road edge or guardrail detected by the forward-looking camera (i.e. its curvature equation) is translated, with the lane width detected by the look-around camera as the reference, and the fused lane lines are output; the lane line type is prediction mode and the fusion confidence is medium;
Condition 3a: the road class output by the ADAS map is highway or urban expressway;
Condition 3b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value (for example, 90%);
Condition 3c: the forward-looking camera detects a road edge or guardrail with confidence greater than the third preset value (for example, 95%);
Condition 3d: the forward millimeter wave radar or the lateral millimeter wave radar detects a road edge or guardrail with confidence greater than the third preset value (for example, 95%);
Condition 3e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value (for example, 95%);
Condition 3f: the curvature error between the road edge or guardrail detected by the forward-looking camera and that detected by the forward or lateral millimeter wave radar is within the fourth preset value (for example, 10%);
Scene four: as shown in fig. 5, when Conditions 4a to 4f are all satisfied, the road edges or guardrails detected by the lateral and forward millimeter wave radars are translated, with the lane width detected by the look-around camera as the reference, and the fused lane lines are output; the lane line type is prediction mode and the fusion confidence is medium;
Condition 4a: the road class output by the ADAS map is highway or urban expressway;
Condition 4b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value (for example, 90%);
Condition 4c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value (for example, 95%);
Condition 4d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value (for example, 95%);
Condition 4e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value (for example, 95%);
Condition 4f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value (for example, 10%);
Scene five: as shown in fig. 6, when Conditions 5a to 5g are all satisfied, the road edges or guardrails detected by the lateral and forward millimeter wave radars are translated, with the lane width before loss as the reference, and the fused lane lines are output; the lane line type is prediction mode and the fusion confidence is low;
Condition 5a: the road class output by the ADAS map is highway or urban expressway;
Condition 5b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value (for example, 90%);
Condition 5c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value (for example, 95%);
Condition 5d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value (for example, 95%);
Condition 5e: the look-around camera cannot detect the lane lines on both sides, or the confidence of the output lane lines is smaller than the third preset value (for example, 95%);
Condition 5f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value (for example, 10%);
Condition 5g: the error between the road curvature output by the ADAS map and the curvature output by the lateral and forward millimeter wave radars is smaller than the fifth preset value (for example, 20%). A sketch of the boundary-translation step shared by scenes two to five follows.
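A minimal sketch of the boundary-translation step shared by scenes two to five, under the assumption (carried over from the earlier sketches) that every boundary is a cubic polynomial y(x) = c0 + c1*x + c2*x^2 + c3*x^3 in the vehicle frame. Shifting only the constant term c0 is a small-curvature approximation, and the sign convention (positive y toward the vehicle's left) is assumed, not taken from the patent.

def translate_boundary(b, lateral_offset):
    """Shift a Boundary (earlier sketch) sideways by lateral_offset metres.
    Only the constant term c0 is shifted, i.e. a small-curvature approximation;
    positive offsets are assumed to point toward the vehicle's left."""
    return Boundary(b.c0 + lateral_offset, b.c1, b.c2, b.c3, b.confidence)

def virtual_lane_from_edge(edge, lane_width, edge_is_on_left):
    """Scenes three to five: build virtual left/right lane lines from a single road
    edge or guardrail, using the reference lane width (from the look-around camera,
    or the width recorded before the lane lines were lost)."""
    if edge_is_on_left:
        left = edge
        right = translate_boundary(edge, -lane_width)
    else:
        right = edge
        left = translate_boundary(edge, +lane_width)
    return left, right

def ego_lane_from_adjacent(left_outer, right_outer, left_lane_width, right_lane_width):
    """Scene two (one possible reading): the ego-lane left line is the left adjacent
    lane's outer line shifted inward by that lane's width, and symmetrically on the
    right; the pre-loss ego-lane width can then serve as a consistency check."""
    left = translate_boundary(left_outer, -left_lane_width)
    right = translate_boundary(right_outer, +right_lane_width)
    return left, right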
In this embodiment, the first to fifth preset values can be adjusted as appropriate to actual conditions; one possible configuration is sketched below.
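One possible way to collect these tunable values, with the recommended defaults; the class and field names are illustrative only.

from dataclasses import dataclass

@dataclass
class FusionThresholds:
    first_preset: float = 0.90    # lane line confidence threshold
    second_preset: float = 0.05   # allowed error on the adjacent-lane widths
    third_preset: float = 0.95    # road edge / guardrail / look-around confidence threshold
    fourth_preset: float = 0.10   # allowed curvature error between camera and radar boundaries
    fifth_preset: float = 0.20    # allowed error between ADAS-map and radar curvature

# The recommended defaults above would be tuned per vehicle platform and sensor set,
# e.g. FusionThresholds(first_preset=0.85) for a camera with a different confidence scale.
default_thresholds = FusionThresholds()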
(III) Fusion decision
As shown in fig. 7, the vehicle is laterally controlled according to the fused lane lines, the lane line type and the fusion confidence, specifically:
when the fused lane lines are in detection mode and the fusion confidence is high, lateral control based on the lane lines is performed for a long period;
when the fused lane lines are in prediction mode and the fusion confidence is high, lateral control based on the lane lines is performed for a long period (for example, within 2 km or 50 s of driving, adjustable according to actual conditions);
when the fused lane lines are in prediction mode and the fusion confidence is medium, lateral control based on the lane lines is performed for a medium period (for example, within 800 m or 20 s of driving, adjustable according to actual conditions);
when the fused lane lines are in prediction mode and the fusion confidence is low, lateral control based on the lane lines is performed only for a short period (for example, within 150 m or 10 m of driving, adjustable according to actual conditions); this short-period control keeps the vehicle stable until the driver takes over. A sketch of this hold-time logic follows.
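A minimal sketch of this fusion decision, assuming the mode and confidence strings of the earlier sketches; the distance limits mirror the example figures in the text (2 km, 800 m, 150 m), the time limits are omitted for brevity, and the class and method names are hypothetical.

# Maximum distance (metres) over which predicted (virtual) lane lines may be used,
# per fusion confidence; values mirror the example figures in the text.
PREDICTION_HOLD_M = {
    "high": 2000.0,    # roughly "within 2 km of driving"
    "medium": 800.0,   # "within 800 m of driving"
    "low": 150.0,      # short hold: keep the vehicle stable until the driver takes over
}

class LateralControlSupervisor:
    """Tracks how far the vehicle has driven on predicted lane lines and decides
    whether lateral control may continue or a driver take-over must be requested."""

    def __init__(self):
        self.distance_on_prediction = 0.0

    def may_continue(self, mode, confidence, distance_step_m):
        """mode/confidence come from the fused lane lines of the earlier sketches;
        distance_step_m is the distance driven since the previous call."""
        if mode == "detection":
            self.distance_on_prediction = 0.0   # real lane lines are back: reset the budget
            return True
        self.distance_on_prediction += distance_step_m
        limit = PREDICTION_HOLD_M.get(confidence, 0.0)
        return self.distance_on_prediction <= limit

# Example: after about 900 m on medium-confidence predicted lane lines, may_continue
# returns False and the system should request a driver take-over.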
In this embodiment, a lane line fusion and lateral control system includes:
the forward-looking camera is used for detecting guardrail information, road edge information, lane lines and lane line confidence;
the forward millimeter wave radar is used for detecting road edge information and guardrail information;
the lateral millimeter wave radar is used for detecting road edge information and guardrail information;
the ADAS map is used for outputting road curvature and road class;
the look-around camera is used for detecting lane lines and lane line confidence;
a memory is used for storing a computer-readable program;
a controller (in this embodiment comprising a fusion processing unit and a lateral control unit) is used for receiving the data output by the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, and is electrically connected to the memory, the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, respectively; when invoked by the controller, the computer-readable program is capable of performing the steps of the lane line fusion and lateral control method described in this embodiment.
In this embodiment, a vehicle adopts the lane line fusion and lateral control system described in this embodiment.
In this embodiment, a storage medium stores a computer readable program, and the computer readable program can execute the steps of the lane line fusion and lateral control method described in this embodiment when being called.

Claims (8)

1. A lane line fusion and lateral control method, characterized by comprising the following steps:
acquiring the guardrail information, road edge information, lane lines and lane line confidence detected by a forward-looking camera;
acquiring the road edge information and guardrail information detected by a forward millimeter wave radar;
acquiring the road edge information and guardrail information detected by a lateral millimeter wave radar;
acquiring the road curvature and road class output by the ADAS map;
acquiring the lane lines and lane line confidence detected by a look-around camera;
fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence;
and laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence.
2. The lane line fusion and lateral control method according to claim 1, wherein fusing the lane lines based on the lane lines, lane line confidence, guardrail information, road edge information, road curvature and road class, and outputting the fused lane lines, the lane line type and the fusion confidence specifically comprises:
(1) when Condition 1a and Condition 1b are both met, directly outputting the lane lines detected by the forward-looking camera; the lane line type is detection mode and the fusion confidence is high;
Condition 1a: the road class output by the ADAS map is highway or urban expressway;
Condition 1b: the forward-looking camera detects the lane lines of the ego lane, and their confidence is greater than a first preset value;
(2) when Conditions 2a to 2c are all met, reconstructing the ego-lane lane lines by translating the lane lines of the left and right adjacent lanes, with the ego-lane width before loss as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is high;
Condition 2a: the road class output by the ADAS map is highway or urban expressway;
Condition 2b: the forward-looking camera detects the lane lines of the ego lane, but their confidence is smaller than the first preset value;
Condition 2c: the forward-looking camera detects the lane lines of the left and right adjacent lanes with confidence greater than the first preset value, and the error between the currently detected left and right lane widths and the left and right lane widths recorded when the ego-lane confidence was still high is smaller than a second preset value;
(3) when Conditions 3a to 3f are all met, translating the road edge or guardrail detected by the forward-looking camera, with the lane width detected by the look-around camera as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is medium;
Condition 3a: the road class output by the ADAS map is highway or urban expressway;
Condition 3b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 3c: the forward-looking camera detects a road edge or guardrail with confidence greater than a third preset value;
Condition 3d: the forward millimeter wave radar or the lateral millimeter wave radar detects a road edge or guardrail with confidence greater than the third preset value;
Condition 3e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value;
Condition 3f: the curvature error between the road edge or guardrail detected by the forward-looking camera and that detected by the forward or lateral millimeter wave radar is within a fourth preset value;
(4) when Conditions 4a to 4f are all met, translating the road edges or guardrails detected by the lateral and forward millimeter wave radars, with the lane width detected by the look-around camera as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is medium;
Condition 4a: the road class output by the ADAS map is highway or urban expressway;
Condition 4b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 4c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value;
Condition 4d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value;
Condition 4e: the look-around camera detects the lane lines on both sides with confidence greater than the third preset value;
Condition 4f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value;
(5) when Conditions 5a to 5g are all met, translating the road edges or guardrails detected by the lateral and forward millimeter wave radars, with the lane width before loss as the reference, and outputting the fused lane lines; the lane line type is prediction mode and the fusion confidence is low;
Condition 5a: the road class output by the ADAS map is highway or urban expressway;
Condition 5b: the forward-looking camera cannot detect the lane lines, or the confidence of the output lane lines is smaller than the first preset value;
Condition 5c: the forward-looking camera cannot detect a road edge or guardrail, or their confidence is smaller than the third preset value;
Condition 5d: the forward millimeter wave radar and the lateral millimeter wave radar both detect the road edge or guardrail, with confidences greater than the third preset value;
Condition 5e: the look-around camera cannot detect the lane lines on both sides, or the confidence of the output lane lines is smaller than the third preset value;
Condition 5f: the curvature error between the road edges or guardrails detected by the forward millimeter wave radar and the lateral millimeter wave radar is within the fourth preset value;
Condition 5g: the error between the road curvature output by the ADAS map and the curvature output by the lateral and forward millimeter wave radars is smaller than a fifth preset value.
3. The lane line fusion and lateral control method according to claim 2, wherein laterally controlling the vehicle according to the fused lane lines, the lane line type and the fusion confidence specifically comprises:
when the fused lane lines are in detection mode and the fusion confidence is high, performing lateral control of the vehicle based on the lane lines for a long period;
when the fused lane lines are in prediction mode and the fusion confidence is high, performing lateral control of the vehicle based on the lane lines for a long period;
when the fused lane lines are in prediction mode and the fusion confidence is medium, performing lateral control of the vehicle based on the lane lines for a medium period;
when the fused lane lines are in prediction mode and the fusion confidence is low, performing lateral control of the vehicle based on the lane lines only for a short period.
4. The lane line fusion and lateral control method of claim 3, wherein the long period is within 2 km or 50 s of driving; the medium period is within 800 m or 20 s of driving; the short period is within 150 m or 10 m of driving.
5. The lane line fusion and lateral control method of claim 4, wherein: the first preset value is 90%; the second preset value is 5%; the third preset value is 95%; the fourth preset value is 10%; the fifth preset value is 20%.
6. A lane line fusion and lateral control system, characterized by comprising:
a forward-looking camera for detecting guardrail information, road edge information, lane lines and lane line confidence;
a forward millimeter wave radar for detecting road edge information and guardrail information;
a lateral millimeter wave radar for detecting road edge information and guardrail information;
an ADAS map for outputting road curvature and road class;
a look-around camera for detecting lane lines and lane line confidence;
a memory for storing a computer-readable program; and
a controller for receiving the data output by the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, the controller being electrically connected to the memory, the forward-looking camera, the forward millimeter wave radar, the lateral millimeter wave radar, the ADAS map and the look-around camera, respectively; wherein, when invoked by the controller, the computer-readable program is capable of performing the steps of the lane line fusion and lateral control method according to any one of claims 1 to 5.
7. A vehicle, characterized in that: the lane line fusion and lateral control system of claim 6 is employed.
8. A storage medium having a computer-readable program stored therein, characterized in that: the computer readable program when invoked is capable of performing the steps of the lane line fusion and lateral control method of any of claims 1 to 5.
CN202011323037.XA 2020-11-23 2020-11-23 Lane line fusion and transverse control method, system, vehicle and storage medium Active CN112373474B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011323037.XA CN112373474B (en) 2020-11-23 2020-11-23 Lane line fusion and transverse control method, system, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011323037.XA CN112373474B (en) 2020-11-23 2020-11-23 Lane line fusion and transverse control method, system, vehicle and storage medium

Publications (2)

Publication Number, Publication Date
CN112373474A, 2021-02-19
CN112373474B, 2022-05-17

Family

ID=74587445

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011323037.XA Active CN112373474B (en) 2020-11-23 2020-11-23 Lane line fusion and transverse control method, system, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN112373474B (en)


Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062747A1 (en) * 2010-07-20 2012-03-15 Gm Global Technology Operations, Inc. Lane fusion system using forward-view and rear-view cameras
US20150203114A1 (en) * 2014-01-22 2015-07-23 Honda Research Institute Europe Gmbh Lane relative position estimation method and system for driver assistance systems
DE102016220717A1 (en) * 2016-10-21 2018-05-09 Volkswagen Aktiengesellschaft Determining a lane and lateral control for a vehicle
CN108189838A (en) * 2017-12-30 2018-06-22 吉利汽车研究院(宁波)有限公司 A kind of pattern of fusion adaptive cruise curve control method and device
CN108764187A (en) * 2018-06-01 2018-11-06 百度在线网络技术(北京)有限公司 Extract method, apparatus, equipment, storage medium and the acquisition entity of lane line
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion
CN110969059A (en) * 2018-09-30 2020-04-07 长城汽车股份有限公司 Lane line identification method and system
CN109492566A (en) * 2018-10-31 2019-03-19 奇瑞汽车股份有限公司 Lane position information acquisition method, device and storage medium
WO2020103892A1 (en) * 2018-11-21 2020-05-28 北京市商汤科技开发有限公司 Lane line detection method and apparatus, electronic device, and readable storage medium
WO2020146983A1 (en) * 2019-01-14 2020-07-23 深圳市大疆创新科技有限公司 Lane detection method and apparatus, lane detection device, and mobile platform
US20200349363A1 (en) * 2019-05-02 2020-11-05 Miguel Hurtado Method and system for estimating lane lines in vehicle advanced driver assistance driver assistance systems
TWI694019B (en) * 2019-06-05 2020-05-21 國立中正大學 Lane line detection and tracking method
CN110781816A (en) * 2019-10-25 2020-02-11 北京行易道科技有限公司 Method, device, equipment and storage medium for transverse positioning of vehicle in lane
CN111291676A (en) * 2020-02-05 2020-06-16 清华大学 Lane line detection method and device based on laser radar point cloud and camera image fusion and chip
CN111401446A (en) * 2020-03-16 2020-07-10 重庆长安汽车股份有限公司 Single-sensor and multi-sensor lane line rationality detection method and system and vehicle
CN111516673A (en) * 2020-04-30 2020-08-11 重庆长安汽车股份有限公司 Lane line fusion system and method based on intelligent camera and high-precision map positioning
CN111860155A (en) * 2020-06-12 2020-10-30 华为技术有限公司 Lane line detection method and related equipment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
HUANG HAIBO et al.: "Lane recognition and tracking based on multi-features fuzzy fusion and particle filter", 2011 International Conference on Consumer Electronics, Communications and Networks (CECNET) *
LI YUSHAN et al.: "Research on a lane compensation method based on multi-sensor fusion", Sensors *
LIU Xiaolong et al.: "Lane line detection method based on mutual constraint of global and local models with model uncertainty evaluation", Journal of Southeast University (Natural Science Edition) *
HUI Fei et al.: "Lane-change assistant driving decision method based on dynamic probability grid and Bayesian decision network", Journal of Traffic and Transportation Engineering *
CHEN Yang: "Research on lane line detection methods in complex environments based on deep learning", Master's thesis, Shanghai University of Engineering Science *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023005354A1 (en) * 2021-07-30 2023-02-02 驭势(上海)汽车科技有限公司 Vehicle control method and apparatus, electronic device, and storage medium
CN114368393A (en) * 2021-12-21 2022-04-19 重庆长安汽车股份有限公司 Lane line loss early warning method and system on straight lane and man-machine driving method
CN114368393B (en) * 2021-12-21 2023-09-15 重庆长安汽车股份有限公司 Lane line loss early warning method and system on straight road and man-machine co-driving method
CN114353817A (en) * 2021-12-28 2022-04-15 重庆长安汽车股份有限公司 Multi-source sensor lane line determination method, system, vehicle and computer-readable storage medium
CN114353817B (en) * 2021-12-28 2023-08-15 重庆长安汽车股份有限公司 Multi-source sensor lane line determination method, system, vehicle and computer readable storage medium
CN114396958A (en) * 2022-02-28 2022-04-26 重庆长安汽车股份有限公司 Lane positioning method and system based on multiple lanes and multiple sensors and vehicle
CN114396958B (en) * 2022-02-28 2023-08-18 重庆长安汽车股份有限公司 Lane positioning method and system based on multiple lanes and multiple sensors and vehicle

Also Published As

Publication number Publication date
CN112373474B (en) 2022-05-17

Similar Documents

Publication Publication Date Title
CN112373474B (en) Lane line fusion and transverse control method, system, vehicle and storage medium
EP3859280A1 (en) Traffic lane line fitting method and system
CN107389084B (en) Driving path planning method and storage medium
CN110775057A (en) Lane assisting method for analyzing and controlling steering torque based on vehicle-mounted blind zone visual scene
WO2023197408A1 (en) Method and apparatus for determining vehicle speed control model training sample
CN113460086A (en) Control system, method, vehicle and storage medium for automatically driving to enter ramp
CN110412980A (en) Automatic driving and line control method
CN113415275A (en) Vehicle message processing method and device, readable medium and electronic equipment
CN114802234A (en) Road edge avoiding method and system in intelligent cruise
CN110784680B (en) Vehicle positioning method and device, vehicle and storage medium
US20230032741A1 (en) Road model generation method and device
JP5682302B2 (en) Traveling road estimation device, method and program
CN114516327A (en) Self-learning vehicle following system and method based on driver behavior learning and surrounding environment
CN115489530A (en) Lane changing method and device for vehicle, vehicle and storage medium
CN114954452A (en) Vehicle speed control method and system based on adaptive cruise
CN114464005A (en) Method and system for assisting driving of vehicle
CN114842432A (en) Automobile light control method and system based on deep learning
CN115431981B (en) Driving auxiliary identification system based on high-precision map
CN110531347A (en) Detection method, device and the computer readable storage medium of laser radar
CN114132317B (en) Intelligent curve side driving control method, system, vehicle and storage medium
CN117037538B (en) System for determining AGS distance of special expressway of small bus
CN116691695A (en) Driving state judging method, computer equipment, readable storage medium and motor vehicle
CN116758502A (en) Intersection identification method, device, equipment and storage medium
JP2023184097A (en) Driving support system, driving support method, and driving support program
WO2022078670A1 (en) Vehicle control method and device, computer storage medium and vehicle

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant