CN112183415A - Lane line processing method, vehicle, and readable storage medium - Google Patents

Lane line processing method, vehicle, and readable storage medium

Info

Publication number
CN112183415A
Authority
CN
China
Prior art keywords
lane line
vehicle
lane
confidence threshold
road
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011068706.3A
Other languages
Chinese (zh)
Other versions
CN112183415B (en)
Inventor
廖尉华
邓琬云
蒋祖坚
胡旺
罗覃月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SAIC GM Wuling Automobile Co Ltd
Original Assignee
SAIC GM Wuling Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SAIC GM Wuling Automobile Co Ltd filed Critical SAIC GM Wuling Automobile Co Ltd
Priority to CN202011068706.3A priority Critical patent/CN112183415B/en
Publication of CN112183415A publication Critical patent/CN112183415A/en
Application granted granted Critical
Publication of CN112183415B publication Critical patent/CN112183415B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 - Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 - Estimation or calculation of non-directly measurable driving parameters related to ambient conditions
    • B60W40/06 - Road conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/25 - Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Mathematical Physics (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a lane line processing method, a vehicle, and a readable storage medium. The lane line processing method comprises the following steps: during the driving of the vehicle, acquiring a lane image of the road on which the vehicle travels, and obtaining the corresponding lane line type, lane line existence time, and road edge information from the image; analyzing the lane line type, the road participant state information, and the road edge information to obtain a road type; analyzing the road type and the lane line existence time to obtain a dynamic confidence threshold; judging whether the reliability of the actual lane line is smaller than the dynamic confidence threshold; and if the reliability of the actual lane line is smaller than the dynamic confidence threshold, simulating a virtual lane line according to the road type and the lane line correspondingly obtained. By judging the confidence threshold of the lane line detected by the vehicle-mounted vision sensor device and applying different processing according to the judgment result, the invention obtains a clear and reliable lane line.

Description

Lane line processing method, vehicle, and readable storage medium
Technical Field
The present invention relates to the field of vehicles, and in particular, to a lane line processing method, a vehicle, and a computer-readable storage medium.
Background
At present, most domestic and foreign vehicles equipped with advanced driver assistance systems (ADAS) perform confidence threshold judgment and filtering in a back-end system on the lane line detected by the vehicle-mounted vision sensor device, so that the vehicle's ADAS can process the detected lane line according to the judgment and filtering results and adjust the driving of the vehicle accordingly. However, the lane lines detected by the vehicle-mounted vision sensor device are often incomplete, intermittent, or even too blurred to identify, so that the ADAS cannot acquire accurate lane line information and therefore cannot adjust the vehicle correctly. This degrades the user's driving experience and, in serious cases, can cause traffic accidents.
Disclosure of Invention
The invention mainly aims to provide a lane line processing method, a vehicle, and a readable storage medium, so as to solve the problem of unstable lane line recognition by the vehicle-mounted vision sensor device caused by unclear or incomplete lane lines.
In order to achieve the above object, the present invention provides a lane line processing method, including the steps of:
during the driving of the vehicle, acquiring a lane image of the road on which the vehicle travels, and obtaining the corresponding lane line type, lane line existence time, and road edge information from the image;
analyzing the lane line type and the road edge information to obtain a road type;
analyzing the road type and the lane line existence time to obtain a dynamic confidence threshold;
obtaining the corresponding lane line reliability from the lane image, and judging whether the reliability of the actual lane line is smaller than the dynamic confidence threshold;
and if the reliability of the actual lane line is smaller than the dynamic confidence threshold, simulating a virtual lane line according to the road type and the lane line correspondingly obtained.
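A minimal sketch of how the five steps above could fit together. This is illustrative only: every function name, category label, cut-off, and threshold value below is a hypothetical stand-in chosen for the example, not disclosed by the patent.

```python
# Hypothetical sketch of the claimed processing loop.
from dataclasses import dataclass

@dataclass
class LaneObservation:
    line_type: str        # e.g. "dashed", "solid" (example labels)
    existence_time: float # seconds the lane line stays in view
    road_edge: str        # e.g. "guardrail", "green_belt" (example labels)
    reliability: float    # 0..1 score reported by the vision sensor

def classify_road(line_type: str, road_edge: str) -> str:
    # Simplified stand-in for the patent's road-type analysis.
    if road_edge == "guardrail" and line_type in ("dashed", "solid_dashed"):
        return "highway"
    if road_edge == "green_belt":
        return "urban"
    return "rural"

def dynamic_threshold(road_type: str, existence_time: float) -> float:
    # Placeholder interval logic; the 0.5/0.7/0.9 values are only the
    # example statistics mentioned in the text, and the cut points in
    # seconds are invented for this sketch.
    cutoffs = {"highway": (1.0, 2.0), "urban": (1.5, 3.0), "rural": (2.0, 4.0)}
    lo, hi = cutoffs[road_type]
    if existence_time <= lo:
        return 0.9   # line seen only briefly: demand high reliability
    if existence_time <= hi:
        return 0.7
    return 0.5

def process(obs: LaneObservation) -> str:
    road_type = classify_road(obs.line_type, obs.road_edge)
    threshold = dynamic_threshold(road_type, obs.existence_time)
    if obs.reliability < threshold:
        return f"virtual:{road_type}:{obs.line_type}"  # simulate a virtual lane line
    return f"actual:{road_type}:{obs.line_type}"       # trust the detected line
```

For example, a dashed line seen for 2.5 s beside a guardrail with reliability 0.6 clears the (assumed) 0.5 threshold and is used as-is, while a briefly seen line falls back to a simulated virtual lane line.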
Preferably, the road type includes a high-speed/express lane; the dynamic confidence thresholds include a first dynamic confidence threshold, a second dynamic confidence threshold, and a third dynamic confidence threshold; and the step of analyzing the road type and the lane line existence time to obtain the corresponding dynamic confidence threshold comprises:
if the road type is a high-speed/express lane, judging whether the lane line existence time is greater than a first preset threshold;
if not, acquiring the first dynamic confidence threshold;
if so, judging whether the lane line existence time is greater than a second preset threshold;
if not, acquiring the second dynamic confidence threshold;
and if so, acquiring the third dynamic confidence threshold.
Preferably, the road type includes an urban lane, and the step of analyzing the road type and the lane line existence time to obtain the corresponding dynamic confidence threshold further includes:
if the road type is an urban lane, judging whether the lane line existence time is greater than a third preset threshold;
if not, acquiring the fourth dynamic confidence threshold;
if so, judging whether the lane line existence time is greater than a fourth preset threshold;
if not, acquiring the fifth dynamic confidence threshold;
and if so, acquiring the sixth dynamic confidence threshold.
Preferably, the road type includes a rural lane, and the step of analyzing the road type and the lane line existence time to obtain the corresponding dynamic confidence threshold further includes:
if the road type is a rural lane, judging whether the lane line existence time is greater than a fifth preset threshold;
if not, acquiring the seventh dynamic confidence threshold;
if so, judging whether the lane line existence time is greater than a sixth preset threshold;
if not, acquiring the eighth dynamic confidence threshold;
and if so, acquiring the ninth dynamic confidence threshold.
Preferably, the step of judging whether the reliability of the actual lane line is smaller than the dynamic confidence threshold further includes:
if the reliability of the actual lane line is greater than or equal to the dynamic confidence threshold, acquiring vehicle motion information and historical lane line information;
calculating a first predicted lane line from the vehicle motion information and the historical lane line information;
and fusing the acquired actual lane line with the first predicted lane line through a Kalman filtering algorithm to obtain a first fused lane line, controlling the vehicle motion according to the first fused lane line, and displaying the first fused lane line on the instrument display screen.
Preferably, if the reliability of the actual lane line is smaller than the dynamic confidence threshold, the step of simulating a virtual lane line according to the road type includes:
acquiring vehicle motion information and historical lane line information;
calculating a second predicted lane line from the vehicle motion information and the historical lane line information;
and fusing the virtual lane line with the second predicted lane line through a Kalman filtering algorithm to obtain a second fused lane line, controlling the vehicle motion according to the second fused lane line, and displaying the second fused lane line on the instrument display screen.
Preferably, the step of analyzing the road type according to the lane line type, the road participant state information, and the road edge information includes:
analyzing the road participant state information and the road type to obtain the scene of the vehicle;
acquiring the first fused lane line or the second fused lane line;
judging whether the scene of the vehicle is a first preset scene or a second preset scene, wherein the preset scenes include the first preset scene, the second preset scene, and a third preset scene;
and if the scene of the vehicle is the first preset scene or the second preset scene, performing offset processing on the first fused lane line or the second fused lane line.
Preferably, if the scene of the vehicle is a preset scene, the step of performing offset processing on the first fused lane line or the second fused lane line includes:
when the scene of the vehicle is the first preset scene, controlling the vehicle to drive centered toward the side of the user's intended driving area, and shifting the first fused lane line or the second fused lane line toward the side of the user's intended driving area;
and when the scene of the vehicle is the second preset scene, controlling the vehicle to drive at the center of the user's intended driving area, and shifting the first fused lane line or the second fused lane line toward the center of the user's intended driving area.
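As a rough illustration of the two offset cases above, the fused lane line's lateral position could be shifted toward the user's intended driving area as follows. The scene encoding, the `gain` parameter, and the metre-based positions are all assumptions made for this sketch, not part of the patent:

```python
def offset_fused_lane(fused_center: float, intent_center: float,
                      scene: int, gain: float = 0.5) -> float:
    """Shift a fused lane-line lateral position (metres, hypothetical)
    toward the user's intended driving area.

    scene 1: bias partway toward the side of the intended area;
    scene 2: center the line on the intended area;
    any other scene: leave the fused line unchanged.
    """
    if scene == 1:
        # partial shift toward the intended area, scaled by gain
        return fused_center + gain * (intent_center - fused_center)
    if scene == 2:
        # snap the lane line to the center of the intended driving area
        return intent_center
    return fused_center
```

With the fused line at 0.0 m and the intended area at 1.0 m, scene 1 moves the line halfway (0.5 m), scene 2 moves it fully, and the third scene leaves it untouched.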
Furthermore, to achieve the above object, the present invention also provides a vehicle comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the lane line processing method as described above.
In order to achieve the above object, the present invention further provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when executed by a processor, the computer program implements the steps of the lane line processing method as described above.
According to the lane line processing method provided by the embodiment of the invention, the lane line type and the road edge information are acquired and analyzed to obtain the type of road on which the vehicle is located, providing road data support for analyzing the vehicle's scene and obtaining the vehicle's dynamic confidence threshold. The lane line existence time is acquired and combined with the road type to obtain the dynamic confidence threshold; whether the reliability of the actual lane line detected by the vehicle is smaller than the dynamic confidence threshold is then judged to decide how to obtain a clear lane line; and, when needed, a virtual lane line is simulated according to the road type and the lane line correspondingly obtained, thereby ensuring driving safety and improving the user experience.
Drawings
FIG. 1 is a schematic structural view of a portion of a vehicle component according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart illustrating a lane line processing method according to a first embodiment of the present invention;
FIG. 3 is a first detailed flowchart of step S30;
FIG. 4 is a second detailed flowchart of step S30;
FIG. 5 is a third detailed flowchart of step S30;
FIG. 6 is a flowchart illustrating a lane line processing method according to a third embodiment of the present invention;
FIG. 7 is a flowchart illustrating a lane line processing method according to a fourth embodiment of the present invention;
FIG. 8 is a first detailed flowchart of step S710.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As shown in fig. 1, fig. 1 is a schematic structural view of a part of a vehicle according to an embodiment of the present invention.
The vehicle comprises a communication module 01, a memory 02, a processor 03, and the like. Those skilled in the art will appreciate that the vehicle shown in FIG. 1 may include more or fewer components than shown, may combine some components, or may arrange the components differently. The processor 03 is connected to the memory 02 and the communication module 01, respectively, and the memory 02 stores a computer program that is executed by the processor 03.
The communication module 01 may be connected to an external device through a network. The communication module 01 may receive data sent by an external device, and may also send data, instructions, and information to the external device, where the external device may be an electronic device such as a mobile phone, a tablet computer, a notebook computer, and a desktop computer.
The memory 02 may be used to store software programs and various data. It may mainly include a program storage area and a data storage area: the program storage area may store an operating system and the application programs required by at least one function, and the data storage area may store data or information created according to the use of the vehicle. Further, the memory 02 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device or flash memory device, or other solid-state storage devices.
The processor 03 is the control center of the vehicle. It connects the various parts of the entire vehicle through various interfaces and lines, and performs the vehicle's functions and processes data by running or executing the software programs and/or modules stored in the memory 02 and calling the data stored in the memory 02, thereby monitoring the vehicle as a whole. The processor 03 may include one or more processing units; preferably, it may integrate an application processor, which mainly handles the operating system, user interface, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor need not be integrated into the processor 03.
Although not shown in FIG. 1, the vehicle may further include a circuit control module configured to be connected to the power supply to implement power control and ensure the normal operation of the other components.
Those skilled in the art will appreciate that the vehicle configuration shown in FIG. 1 does not constitute a limitation of the vehicle, and may include more or fewer components than shown, or some components in combination, or a different arrangement of components.
Various embodiments of the method of the present invention are presented in terms of the above-described hardware architecture.
Referring to fig. 2, in a first embodiment of the lane line processing method of the present invention, the lane line processing method includes:
and step S10, acquiring a lane image of a vehicle driving road during the driving process of the vehicle, and acquiring the corresponding lane type, the existence time of the lane and the road edge information according to the image.
In the driving process of the vehicle, a camera on the vehicle is used for acquiring a lane image of a driving road of the vehicle, wherein the camera can be a camera on a driving recorder or a panoramic camera, or other cameras arranged on the vehicle.
The image of the lane on which the vehicle runs includes one or a combination of two or more of information of the type of lane line, the time when the lane line exists, and the color of the lane line, that is, the information about the lane line applied may be different in each embodiment. The road edge information comprises devices such as green belts, railings, stone piers and warning lines on two sides of the road, is used for limiting the lane range and distinguishing different safety fields for vehicle driving users and pedestrians.
In order to improve the accuracy of judgment, the corresponding lane line type, the existence time of the lane line, the state information of road participants and the road edge information can be obtained according to the image, namely the states of pedestrians, other vehicles and/or animals and the like on the driving road of the vehicle are increased, so that the accuracy of judgment is improved.
The road participant status information includes statuses of pedestrians, other vehicles and/or animals, etc. on the road within a certain distance of the user, such as speeds of the pedestrians, other vehicles and animals on the road, a distance from the user, etc. The distance of the user is within a range which can be detected by a camera and other sensors on the vehicle, and can be set by technicians in the field according to actual conditions, wherein the distance set by the technicians according to the actual conditions is smaller than the maximum distance which can be detected by the camera and other sensors; the other vehicles include both automotive vehicles and non-automotive vehicles, such as human-driven non-automotive vehicles like bicycles, electric vehicles, tricycles, and wheelchairs.
Step S20, analyzing according to the lane line type and the road edge information to obtain a road type;
the lane line types comprise various types such as a solid line, a dotted line, a solid line combined dotted line, a double solid line and the like, some lane line types can only appear on urban roads, and some lane line types can only appear on high-speed/express roads; in this embodiment, the road types are divided into three types, which are: high speed/express lanes, urban lanes, and rural lanes, implementations may be divided into other road types. The high-speed/high-speed lane is a lane with the lowest speed per hour limited to 60km/h and the highest speed per hour limited to 120km/h, the urban lane is a lane with the highest speed per hour of 80km/h, and the rural lane is a lane with the highest speed per hour of 60 km/h. The vehicle speed per hour of the road type where the vehicle is specifically located is different according to the speed limit sign beside the road or the highest speed per hour prompted by the vehicle-mounted navigator.
It should be noted that at some moments the user's vehicle may fail to obtain one or more of the lane line type, the road participant state information, and the road edge information, or even all of them, so that the corresponding road type cannot be determined. For example, when the user's vehicle cannot acquire any of the lane line type, the road participant state information, and the road edge information, the road type at the current moment is set to a rural lane; when the vehicle determines from the lane line type that it is on a high-speed/express lane at the current moment but cannot obtain the road participant state information and the road edge information, the road type at the current moment is set to a high-speed/express lane.
Step S30: analyzing according to the road type and the lane line existence time to obtain a dynamic confidence threshold.
The road types include three categories: high-speed/express lanes, urban lanes, and rural lanes. The lane line existence time is obtained by dividing the length of the lane line by the vehicle speed. The dynamic confidence threshold is a probability value that cannot be computed directly and must be obtained through statistics. In the invention, the dynamic confidence threshold takes three values: 0.5, 0.7, and 0.9; those skilled in the art will appreciate that these values may differ because different vehicle manufacturers use different criteria. Confidence refers to the probability that a population parameter value falls within a certain region of the sample statistic. Each road type corresponds to several dynamic confidence thresholds: the lane line existence time interval of each road type is divided into several sub-intervals by several preset thresholds, and each sub-interval corresponds to one dynamic confidence threshold.
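The existence-time definition above (lane line length divided by vehicle speed) can be written down directly; the only subtlety is the unit conversion from km/h to m/s:

```python
def existence_time(lane_line_length_m: float, speed_kmh: float) -> float:
    """Lane line existence time in seconds, per the text's definition:
    the length of the lane line divided by the vehicle speed."""
    speed_ms = speed_kmh / 3.6   # convert km/h to m/s
    return lane_line_length_m / speed_ms
```

For instance, a 6 m line segment passed at 72 km/h (20 m/s) exists for 0.3 s; slower speeds give longer existence times, which is why the sub-intervals below can be tied to speed bands.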
Step S40: acquiring the corresponding lane line reliability according to the lane image, and judging whether the reliability of the actual lane line is smaller than the dynamic confidence threshold value;
after the vehicle-mounted vision sensor device acquires the actual lane line, reliability judgment is carried out according to the acquired lane line, and confirmed reliability is obtained. And the credibility of the actual lane line is a probability value obtained by comprehensively judging the acquired information such as the type, the existence time, the definition and the like of the lane line.
Step S410: and if the credibility of the actual lane line is less than the dynamic confidence threshold, simulating a virtual lane line according to the road type and the lane line correspondingly obtained.
After the vehicle-mounted vision sensor device acquires the actual lane line, its reliability is obtained through comprehensive judgment and compared with the dynamic confidence threshold obtained in step S30. If the reliability of the actual lane line is smaller than the dynamic confidence threshold, the vehicle simulates a virtual lane line according to the acquired road type, combined with the lane line obtained for that road type. For example, if the road on which the vehicle is currently located is a highway and the lane line obtained for that lane is a combined solid-dashed line, a combined solid-dashed highway lane line is simulated.
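A toy version of the virtual lane line simulation described above might synthesize straight-ahead points using a dash geometry chosen per road type. The geometry table, lateral offset, and point spacing are invented for this sketch; real lane-marking standards would supply the actual dimensions:

```python
# Hypothetical dash geometry per road type: (dash length m, gap m, width m).
STANDARD_GEOMETRY = {
    "high-speed/express": (6.0, 9.0, 0.15),
    "urban": (4.0, 6.0, 0.15),
    "rural": (2.0, 4.0, 0.10),
}

def simulate_virtual_lane(road_type: str, line_type: str,
                          lateral_offset_m: float = 1.75,
                          length_m: float = 30.0, step_m: float = 1.0):
    """Return (x, y) points of a simulated lane line running straight
    ahead at a fixed lateral offset; dashed lines leave gaps according
    to the (assumed) standard geometry of the road type."""
    dash, gap, _width = STANDARD_GEOMETRY[road_type]
    pts = []
    x = 0.0
    while x <= length_m:
        # a solid line is drawn everywhere; a dashed line only within
        # the first `dash` metres of each dash+gap period
        if line_type == "solid" or (x % (dash + gap)) < dash:
            pts.append((x, lateral_offset_m))
        x += step_m
    return pts
```

A solid urban line sampled over 5 m yields six points at a constant lateral offset, while the express-lane dashed pattern leaves 9 m gaps between 6 m dashes.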
In this embodiment, the vehicle analyzes the lane line type, the road participant state information, and the road edge information so that it can perform scene judgment and comprehensive judgment to obtain a dynamic confidence threshold, providing a road basis for lane line processing. The reliability of the actual lane line is obtained through analysis and compared with the dynamic confidence threshold to judge whether the acquired actual lane line is clear enough. If it is, the vehicle controls its driving according to the actual lane line; if the actual lane line is not clear or complete enough, a virtual lane line is simulated according to the road type so that the vehicle can control its driving according to the virtual lane line.
Further, referring to fig. 3, 4 and 5, a second embodiment of the lane line processing method of the present invention is proposed based on the first embodiment. In this embodiment, referring to fig. 3, step S30 includes:
Step S310: if the road type is a high-speed/express lane, judging whether the lane line existence time is greater than a first preset threshold;
Step S311: if not, acquiring the first dynamic confidence threshold;
Step S312: if so, judging whether the lane line existence time is greater than a second preset threshold;
Step S313: if not, acquiring the second dynamic confidence threshold;
Step S314: if so, acquiring the third dynamic confidence threshold.
the first preset threshold and the second preset threshold are two thresholds calculated according to the limited speed per hour of the high speed/fast lane and the lane line of the national standard, for example, the first preset threshold may be a threshold obtained by dividing the length of the lane line meeting the national standard by 80km/h, the second preset threshold may be a threshold obtained by dividing the length of the lane line meeting the national standard by 100km/h, the two preset thresholds divide the time information corresponding to the vehicle speed of 60km/h to 120km/h into three time intervals, the time information corresponding to each vehicle speed is necessarily located in a certain interval, each interval is provided with a corresponding confidence threshold, so that by comparing the existence time of the lane line with the two preset thresholds, which interval of the three time intervals the existence time of the lane line is located can be determined, and further acquiring a dynamic confidence threshold corresponding to the interval, for example, if the time information corresponding to 70km/h is between the time information corresponding to 60km/h and the time information corresponding to 80km/h, acquiring that the dynamic confidence threshold corresponding to the time interval is 0.5. Meanwhile, as can be understood by those skilled in the art, the value criteria of the first preset threshold and the second preset threshold may have different criteria, and may be set by the skilled person.
Further, referring to fig. 4, step S30 further includes:
Step S320: if the road type is an urban lane, judging whether the lane line existence time is greater than a third preset threshold;
Step S321: if not, acquiring the fourth dynamic confidence threshold;
Step S322: if so, judging whether the lane line existence time is greater than a fourth preset threshold;
Step S323: if not, acquiring the fifth dynamic confidence threshold;
Step S324: if so, acquiring the sixth dynamic confidence threshold.
The third preset threshold and the fourth preset threshold are two thresholds calculated from the speed limits of the urban lane and the national-standard lane line; the method for obtaining the specific values is similar to that of fig. 3 and is not repeated here.
Further, referring to fig. 5, step S30 further includes:
Step S330: if the road type is a rural lane, judging whether the lane line existence time is greater than a fifth preset threshold;
Step S331: if not, acquiring the seventh dynamic confidence threshold;
Step S332: if so, judging whether the lane line existence time is greater than a sixth preset threshold;
Step S333: if not, acquiring the eighth dynamic confidence threshold;
Step S334: if so, acquiring the ninth dynamic confidence threshold.
The fifth preset threshold and the sixth preset threshold are two thresholds calculated from the speed limits of the rural lane and the national-standard lane line; the method for obtaining the specific values is similar to that of fig. 3 and is not repeated here.
In this embodiment, the lane line existence time corresponding to each road type is divided into three time intervals by two preset thresholds, and each time interval corresponds to one dynamic confidence threshold, so that different dynamic confidence thresholds are obtained for different road types and the clarity of the lane line is improved.
Further, referring to fig. 6, a third embodiment of the lane line processing method of the present invention is proposed based on the first embodiment. In this embodiment, step S40 is followed by:
Step S420: if the reliability of the actual lane line is greater than or equal to the dynamic confidence threshold, acquiring vehicle motion information and historical lane line information.
If the reliability of the actual lane line acquired by the vehicle is greater than or equal to the dynamic confidence threshold, the actual lane line is sufficient for controlling the vehicle's driving. In the present invention, the vehicle motion information refers to the inclination of the vehicle, including roll, forward pitch, backward pitch, and so on. For example, the vehicle may lean left or right during a turn, and may pitch forward or backward when ascending or descending a slope. The historical lane line information refers to lane lines that were processed by this method and stored before the current lane line processing cycle.
Step S440: obtaining a first predicted lane line through calculation according to the vehicle motion information and the historical lane line information;
The Kalman filtering algorithm uses a linear system state equation, together with the system's inputs and observed outputs, to optimally estimate the system state. The first predicted lane line is obtained by feeding in the historical lane line information, computing a predicted lane line, and adjusting the result with the aid of the vehicle motion information. The "first" in "first predicted lane line" merely identifies the lane line obtained in step S440 and is not an ordinal.
Step S460: and fusing the obtained actual lane line with the first predicted lane line through a Kalman filtering algorithm to obtain a first fused lane line, controlling the vehicle to move according to the first fused lane line, and displaying the first fused lane line through an instrument display screen.
The obtained actual lane line is fused with the first predicted lane line: taking the actual lane line as the primary source, the Kalman filtering algorithm combines it with the matching data of the first predicted lane line to form the first fused lane line, which is stored as a historical lane line and displayed through the instrument display screen.
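The predict/update cycle of this fusion step can be sketched in a deliberately simplified form, with the lane line reduced to a single state (a lateral offset in metres) predicted from the previous fused value plus a motion-induced shift, then corrected with the actually detected lane line. A real system would track a full polynomial lane model; the noise values below are illustrative assumptions.

```python
# Minimal, hypothetical sketch of one Kalman predict/update cycle for the
# lane line fusion described in the embodiment. State: a single lateral
# offset; all noise variances are illustrative assumptions.

def kalman_fuse(prev_offset, prev_var, motion_shift, measured_offset,
                process_var=0.05, meas_var=0.10):
    """One predict/update cycle; returns (fused_offset, fused_var)."""
    # Predict: propagate the historical lane line using vehicle motion.
    pred_offset = prev_offset + motion_shift
    pred_var = prev_var + process_var
    # Update: fuse the prediction with the detected (actual) lane line,
    # weighting by the Kalman gain.
    gain = pred_var / (pred_var + meas_var)
    fused_offset = pred_offset + gain * (measured_offset - pred_offset)
    fused_var = (1.0 - gain) * pred_var
    return fused_offset, fused_var
```

The same cycle applies to the second fused lane line of step S470, with the virtual lane line taking the place of the measurement.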
Further, step S410 is followed by:
step S430: acquiring vehicle motion information and historical lane line information;
the vehicle motion information and the historical lane line information are as described above and will not be described herein.
Step S450: and calculating to obtain a second predicted lane line according to the vehicle motion information and the historical lane line information.
The principle of the method in step S450 is similar to that in step S440, and therefore, the description thereof is omitted.
Step S470: and fusing the virtual lane line and the second predicted lane line through a Kalman filtering algorithm to obtain a second fused lane line, controlling the vehicle to move according to the second fused lane line, and displaying the second fused lane line through an instrument display screen.
The virtual lane line is fused with the second predicted lane line: taking the virtual lane line as the primary source, the Kalman filtering algorithm combines it with the matching data of the second predicted lane line to form the second fused lane line, which is stored as a historical lane line and displayed through the instrument display screen.
It should be noted that step S410 and step S420 are mutually exclusive; only one of them is executed in any cycle. If step S410 is executed, step S420 and its subsequent steps are not executed; if step S420 is executed, step S410 and its subsequent steps are not executed.
In this embodiment, after the reliability of the actual lane line is compared with the dynamic confidence threshold, the vehicle motion information and the historical lane line information of the vehicle are obtained and a predicted lane line is calculated. The predicted lane line is then fused with the actual lane line or the virtual lane line by Kalman filtering to obtain a fused lane line, so that the lane line is filtered into a more stable and accurate result.
Further, referring to fig. 7 and 8, a fourth embodiment of the lane line processing method according to the present invention is proposed according to the first embodiment of the lane line processing method according to the present invention, and in this embodiment, referring to fig. 7, step S20 is followed by:
step S50: analyzing according to the state information of the road participants and the road type to obtain the scene of the vehicle at the current moment;
The scene of the vehicle at the current moment refers to the relationship between the user's vehicle and the surrounding road participants and road edges. For example, the conditions in front of, behind, and on either side of the vehicle can be judged from the state information of the road participants, and whether unstable or unsafe factors exist around the vehicle can be analyzed; the road type indicates on which kind of road those factors occur.
Step S60: acquiring a first fusion lane line or a second fusion lane line;
It should be noted that only one of the first fused lane line and the second fused lane line can be obtained in a given cycle; they are never obtained simultaneously.
Step S70: judging whether a scene where the vehicle is located at the current moment is a preset scene or not, wherein the preset scene comprises a first preset scene and a second preset scene;
The preset scenes comprise a first preset scene and a second preset scene. In this embodiment, the first preset scene is a situation in which the vehicle is constrained on both sides, for example when there are vehicles on both sides of the user's vehicle, or a vehicle on one side and a close road edge on the other; the second preset scene is a situation in which the vehicle is constrained on one side, for example a vehicle on one side, or a close road edge on one side.
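The distinction between the two preset scenes can be sketched as a small classifier over side occupancy and road-edge proximity. The field names and the edge-distance cutoff are assumptions introduced for illustration, not values from the patent.

```python
# Hypothetical sketch of the scene analysis of step S50/S70: decide whether
# the vehicle is in the first preset scene (constrained on both sides), the
# second preset scene (constrained on one side), or no preset scene.
# Parameter names and the 0.5 m edge cutoff are illustrative assumptions.

def classify_scene(vehicle_left, vehicle_right,
                   edge_dist_left, edge_dist_right, edge_cutoff=0.5):
    """Return 'first', 'second', or None for the current scene."""
    left_tight = vehicle_left or edge_dist_left < edge_cutoff
    right_tight = vehicle_right or edge_dist_right < edge_cutoff
    if left_tight and right_tight:
        return "first"    # constrained on both sides
    if left_tight or right_tight:
        return "second"   # constrained on one side
    return None           # not a preset scene: no bias processing needed
```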
Step S710: and if the scene of the vehicle at the current moment is the preset scene, carrying out bias processing on the lane line.
If the vehicle is in such a special scene and still drives along the normal lane line, the vehicle user may become anxious, make an incorrect maneuver, or even cause a traffic accident. Therefore, if the scene of the vehicle belongs to a preset scene, the lane line on the instrument display screen is biased to the left or to the right. If the scene of the vehicle does not belong to a preset scene, no bias processing is needed, and the vehicle is controlled to drive within the area of the first fused lane line or the second fused lane line.
Further, referring to fig. 8, step S710 includes:
step S720: and when the scene where the vehicle is located is a first preset scene, controlling the vehicle to run by taking the side of the driving intention area of the user as the center, and adjusting the first fusion lane line or the second fusion lane line to deviate towards the side of the driving intention area of the user.
Step S730: and when the scene of the vehicle is a second preset scene, controlling the vehicle to run at the center of the driving intention area of the user, and adjusting the deviation of the first fusion lane line or the second fusion lane line to the center of the driving intention area of the user.
If the scene of the vehicle at the current moment belongs to a preset scene, the lane line on the instrument display screen is subjected to bias processing, and the bias differs according to the scene. Two biasing methods are proposed in this embodiment of the present invention, but those skilled in the art will understand that the two biasing methods given in steps S720 and S730 do not limit the present invention, which may cover more scenes and more lane line biasing methods.
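The bias processing itself can be sketched as a lateral shift of the fused lane line toward the user's intended driving area. The offset magnitude and the sign convention (positive meaning toward the intended side) are illustrative assumptions.

```python
# Hypothetical sketch of the bias processing of steps S720/S730: shift the
# fused lane line laterally toward the user's intended driving area.
# The 0.3 m default offset and the sign convention are assumptions.

def bias_lane_line(fused_centerline, scene, intent_direction, offset=0.3):
    """Shift each centerline point laterally according to the scene.

    fused_centerline: list of lateral offsets (m) along the lane line.
    scene: 'first', 'second', or None (no bias applied).
    intent_direction: +1 to bias toward the right, -1 toward the left,
        i.e. toward the user's intended driving area.
    """
    if scene not in ("first", "second"):
        return list(fused_centerline)   # not a preset scene: unchanged
    # First scene: drive centered on the intended side; second scene:
    # drive centered on the intended area. Both reduce, in this sketch,
    # to a lateral shift of the displayed line.
    shift = intent_direction * offset
    return [p + shift for p in fused_centerline]
```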
It should be noted that step S50 and step S30 may be performed simultaneously, or step S30 may be completed first, and then step S50 may be completed.
In this embodiment, the scene of the vehicle at the current moment is obtained by analyzing the state information of the road participants and the road type. If that scene belongs to a preset scene, the lane line on the instrument display screen is subjected to bias processing, which enhances the driving safety of the vehicle user, improves the driving experience, and reduces the risk of traffic accidents. If the scene does not belong to a preset scene, the vehicle is controlled to drive according to the lane line on the instrument display screen.
Furthermore, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored. The computer-readable storage medium may be the memory 02 in the vehicle of fig. 1, or at least one of a ROM (Read-Only Memory), a RAM (Random Access Memory), a magnetic disk, and an optical disk, and it contains instructions for causing the vehicle to perform the method according to the embodiments of the present invention.
The specific embodiment of the computer-readable storage medium of the present invention is substantially the same as the embodiments of the lane line processing method described above, and will not be described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software on a necessary general hardware platform, or by hardware, although in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention may be embodied as a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above, including instructions for causing a vehicle to perform the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A lane line processing method is characterized by comprising the following steps:
in the driving process of the vehicle, acquiring a lane image of a driving road of the vehicle, and acquiring the corresponding lane line type, the existence time of the lane line and the road edge information according to the image;
analyzing according to the lane line type and the road edge information to obtain a road type;
analyzing according to the road type and the existence time of the lane line to obtain a corresponding dynamic confidence threshold;
acquiring the corresponding lane line reliability according to the lane image, and judging whether the reliability of the actual lane line is smaller than the dynamic confidence threshold value;
and if the credibility of the actual lane line is less than the dynamic confidence threshold, simulating a virtual lane line according to the road type and the correspondingly obtained lane line.
2. The lane line processing method of claim 1, wherein the road type includes a high speed/fast lane; the dynamic confidence thresholds include a first dynamic confidence threshold, a second dynamic confidence threshold, and a third dynamic confidence threshold; the step of analyzing and obtaining the corresponding dynamic confidence threshold according to the road type and the existence time of the lane line comprises the following steps:
if the road type is a high-speed/fast lane, judging whether the existing time of the lane line is greater than a first preset threshold value;
if not, acquiring a first dynamic confidence threshold;
if so, judging whether the existing time of the lane line is greater than a second preset threshold value or not;
if not, acquiring a second dynamic confidence threshold;
and if so, acquiring a third dynamic confidence threshold.
3. The method of claim 1, wherein the road type comprises an urban lane, and the step of obtaining a corresponding dynamic confidence threshold from the analysis of the road type and the time of existence of the lane further comprises:
if the road type is an urban lane, judging whether the existing time of the lane line is greater than a third preset threshold value;
if not, acquiring a fourth dynamic confidence threshold;
if so, judging whether the existing time of the lane line is greater than a fourth preset threshold value or not;
if not, acquiring a fifth dynamic confidence threshold;
if so, a sixth dynamic confidence threshold is obtained.
4. The lane line processing method of claim 3, wherein the road type comprises a country lane, and the step of analyzing and obtaining the corresponding dynamic confidence threshold according to the road type and the existence time of the lane line further comprises:
if the road type is a country lane, judging whether the existing time of the lane line is greater than a fifth preset threshold value;
if not, acquiring a seventh dynamic confidence threshold;
if so, judging whether the existing time of the lane line is greater than a sixth preset threshold value;
if not, acquiring an eighth dynamic confidence threshold;
if so, acquiring a ninth dynamic confidence threshold.
5. The lane line processing method of claim 1, wherein the step of determining whether the confidence level of the actual lane line is less than the dynamic confidence threshold further comprises:
if the credibility of the actual lane line is greater than or equal to the dynamic confidence threshold, acquiring vehicle motion information and historical lane line information;
calculating to obtain a first predicted lane line according to the vehicle motion information and the historical lane line information;
and fusing the obtained actual lane line with the first predicted lane line through a Kalman filtering algorithm to obtain a first fused lane line, controlling the vehicle to move according to the first fused lane line, and displaying the first fused lane line through an instrument display screen.
6. The lane line processing method of claim 1, wherein if the confidence level of the actual lane line is less than the dynamic confidence threshold, the step of simulating a virtual lane line according to the road type comprises:
acquiring vehicle motion information and historical lane line information;
calculating to obtain a second predicted lane line according to the vehicle motion information and the historical lane line information;
and fusing the virtual lane line and the second predicted lane line through a Kalman filtering algorithm to obtain a second fused lane line, controlling the vehicle to move according to the second fused lane line, and displaying the second fused lane line through an instrument display screen.
7. The lane line processing method according to claim 5 or 6, wherein the step of analyzing the road type according to the lane line type, the road participant status information, and the road edge information is followed by:
analyzing according to the state information of the road participants and the road type to obtain the scene of the vehicle;
acquiring a first fusion lane line or a second fusion lane line;
judging whether a scene where the vehicle is located is a preset scene or not, wherein the preset scene comprises a first scene and a second scene;
and if the scene of the vehicle is the preset scene, performing bias processing on the first fusion lane line or the second fusion lane line.
8. The lane line processing method according to claim 5, wherein the step of performing the offset processing on the first fusion lane line or the second fusion lane line if the scene where the vehicle is located is the first preset scene or the second preset scene comprises:
when the scene of the vehicle is a first preset scene, controlling the vehicle to drive by taking the side of the driving intention area of the user as the center, and adjusting the first fusion lane line or the second fusion lane line to deviate to the side of the driving intention area of the user;
and when the scene of the vehicle is a second preset scene, controlling the vehicle to run at the center of the driving intention area of the user, and adjusting the deviation of the first fusion lane line or the second fusion lane line to the center of the driving intention area of the user.
9. A vehicle, characterized in that the vehicle comprises a memory, a processor and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the lane line processing method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that a computer program is stored thereon, which computer program, when being executed by a processor, realizes the steps of the lane line processing method according to any one of claims 1 to 8.
CN202011068706.3A 2020-09-30 2020-09-30 Lane line processing method, vehicle, and readable storage medium Active CN112183415B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011068706.3A CN112183415B (en) 2020-09-30 2020-09-30 Lane line processing method, vehicle, and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011068706.3A CN112183415B (en) 2020-09-30 2020-09-30 Lane line processing method, vehicle, and readable storage medium

Publications (2)

Publication Number Publication Date
CN112183415A true CN112183415A (en) 2021-01-05
CN112183415B CN112183415B (en) 2024-07-16

Family

ID=73948224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011068706.3A Active CN112183415B (en) 2020-09-30 2020-09-30 Lane line processing method, vehicle, and readable storage medium

Country Status (1)

Country Link
CN (1) CN112183415B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113525400A (en) * 2021-06-21 2021-10-22 上汽通用五菱汽车股份有限公司 Lane change reminding method and device, vehicle and readable storage medium
CN114494518A (en) * 2022-01-19 2022-05-13 上汽通用五菱汽车股份有限公司 Method, device, equipment and storage medium for generating lane line

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107830869A (en) * 2017-11-16 2018-03-23 百度在线网络技术(北京)有限公司 Information output method and device for vehicle
US20180181817A1 (en) * 2015-09-10 2018-06-28 Baidu Online Network Technology (Beijing) Co., Ltd. Vehicular lane line data processing method, apparatus, storage medium, and device
CN110136295A (en) * 2018-02-09 2019-08-16 驭势(上海)汽车科技有限公司 System and method for detecting the confidence level of automatic Pilot
CN110174113A (en) * 2019-04-28 2019-08-27 福瑞泰克智能系统有限公司 A kind of localization method, device and the terminal in vehicle driving lane
US20200074189A1 (en) * 2018-09-04 2020-03-05 Baidu Online Network Technology (Beijing) Co., Ltd. Lane line processing method and device

Also Published As

Publication number Publication date
CN112183415B (en) 2024-07-16

Similar Documents

Publication Publication Date Title
CN105620489B (en) Driving assistance system and vehicle real-time early warning based reminding method
RU2608582C2 (en) Method of switching mobile terminal into standby mode and equipment for its implementation
CN111332309B (en) Driver monitoring system and method of operating the same
CN112349144B (en) Monocular vision-based vehicle collision early warning method and system
KR20210078530A (en) Lane property detection method, device, electronic device and readable storage medium
CN108860045B (en) Driving support method, driving support device, and storage medium
CN112339622B (en) Seat adjusting method and device and vehicle-mounted system
CN112183415B (en) Lane line processing method, vehicle, and readable storage medium
CN113064135A (en) Method and device for detecting obstacle in 3D radar point cloud continuous frame data
CN112001208A (en) Target detection method and device for vehicle blind area and electronic equipment
DE102015213538A1 (en) Method and system for warning against a wrong-way drive of a vehicle
CN112257542A (en) Obstacle sensing method, storage medium, and electronic device
DE102021107602A1 (en) DRIVER ASSISTANCE DEVICE AND DATA COLLECTION SYSTEM
CN111985388B (en) Pedestrian attention detection driving assistance system, device and method
CN112861683A (en) Driving direction detection method and device, computer equipment and storage medium
CN112949470A (en) Method, device and equipment for identifying lane-changing steering lamp of vehicle and storage medium
CN113352989A (en) Intelligent driving safety auxiliary method, product, equipment and medium
CN110871810A (en) Vehicle, vehicle equipment and driving information prompting method based on driving mode
CN110269787A (en) A kind of blind-guiding method, device and cap
CN112912892A (en) Automatic driving method and device and distance determining method and device
JP2020095466A (en) Electronic device
CN107452230B (en) Obstacle detection method and device, terminal equipment and storage medium
CN114379582A (en) Method, system and storage medium for controlling respective automatic driving functions of vehicles
CN116958915B (en) Target detection method, target detection device, electronic equipment and storage medium
CN115096324B (en) Route recommendation method and related device, vehicle machine, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant