CN111742326A - Lane line detection method, electronic device, and storage medium

Lane line detection method, electronic device, and storage medium

Info

Publication number
CN111742326A
CN111742326A (application CN201980012354.9A)
Authority
CN
China
Prior art keywords
lane line
precision map
observation
matching
map lane
Prior art date
Legal status
Pending
Application number
CN201980012354.9A
Other languages
Chinese (zh)
Inventor
唐蔚博
许睿
吴显亮
陈竞
Current Assignee
Shenzhen Zhuoyu Technology Co ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN111742326A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 - Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00 - Navigation; Navigational instruments not provided for in groups G01C 1/00 to G01C 19/00

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiment of the invention provides a lane line detection method, an electronic device, and a storage medium. The method includes: acquiring an observation lane line observed online, where the observation lane line is obtained by a sensor mounted on a vehicle; matching the observation lane line with a high-precision map lane line within a preset local range; and determining a detected lane line according to the matching result. In this way, the reliability of the high-precision map lane line is judged against the trustworthy nearby observation lane line, and the distant high-precision map lane line is used to compensate for the low precision and occlusion of the observation lane line, so that a detected lane line with high precision and high reliability is finally obtained.

Description

Lane line detection method, electronic device, and storage medium
Technical Field
The embodiment of the invention relates to the technical field of intelligent driving, in particular to a lane line detection method, electronic equipment and a storage medium.
Background
With the development of intelligent driving, it is necessary to detect the lane lines on the road in order to improve the safety of intelligent driving. The current approach is online observation of the lane line, that is, a visual navigation system locates the lane line in a captured road image, thereby detecting the lane line.
However, online observation of the lane line is limited by the accuracy of the vision sensor: the lane line is accurate near the vehicle, but the farther it is from the own vehicle, the lower its accuracy and the less reliable it becomes. In addition, if vehicles, obstacles, or the like occlude part of the field of view of the camera sensor, the lane line detection result becomes inaccurate.
Disclosure of Invention
The embodiment of the invention provides a method for detecting a lane line, electronic equipment and a storage medium, which are used for realizing accurate detection of the lane line.
In a first aspect, an embodiment of the present application provides a method for detecting a lane line, including:
acquiring an observation lane line for online observation, wherein the observation lane line is observed and acquired by a sensor mounted on a vehicle;
matching the observation lane line with the high-precision map lane line in a preset local range to obtain a matching result;
and determining a detected lane line according to the matching result.
In a second aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing a computer program;
a processor for executing the computer program, in particular for performing:
acquiring an observation lane line for online observation, wherein the observation lane line is observed and acquired by a sensor mounted on a vehicle;
matching the observation lane line with the high-precision map lane line in a preset local range to obtain a matching result;
and determining a detection lane line according to the matching result through the observation lane line and the high-precision map lane line.
In a third aspect, an embodiment of the present application provides a vehicle, including: a vehicle body and the electronic device of the second aspect mounted on the vehicle body.
In a fourth aspect, an embodiment of the present application provides a vehicle, including: a vehicle body and the electronic device of the second aspect mounted on the vehicle body.
In a fifth aspect, an embodiment of the present application provides a computer storage medium, where a computer program is stored, and the computer program, when executed, implements the lane line detection method according to any one of the first aspect.
According to the lane line detection method, the electronic device, and the storage medium provided by the embodiment of the application, an observation lane line observed online is acquired, where the observation lane line is obtained by a sensor mounted on a vehicle; the observation lane line is matched with the high-precision map lane line within a preset local range to obtain a matching result; and a detected lane line is determined according to the matching result. The trustworthy nearby observation lane line is thus used to judge whether the high-precision map lane line suffers from positioning deviation, outdated data, or the like; if the high-precision map lane line is reliable, the distant high-precision map lane line is used to compensate for the low precision and occlusion of the observation lane line, so that a detected lane line with high precision and high reliability is finally obtained. The vehicle then plans its intelligent driving state, such as lane changing, decelerating, or parking, according to the high-precision detected lane line, so the intelligent driving can be accurately guided and its safety further improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic view of an application scenario according to an embodiment of the present application;
fig. 2 is a flowchart of a lane line detection method according to an embodiment of the present disclosure;
fig. 3 is a flowchart of a lane line detection method according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of an electronic device provided in an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a vehicle according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The method provided by the embodiment of the invention is suitable for the technical fields of computer vision, intelligent driving and the like, can realize the detection of the lane line, and further improves the safety of intelligent driving. The intelligent driving comprises automatic driving and auxiliary driving.
During intelligent driving, the vehicle needs to obtain a local lane line map for planning the driving route and predicting the vehicle's trajectory. Constructing and fusing this local information in real time with the vehicle-mounted sensors is called online observation (Online Measurement).
The lane line data detected by the vehicle-mounted sensor is projected, segmented, and fused to obtain an online-observed lane line, which is recorded as the observation lane line. Projection transforms the lane line coordinates from the sensor frame to a bird's-eye view map by geometric calculation. Segmentation clusters the lane line points on the bird's-eye view and assigns each lane line point a label (Label) corresponding to its lane line. Fusion performs data association across the labeled bird's-eye views of multiple frames, merges the lane lines of the multiple frames in time sequence, and optimizes the curve parameters of the lane lines.
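As a minimal illustration of the projection step only (it is not taken from the patent itself), the sketch below maps lane-line pixels to a bird's-eye-view frame with a ground-plane homography; the homography values and pixel coordinates are placeholder assumptions obtained, for example, from an offline camera calibration.

```python
# Illustrative sketch: project detected lane-line pixels into a bird's-eye view.
# The homography H and the sample pixels are placeholder assumptions.
import numpy as np

def project_to_birds_eye(image_points: np.ndarray, H: np.ndarray) -> np.ndarray:
    """Map Nx2 image pixel coordinates to Nx2 ground-plane (bird's-eye) coordinates."""
    pts_h = np.hstack([image_points, np.ones((image_points.shape[0], 1))])  # homogeneous
    ground = (H @ pts_h.T).T
    return ground[:, :2] / ground[:, 2:3]  # de-homogenize

# Placeholder homography from an assumed offline calibration.
H = np.array([[0.01, 0.0, -3.2],
              [0.0, 0.02, 0.0],
              [0.0, 0.001, 1.0]])
lane_pixels = np.array([[640.0, 700.0], [655.0, 600.0], [668.0, 520.0]])
bev_points = project_to_birds_eye(lane_pixels, H)
```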
When intelligent driving is carried out, the vehicle can also obtain the current absolute position of the vehicle, and a high-precision Map (HD Map) which is manually collected, modeled and labeled in advance around the vehicle is read from a database according to the absolute position. The lane lines marked in the high-precision map (namely, the lane lines of the high-precision map) can also be used for planning and making a vehicle driving plan and predicting the vehicle trend.
Using the observation lane line alone during intelligent driving has the following problems: limited by the accuracy of the sensor, the lane line is accurate near the sensor, but the farther it is from the vehicle, the lower its accuracy and the less reliable it becomes; and if the field of view of the sensor is blocked by vehicles, obstacles, or the like, the visible range of the lane line is greatly reduced, making it difficult to plan the vehicle's driving route.
Using the high-precision map lane line alone during intelligent driving also has problems: the accuracy of the high-precision map depends on vehicle positioning, so a positioning deviation greatly reduces the reliability of the entire high-precision map lane line; and the high-precision map has limited timeliness, so after road construction or traffic diversion, if the map is not updated in time, the reliability of the high-precision map lane line is likewise greatly reduced.
According to the lane line detection method provided by the embodiment of the application, the high-precision map lane line and the observation lane line are fused: the relatively trustworthy nearby observation lane line (i.e., in the area close to the vehicle) is used to judge whether the high-precision map lane line suffers from positioning deviation, outdated data, or the like, and if the high-precision map lane line is reliable, the distant high-precision map lane line (i.e., in the area far from the vehicle) is used to compensate for the low precision and occlusion of the observation lane line.
Fig. 1 is a schematic view of an application scenario related to an embodiment of the present application, and it should be noted that the application scenario of the embodiment of the present application includes, but is not limited to, that shown in fig. 1. As shown in fig. 1, a sensor, such as a monocular or binocular camera, is mounted on the smart driving vehicle, and the sensor may capture, for example, an environment image including a road image of the surrounding environment while the smart driving vehicle is driving. The vehicle can identify the lane line according to the collected environment image, obtain an observation lane line for online observation, match the observation lane line with the high-precision map lane line in a preset local range, and determine a final detection lane line. And judging whether the high-precision map lane line has the problems of positioning deviation, outdating and the like through the near credible observation lane line, and if the high-precision map lane line has high reliability, repairing the problems of low precision and shielding of the observation lane line by using the far high-precision map lane line to finally obtain the high-precision and high-reliability detection lane line. The vehicle plans the intelligent driving state of the vehicle according to the high-precision detection lane line, such as lane changing, deceleration or parking, and the like, so that the intelligent driving can be accurately guided, and the safety of the intelligent driving is improved.
The technical solution of the present invention will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 2 is a flowchart of a method for detecting a lane line according to an embodiment of the present disclosure, and as shown in fig. 2, the method according to the embodiment of the present disclosure includes:
s101, an observation lane line for online observation is obtained, wherein the observation lane line is obtained by observing through a sensor mounted on a vehicle.
The execution main body of the embodiment of the present application is a device having a lane line detection function, for example, a lane line detection device, hereinafter referred to as a detection device for short, and the detection device may be integrated in any electronic device as a part of the electronic device. Alternatively, the detection means may also be a separate electronic device.
The electronic device may be an in-vehicle device, such as an advanced driver assistance device or the like.
The embodiment of the present application takes an execution subject as an electronic device as an example for description.
Optionally, the sensor may be a point cloud sensor (a three-dimensional detection device), a visual sensor, or the like, or a combination thereof. The visual sensor may be an imaging device, and the point cloud sensor includes a laser radar, a Time-of-Flight (TOF) ranging device, a depth visual sensor, a high-resolution millimeter-wave radar, or the like. For illustration, the embodiment of the present application takes the sensor as an imaging device, for example a monocular camera or a binocular camera, where the binocular camera includes two or more cameras.
The observation lane line is only one name of the lane line observed by the sensor mounted on the vehicle, and other names may be used as needed.
The electronic equipment of the embodiment of the application is in communication connection with the sensor.
In one example, S101 may be that the electronic device obtains the lane line of online observation from the sensor. Specifically, the sensor collects environmental data around the vehicle in real time, for example, collects a road image around the vehicle, and at the same time, the sensor processes the collected road image to identify a lane line in the road image as an observation lane line. The sensor then sends the observed lane line to the electronic device.
In another example, in S101, the electronic device may acquire environmental data around the vehicle, such as a road image, through the sensor mounted on the vehicle, and then perform lane line detection on the road image to obtain the observed lane line. Specifically, the sensor sends the road image around the vehicle collected in real time to the electronic device, and the electronic device processes the road image around the vehicle sent by the sensor to obtain an observation lane line observed on line.
Alternatively, the observed lane line may be identified according to an existing image identification method, for example, environment data (such as a road image) collected by a sensor is input into a trained neural network, and the neural network outputs the observed lane line.
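The following minimal sketch, which is not part of the patent, assumes that some trained detector has already returned labelled lane pixels and only shows the subsequent curve-fitting step that turns them into an observation lane line; the quadratic model and the pixel values are illustrative assumptions.

```python
# Illustrative sketch: turn labelled lane pixels (from any detector) into a lane-line curve.
import numpy as np

def fit_observed_lane_line(lane_pixels: np.ndarray) -> np.ndarray:
    """Fit x = a*y^2 + b*y + c to Nx2 (x, y) lane pixels; returns [a, b, c]."""
    return np.polyfit(lane_pixels[:, 1], lane_pixels[:, 0], deg=2)

# Example pixels as they might come out of a lane-segmentation network (assumed values).
detected_pixels = np.array([[642.0, 710.0], [650.0, 640.0], [661.0, 560.0], [675.0, 480.0]])
observed_lane_coeffs = fit_observed_lane_line(detected_pixels)
```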
S102, matching the observation lane line with the high-precision map lane line in a preset local range to obtain a matching result.
According to the steps, after the observation lane line is obtained, the observation lane line is matched with the high-precision map lane line to obtain a matching result, and specifically, the observation lane line is matched with the high-precision map lane line in a preset local range to obtain the matching result.
Wherein the high-precision map lane lines are obtained from the high-precision map.
The preset local range can be any range obtained from an observation lane line according to actual needs.
Optionally, the preset local range may be a local range close to the vehicle, and since the accuracy of the observation lane line close to the local range of the vehicle is high, the accurate observation lane line is used to determine whether the high-precision map lane line has problems of positioning deviation, outdating, and the like, so that the accurate determination of the reliability of the high-precision map lane line can be improved.
S103, determining a detected lane line according to the matching result.
And determining a detection lane line through the observation lane line and the high-precision map lane line.
In one example, an observation lane line or a high-precision map lane line is taken as the detection lane line according to the matching result obtained in the above step.
For example, if the matching result of the observation lane line and the high-precision map lane line is greater than the first preset value within the preset local range, the high-precision map is reliable, and thus the high-precision map lane line can be used as the detection lane line. And if the matching result of the observation lane line and the high-precision map lane line is smaller than a third preset value, the high-precision map lane line is not reliable, and the observation lane line is used as a detection lane line for driving guidance. The first preset value is greater than the third preset value, and both the first preset value and the third preset value are set according to actual needs, which is not limited in this embodiment.
In another example, the S103 may include: and determining a detection lane line according to the matching result through the observation lane line and the high-precision map lane line.
For example, in a preset local range, if the matching result of the observation lane line and the high-precision map lane line is greater than a second preset value, the observation lane line and the high-precision map lane line are fused into one lane line, and the fused lane line is used as a detection lane line. The second preset value is set according to actual needs, and this embodiment is not limited.
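One way to read the two examples above as a single decision rule is sketched below; it assumes the ordering first preset value >= second preset value >= third preset value, and the fuse() helper stands in for the fusion described later. None of these details is fixed by the patent.

```python
# Hedged sketch of the threshold-based decision described in the two examples above.
def decide_detected_lane_line(match_score, observed_line, hd_map_line,
                              first_preset, second_preset, third_preset, fuse):
    """Assumes first_preset >= second_preset >= third_preset (an assumption)."""
    if match_score > first_preset:      # HD map lane line judged reliable: use it directly
        return hd_map_line
    if match_score > second_preset:     # reliable enough to fuse with the observation
        return fuse(observed_line, hd_map_line)
    if match_score < third_preset:      # HD map lane line judged unreliable: fall back
        return observed_line
    return observed_line                # conservative fallback for the middle band (assumption)
```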
Optionally, a detection lane line may be determined by the observation lane line and the high-precision map lane line based on the matching result according to other methods.
Optionally, the electronic device is in communication connection with the intelligent driving system, and the electronic device may send the determined detection lane line to the intelligent driving system, so that the intelligent driving system performs intelligent driving control on the vehicle according to the detection lane line.
According to the lane line detection method provided by the embodiment of the application, an observation lane line observed online is acquired, where the observation lane line is obtained by a sensor mounted on a vehicle; the observation lane line is matched with the high-precision map lane line within a preset local range to obtain a matching result; and a detected lane line is determined according to the matching result. The trustworthy nearby observation lane line is used to judge whether the high-precision map lane line suffers from positioning deviation, outdated data, or the like; if the high-precision map lane line is reliable, the distant high-precision map lane line is used to compensate for the low precision and occlusion of the observation lane line, so that a detected lane line with high precision and high reliability is finally obtained. The vehicle plans its intelligent driving state, such as lane changing, decelerating, or parking, according to the high-precision detected lane line, so the intelligent driving can be accurately guided and its safety further improved.
Fig. 3 is a flowchart of a method for detecting a lane line according to an embodiment of the present application, where on the basis of the foregoing embodiment, the method according to the embodiment of the present application includes:
s201, acquiring at least one high-precision map lane line from a high-precision map according to the position information of the vehicle.
The electronic equipment obtains the position information of the vehicle at the current moment from a positioning module of the vehicle, and obtains at least one high-precision map lane line near the vehicle from a high-precision map according to the position information of the vehicle.
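A minimal sketch of this lookup is given below, assuming the high-precision map has already been loaded as a list of lane-line polylines in the map frame; the data layout and the search radius are illustrative assumptions rather than details from the patent.

```python
# Illustrative sketch: select the HD-map lane lines near the vehicle's position.
import numpy as np

def lane_lines_near_vehicle(hd_map_lane_lines, vehicle_xy, radius_m=50.0):
    """Return every lane line (Nx2 polyline) with at least one point within radius_m."""
    nearby = []
    for polyline in hd_map_lane_lines:
        dists = np.linalg.norm(polyline - np.asarray(vehicle_xy), axis=1)
        if np.any(dists <= radius_m):
            nearby.append(polyline)
    return nearby
```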
S202, matching the observation lane line with the at least one high-precision map lane line in a preset local range.
And matching the observation lane line with at least one high-precision map lane line within a preset local range, wherein the matching process comprises but is not limited to the following mode.
In a possible embodiment, the S202 may include:
and A1, acquiring at least one high-precision map lane line closest to the observation lane line from the at least one high-precision map lane line.
And A2, matching the observation lane line with each high-precision map lane line in at least one high-precision map lane line closest to the observation lane line in the preset local range.
Specifically, at least one high-precision map lane line closest to the observation lane line is obtained from the at least one high-precision map lane line. And matching the observation lane line with each high-precision map lane line in at least one high-precision map lane line closest to the observation lane line in a preset local range.
For example, the head point of the feature points of the observation lane line a is selected, the K points closest to this head point are selected from the feature points of all the high-precision map lane lines, and the lane lines containing these K points are taken as the K high-precision map lane lines closest to the observation lane line. Then, one high-precision map lane line b is selected from the K high-precision map lane lines, and the initial matching degree S between the high-precision map lane line b and the observation lane line a is set to 0. The first feature point PB of the high-precision map lane line b and the first feature point PA of the observation lane line a are taken, the matching degree D of PA and PB is calculated, and S is updated to S + D. The next feature points of PA and PB are then selected along the advancing direction of the lane line, and the above steps are repeated until the feature points of the observation lane line a or of the high-precision map lane line b are exhausted; the matching degree S at that moment is taken as the matching degree between the observation lane line a and the high-precision map lane line b.
These steps are repeated to determine the matching degree between each of the K high-precision map lane lines and the observation lane line. According to the matching degree, one high-precision map lane line is selected from the K high-precision map lane lines and fused with the observation lane line to determine the detected lane line.
Optionally, the feature points on the observation lane line and the high-precision map lane line are acquired along the direction of the lane line according to a preset acquisition interval, for example, at equal intervals.
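The example above can be sketched as follows; the feature points are assumed to be Nx2 arrays already ordered along the advancing direction, and the per-point matching degree (an inverse-distance score) and the value of K are illustrative choices, not definitions from the patent.

```python
# Hedged sketch of candidate selection and the accumulated matching degree S.
import numpy as np

def point_match_degree(pa, pb):
    """Per-point matching degree D; higher when the points are closer (form is an assumption)."""
    return 1.0 / (1.0 + np.linalg.norm(pa - pb))

def k_nearest_candidate_lines(observed_pts, hd_lines, k=3):
    """Lane lines owning the K feature points closest to the head point of the observation."""
    head = observed_pts[0]
    owners = sorted((np.linalg.norm(p - head), line_id)
                    for line_id, pts in enumerate(hd_lines) for p in pts)
    return list(dict.fromkeys(line_id for _, line_id in owners[:k]))  # unique ids, order kept

def line_match_degree(observed_pts, hd_pts):
    """Accumulate S = S + D over paired feature points until either line is exhausted."""
    return sum(point_match_degree(pa, pb) for pa, pb in zip(observed_pts, hd_pts))
```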
In a possible embodiment, the S202 may include:
and step B1, aiming at each high-precision map lane line, acquiring a plurality of feature points on the observation lane line and a plurality of feature points on the high-precision map lane line in the preset local range.
And step B2, determining the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line.
And step B3, determining the matching degree between the observation lane line and the high-precision map lane line according to the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line.
In this embodiment, both the input high-precision map lane line and the observation lane line can be regarded as sets of points in a local space. These points are too dense, and matching them directly would be too costly. Therefore, a curve is first fitted to the point set of the observation lane line and to the point set of each high-precision map lane line, and the fitted curves are then sampled, for example at equal intervals along the advancing direction of the lane line, to obtain a feature point set of the observation lane line and a feature point set of each high-precision map lane line for the subsequent matching operation.
Taking a high-precision map lane line c as an example, acquiring a plurality of feature points of the high-precision map lane line c and a plurality of feature points of an observation lane line a, wherein the feature points of the high-precision map lane line c correspond to the feature points of the observation lane line a one to one. And determining the matching degree between each characteristic point on the observation lane line a and each characteristic point on the high-precision map lane line c. And determining the matching degree between the observation lane line a and the high-precision map lane line c according to the matching degree between each feature point on the observation lane line a and each feature point on the high-precision map lane line c. For example, the sum of the degree of matching between each feature point on the observation lane line a and each feature point on the high-precision map lane line c is determined as the degree of matching between the observation lane line a and the high-precision map lane line c.
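A small sketch of the curve-fitting and equal-interval sampling step is shown below; treating the advancing direction as the y axis and using a quadratic fit are illustrative assumptions.

```python
# Illustrative sketch: down-sample a dense lane-line point set into feature points.
import numpy as np

def resample_lane_line(points: np.ndarray, step: float = 1.0) -> np.ndarray:
    """Fit x = f(y) to the Nx2 point set and sample it at equal intervals in y."""
    coeffs = np.polyfit(points[:, 1], points[:, 0], deg=2)     # fitted curve
    y_samples = np.arange(points[:, 1].min(), points[:, 1].max(), step)
    x_samples = np.polyval(coeffs, y_samples)
    return np.stack([x_samples, y_samples], axis=1)            # feature point set
```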
S203, selecting one of the at least one high-precision map lane line as a target high-precision map lane line according to the matching degree of the observation lane line and the at least one high-precision map lane line.
According to the above steps, the matching degree between the observation lane line and the at least one high-precision map lane line can be determined, and one of the at least one high-precision map lane line is then selected as the target high-precision map lane line according to this matching degree. For example, a high-precision map lane line whose matching degree meets a preset matching degree, determined according to actual needs, is taken as the target high-precision map lane line, which is not limited in the embodiments of the present application.
Optionally, the high-precision map lane line with the highest matching degree with the observation lane line in the at least one high-precision map lane line is used as the target high-precision map lane line.
S204, fusing the observation lane line and the target high-precision map lane line, and determining the detection lane line.
According to the steps, after the target high-precision map lane line is obtained, the observation lane line and the target high-precision map lane line are fused into one lane line to serve as a detection lane line.
In a possible implementation manner, the fusing the observation lane line and the target high-precision map lane line in S204 may include:
and step C, combining the characteristic points on the observation lane line with the characteristic points on the target high-precision map lane line.
In this step, the feature points of the observation lane line and the feature points of the target high-precision map lane line are merged into single feature points one by one. For example, suppose the observation lane line includes feature points 1 and 2 and the target high-precision map lane line includes feature points 3 and 4, where, along the lane line direction, feature point 1 corresponds to feature point 3 and feature point 2 corresponds to feature point 4. Feature points 1 and 3 are then merged into one feature point, for example by taking the average of the position information of feature points 1 and 3 as the position information of the merged feature point, and feature points 2 and 4 are merged into one feature point in the same way, for example by taking the average of the position information of feature points 2 and 4. The merged feature points form a new lane line that serves as the detected lane line.
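The merging described in this step can be sketched as a plain per-pair average, as below; the one-to-one pairing along the lane direction follows the example above, and the unweighted mean is only the simplest case, refined with weights in the next step.

```python
# Illustrative sketch: merge corresponding feature points by averaging their positions.
import numpy as np

def merge_by_average(observed_pts: np.ndarray, hd_pts: np.ndarray) -> np.ndarray:
    """Combine each corresponding pair into one point of the detected lane line."""
    n = min(len(observed_pts), len(hd_pts))      # pair up to the shorter line
    return (observed_pts[:n] + hd_pts[:n]) / 2.0
```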
In a possible implementation manner, the step C includes:
and step C1, merging the feature points on the observation lane line and the feature points on the target high-precision map lane line according to the weight of the feature points on the observation lane line and the weight of the feature points on the target high-precision map lane line.
In this step, when merging, the weights of the feature points of the observation lane line and the weights of the feature points of the target high-precision map lane line are different, so that the feature points on the observation lane line and the feature points on the target high-precision map lane line can be merged according to the weights of the feature points on the observation lane line and the weights of the feature points on the target high-precision map lane line. Optionally, the weight of each feature point is set according to actual needs.
Optionally, within the preset local range, the weight of the feature points on the observation lane line is greater than the weight of the feature points on the target high-precision map lane line; and/or, in the range outside the preset local range, the weight of the feature points on the observation lane line is smaller than the weight of the feature points on the target high-precision map lane line.
Because the observation lane line has high precision within the preset local range (i.e., the near-vehicle range), giving the feature points on the observation lane line a greater weight than the feature points on the target high-precision map lane line in that range improves the precision of the merged detected lane line near the vehicle. Conversely, the high-precision map lane line has high precision outside the preset local range (i.e., in the far range), so giving the feature points on the observation lane line a smaller weight than the feature points on the target high-precision map lane line in that range improves the precision of the merged detected lane line far from the vehicle, improving the precision and reliability of the detected lane line overall.
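A sketch of such a distance-dependent weighting is given below; the cut-off distance of 20 m and the 0.8/0.2 weights are illustrative assumptions, since the patent only requires the observation weight to dominate in the near range and the map weight to dominate beyond it.

```python
# Hedged sketch: weight the merge by distance from the vehicle (values are assumptions).
import numpy as np

def merge_with_weights(observed_pts, hd_pts, dists_from_vehicle, near_range_m=20.0):
    """Weighted average per feature-point pair; dists_from_vehicle gives one distance per pair."""
    merged = []
    for pa, pb, d in zip(observed_pts, hd_pts, dists_from_vehicle):
        w_obs = 0.8 if d <= near_range_m else 0.2   # observation dominates only near the vehicle
        merged.append(w_obs * pa + (1.0 - w_obs) * pb)
    return np.asarray(merged)
```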
In some embodiments, before S204, the method of the embodiments of the present application further includes:
s2001, determining the position deviation degree between the observation lane line and the target high-precision map lane line.
And S2002, when the position deviation degree is smaller than or equal to a position preset value, fusing the observation lane line and the target high-precision map lane line.
In order to further improve the accuracy of the detected lane line, in some embodiments the reliability of the obtained target high-precision map lane line is also evaluated.
In the near-vehicle range, the observed lane line can be considered to be more accurate, so the position deviation degree between the observed lane line and the lane line of the target high-precision map can be used for measuring the reliability of the high-precision map. Specifically, the degree of positional deviation between the observation lane line and the target high-accuracy map lane line may be determined based on the positional information of each feature point of the observation lane line and the positional information of each feature point of the target high-accuracy map lane line, for example, the sum of differences between the positional information of each feature point of the observation lane line and the positional information of each feature point of the target high-accuracy map lane line is taken as the degree of positional deviation between the observation lane line and the target high-accuracy map lane line.
And when the position deviation degree between the observation lane line and the target high-precision map lane line is smaller than or equal to the preset position value, the high-precision map lane line is reliable, and at the moment, the step S204 is executed, and the observation lane line and the target high-precision map lane line are fused into a detection lane line.
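The reliability gate can be sketched as below; summing the per-point Euclidean offsets is one reading of the "sum of differences" above, and the preset value is an assumption to be tuned in practice.

```python
# Illustrative sketch: fuse only when the near-range positional deviation is small enough.
import numpy as np

def position_deviation(observed_pts: np.ndarray, hd_pts: np.ndarray) -> float:
    """Sum of per-point offsets between corresponding feature points (one possible measure)."""
    n = min(len(observed_pts), len(hd_pts))
    return float(np.linalg.norm(observed_pts[:n] - hd_pts[:n], axis=1).sum())

def should_fuse(observed_pts, hd_pts, position_preset: float) -> bool:
    """True when the target high-precision map lane line is judged reliable enough to fuse."""
    return position_deviation(observed_pts, hd_pts) <= position_preset
```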
In one possible implementation, the S2001 includes:
and D, determining the position deviation degree between the observation lane line and the target high-precision map lane line according to the matching degree between the characteristic points of the observation lane line and the characteristic points on the target high-precision map lane line.
For example, the degree of positional deviation between the observation lane line and the target high-precision map lane line may be determined by summing the degrees of matching between the feature points of the observation lane line and the feature points on the target high-precision map lane line.
The lane line detection method provided by the embodiment of the application acquires at least one high-precision map lane line from a high-precision map according to the position information of the vehicle, and within a preset local range, the observation lane line is matched with the at least one high-precision map lane line, and according to the matching degree of the observation lane line and the at least one high-precision map lane line, one of the at least one high-precision map lane line is selected as a target high-precision map lane line, so that the observation lane line and the target high-precision map lane line are fused to determine the detection lane line. Therefore, a target high-precision map lane line with the highest reliability is selected from at least one high-precision map lane line through the relatively credible observation lane line close to the high-precision map lane line, the target high-precision map lane line is used for repairing the observation lane line with low precision, the high-precision and high-reliability detection lane line is finally obtained, the vehicle plans the intelligent driving state of the vehicle according to the high-precision detection lane line, the accurate guidance of the intelligent driving can be realized, and the safety of the intelligent driving is further improved.
Fig. 4 is a schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 4, the electronic device 200 of the embodiment of the present application is disposed on a vehicle, and the electronic device 200 includes at least one memory 210 and at least one processor 220. The memory 210 is used for storing a computer program, and the processor 220 is used for executing the computer program.
When executing the computer program, the processor 220 acquires an observation lane line observed online, where the observation lane line is obtained by a sensor 230 mounted on the vehicle; matches the observation lane line with the high-precision map lane line within a preset local range to obtain a matching result; and determines a detected lane line according to the matching result.
Alternatively, the sensor 230 may be disposed on the electronic device 200 or on the vehicle, and the sensor 230 is in communication connection with the electronic device 200.
The electronic device of the embodiment of the present application may be configured to execute the technical solutions of the above-mentioned method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
In a possible implementation manner, the processor 220 is specifically configured to determine, according to the matching result, a detected lane line through the observed lane line and the high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to acquire a road image of the environment around the vehicle through the sensor 230 mounted on the vehicle, and to detect the lane line in the road image to obtain the observation lane line.
Optionally, the preset local range is a local range close to the vehicle.
In one possible implementation, before the processor 220 matches the observation lane line and the high-precision map lane line, the processor 220 is further configured to: acquiring at least one high-precision map lane line from a high-precision map according to the position information of the vehicle; in the preset local range, the observation lane line and the high-precision map lane line are matched, and the method comprises the following steps: and matching the observation lane line with the at least one high-precision map lane line in a preset local range.
In a possible implementation manner, the processor 220 is specifically configured to obtain, from the at least one high-precision map lane line, at least one high-precision map lane line closest to the observation lane line; and matching the observation lane line with each high-precision map lane line in at least one high-precision map lane line closest to the observation lane line in the preset local range.
In a possible implementation manner, the processor 220 is specifically configured to, for each high-precision map lane line, obtain, in the preset local range, a plurality of feature points on the observation lane line and a plurality of feature points on the high-precision map lane line; determining the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line; and determining the matching degree between the observation lane line and the high-precision map lane line according to the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to determine a sum of matching degrees between each feature point on the observation lane line and each feature point on the high-precision map lane line as a matching degree between the observation lane line and the high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to select one of the at least one high-precision map lane line as a target high-precision map lane line according to a matching degree between the observation lane line and the at least one high-precision map lane line; and fusing the observation lane line and the target high-precision map lane line to determine the detection lane line.
In one possible implementation, before fusing the observation lane line with the target high-precision map lane line, the processor 220 is further configured to: determine the position deviation degree between the observation lane line and the target high-precision map lane line; and when the position deviation degree is smaller than or equal to a position preset value, fuse the observation lane line with the target high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to determine a position deviation degree between the observation lane line and the target high-precision map lane line according to a matching degree between the feature point of the observation lane line and the feature point on the target high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to merge the feature points on the observation lane line with the feature points on the target high-precision map lane line.
In a possible implementation manner, the processor 220 is specifically configured to merge the feature points on the observation lane line and the feature points on the target high-precision map lane line according to the weights of the feature points on the observation lane line and the weights of the feature points on the target high-precision map lane line.
In a possible implementation manner, within the preset local range, the weight of the feature points on the observation lane line is greater than the weight of the feature points on the target high-precision map lane line; and/or, in the range outside the preset local range, the weight of the feature points on the observation lane line is smaller than the weight of the feature points on the target high-precision map lane line.
Optionally, the feature points on the observation lane line and the high-precision map lane line are acquired along the direction of the lane line according to a preset acquisition interval.
The electronic device of the embodiment of the present application may be configured to execute the technical solutions of the above-mentioned method embodiments, and the implementation principles and technical effects thereof are similar and will not be described herein again.
Fig. 5 is a schematic structural diagram of a vehicle according to an embodiment of the present application, and as shown in fig. 5, a vehicle 50 according to the present embodiment includes: a vehicle body 51 and an electronic device 52 mounted on the vehicle body 51.
The electronic device 52 may be the electronic device shown in fig. 4, and the electronic device 52 is used for detecting the lane line.
Optionally, the electronic device 52 is mounted on the roof of the vehicle body 51 and the sensors are mounted on the vehicle body for collecting environmental data, such as road images, around the vehicle.
Alternatively, the electronic device 52 is mounted on a front windshield of the vehicle body 51, or the electronic device 52 is mounted on a rear windshield of the vehicle body 51.
Optionally, the electronic device 52 is mounted on a head of the vehicle body 51, or the electronic device 52 is mounted on a tail of the vehicle body 51.
The installation position of the electronic device 52 on the vehicle body 51 is not limited in the embodiment of the application, and is specifically determined according to actual needs.
The vehicle of the embodiment of the application can be used for implementing the technical scheme of the embodiment of the lane line detection method, the implementation principle and the technical effect are similar, and details are not repeated here.
Fig. 6 is a schematic structural diagram of a vehicle according to an embodiment of the present application, and as shown in fig. 6, a vehicle 60 according to the present embodiment includes: a vehicle body 61 and an electronic device 62 mounted on the vehicle body 61.
The electronic device 62 may be the electronic device shown in fig. 4, and the electronic device 62 is used for detecting the lane line.
Alternatively, the vehicle 60 of the present embodiment may be a boat, automobile, bus, rail vehicle, aircraft, railroad locomotive, scooter, bicycle, or the like.
Optionally, the electronic device 62 may be mounted at the front, the rear, or the middle of the vehicle body 61, and the mounting position of the electronic device 62 on the vehicle body 61 is not limited in the embodiment of the present application, and is specifically determined according to actual needs.
The vehicle according to the embodiment of the present application may be used to implement the technical solution of the above-described embodiment of the lane line detection method, and the implementation principle and the technical effect are similar, which are not described herein again.
Further, when at least part of the functions of the lane line detection method in the embodiments of the present application are implemented by software, the embodiments of the present application also provide a computer storage medium for storing computer software instructions for lane line detection; when run on a computer, these instructions enable the computer to execute the various possible lane line detection methods of the method embodiments. The processes or functions described in the embodiments of the present application may be generated, in whole or in part, when the computer-executable instructions are loaded and executed on a computer. The computer instructions may be stored in a computer storage medium, or transmitted from one computer storage medium to another, for example wirelessly (e.g., cellular, infrared, short-range wireless, microwave, etc.) from one website, computer, server, or data center to another. The computer storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., SSD), among others.
The embodiment of the present invention further provides a computer storage medium, where program instructions are stored in the computer storage medium, and when the program is executed, the computer storage medium may include some or all of the steps of the lane line detection method in the foregoing embodiments.
Those of ordinary skill in the art will understand that: all or part of the steps for implementing the method embodiments may be implemented by hardware related to program instructions, and the program may be stored in a computer readable storage medium, and when executed, the program performs the steps including the method embodiments; and the aforementioned storage medium includes: various media capable of storing program codes, such as a Read-only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, and an optical disk.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (33)

1. A method for detecting a lane line, comprising:
acquiring an observation lane line for online observation, wherein the observation lane line is observed and acquired by a sensor mounted on a vehicle;
matching the observation lane line with the high-precision map lane line in a preset local range to obtain a matching result;
and determining a detected lane line according to the matching result.
2. The method of claim 1, wherein determining a detected lane line according to the matching result comprises:
and determining a detection lane line according to the matching result through the observation lane line and the high-precision map lane line.
3. The method of claim 1 or 2, wherein the obtaining an online observed observation lane line comprises:
acquiring a road image of an environment around the vehicle through the sensor mounted on the vehicle;
and detecting the lane line of the road image to obtain the observation lane line.
4. The method according to claim 2, wherein the preset local range is a local range near the vehicle.
5. The method of claim 4, wherein prior to matching the observation lane line and the high-precision map lane line, the method further comprises:
acquiring at least one high-precision map lane line from a high-precision map according to the position information of the vehicle;
in the preset local range, the observation lane line and the high-precision map lane line are matched, and the method comprises the following steps: and matching the observation lane line with the at least one high-precision map lane line in a preset local range.
6. The method of claim 5, wherein matching the observation lane line with the at least one high-precision map lane line within a preset local range comprises:
acquiring at least one high-precision map lane line closest to the observation lane line from the at least one high-precision map lane line;
and matching the observation lane line with each high-precision map lane line in at least one high-precision map lane line closest to the observation lane line in the preset local range.
7. The method according to claim 5, wherein the matching the observation lane line and the at least one high-precision map lane line within a preset local range comprises:
for each high-precision map lane line, acquiring a plurality of feature points on the observation lane line and a plurality of feature points on the high-precision map lane line within the preset local range;
determining the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line;
and determining the matching degree between the observation lane line and the high-precision map lane line according to the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line.
8. The method of claim 7, wherein determining the degree of match between the observation lane line and the high-accuracy map lane line according to the degree of match between each feature point on the observation lane line and each feature point on the high-accuracy map lane line comprises:
and determining the sum of the matching degrees between each feature point on the observation lane line and each feature point on the high-precision map lane line as the matching degree between the observation lane line and the high-precision map lane line.
9. The method according to any one of claims 6 to 8, wherein determining a detection lane line from the observation lane line and the high-precision map lane line according to the matching result comprises:
selecting one of the at least one high-precision map lane line as a target high-precision map lane line according to the matching degree of the observation lane line and the at least one high-precision map lane line;
and fusing the observation lane line and the target high-precision map lane line to determine the detection lane line.
10. The method of claim 9, wherein prior to fusing the observation lane line with the target high precision map lane line, the method further comprises:
determining the position deviation degree between the observation lane line and the target high-precision map lane line;
and when the position deviation degree is smaller than or equal to a position preset value, fusing the observation lane line and the target high-precision map lane line.
11. The method of claim 10, wherein the determining a degree of positional deviation between the observed lane line and the target high-precision map lane line comprises:
and determining the position deviation degree between the observation lane line and the target high-precision map lane line according to the matching degree between the feature points of the observation lane line and the feature points on the target high-precision map lane line.
12. The method according to any one of claims 9-11, wherein said fusing the observation lane line with the target high-precision map lane line comprises:
and merging the characteristic points on the observation lane line and the characteristic points on the target high-precision map lane line.
13. The method of claim 12, wherein merging the feature points on the observation lane line with the feature points on the target high-precision map lane line comprises:
and combining the feature points on the observation lane line and the feature points on the target high-precision map lane line according to the weight of the feature points on the observation lane line and the weight of the feature points on the target high-precision map lane line.
14. The method according to claim 13, wherein, within the preset local range, the weight of the feature points on the observation lane line is greater than the weight of the feature points on the target high-precision map lane line; and/or, in the range outside the preset local range, the weight of the feature points on the observation lane line is smaller than the weight of the feature points on the target high-precision map lane line.
15. The method according to any one of claims 7 to 14, wherein the feature points on the observation lane line and the high-precision map lane line are acquired at a preset acquisition interval along the direction of the lane line.
16. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program, in particular for performing:
acquiring an observation lane line for online observation, wherein the observation lane line is observed and acquired by a sensor mounted on a vehicle;
matching the observation lane line with the high-precision map lane line in a preset local range to obtain a matching result;
and determining a detection lane line according to the matching result through the observation lane line and the high-precision map lane line.
17. The electronic device of claim 16,
and the processor is specifically used for determining a detection lane line through the observation lane line and the high-precision map lane line according to the matching result.
18. The electronic device of claim 16 or 17,
the processor is specifically configured to acquire a road image of an environment around the vehicle through the sensor mounted on the vehicle; and detecting the lane line of the road image to obtain the observation lane line.
19. The electronic device according to claim 17, wherein the preset local range is a local range near the vehicle.
20. The electronic device of claim 19, wherein prior to matching the observation lane line and the high-precision map lane line, the processor is further configured to:
acquiring at least one high-precision map lane line from a high-precision map according to the position information of the vehicle;
wherein the matching of the observation lane line with the high-precision map lane line within the preset local range comprises: matching the observation lane line with the at least one high-precision map lane line within the preset local range.
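One simple reading of the map query in claim 20, sketched below: keep every high-precision map lane line that passes within a search radius of the vehicle position. The 100 m radius and the polyline representation of `hd_map_lane_lines` are assumptions.

```python
import math

def query_map_lane_lines(hd_map_lane_lines, vehicle_xy, radius=100.0):
    # Keep only the map lane lines that pass within a search radius of the
    # vehicle position. `hd_map_lane_lines` is assumed to be a list of
    # polylines, each a list of (x, y) points in the map frame.
    vx, vy = vehicle_xy
    return [line for line in hd_map_lane_lines
            if any(math.hypot(x - vx, y - vy) <= radius for x, y in line)]
```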
21. The electronic device of claim 20,
the processor is specifically configured to obtain at least one high-precision map lane line closest to the observation lane line from the at least one high-precision map lane line; and match the observation lane line, within the preset local range, with each high-precision map lane line in the at least one high-precision map lane line closest to the observation lane line.
22. The electronic device of claim 20,
the processor is specifically configured to acquire, for each high-precision map lane line, a plurality of feature points on the observation lane line and a plurality of feature points on the high-precision map lane line within the preset local range; determine the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line; and determine the matching degree between the observation lane line and the high-precision map lane line according to the matching degree between each feature point on the observation lane line and each feature point on the high-precision map lane line.
23. The electronic device of claim 22,
the processor is specifically configured to determine a sum of matching degrees between each feature point on the observation lane line and each feature point on the high-precision map lane line as a matching degree between the observation lane line and the high-precision map lane line.
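Claims 22-23 can be read as summing per-point matching degrees into a line-level score. The sketch below does exactly that over all point pairs, again using a Gaussian of distance as the per-point score; the patent does not commit to this particular function.

```python
import math

def line_matching_degree(obs_pts, map_pts, sigma=0.5):
    # Sum the per-point matching degrees over all point pairs (claims 22-23).
    # The Gaussian-of-distance score is an assumption, not the patent's formula.
    total = 0.0
    for ox, oy in obs_pts:
        for mx, my in map_pts:
            d = math.hypot(ox - mx, oy - my)
            total += math.exp(-(d * d) / (2.0 * sigma * sigma))
    return total
```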
24. The electronic device of any of claims 21-23,
the processor is specifically configured to select one of the at least one high-precision map lane line as a target high-precision map lane line according to the matching degree between the observation lane line and the at least one high-precision map lane line; and fuse the observation lane line with the target high-precision map lane line to determine the detection lane line.
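Claim 24 then selects, among the candidate high-precision map lane lines, the one with the highest matching degree as the target before fusing. A short sketch, with `score_fn` standing in for whatever line-level matching degree is used (for example the summed per-point score above):

```python
def select_target_lane_line(obs_pts, candidate_map_lines, score_fn):
    # Pick the candidate high-precision map lane line with the highest
    # matching degree as the target (claim 24).
    # Usage (hypothetical): select_target_lane_line(obs_pts, nearby_lines,
    #                                               line_matching_degree)
    return max(candidate_map_lines, key=lambda line: score_fn(obs_pts, line))
```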
25. The electronic device of claim 24, wherein the processor, prior to fusing the observation lane line with the target high-precision map lane line, is further configured to:
determine a position deviation degree between the observation lane line and the target high-precision map lane line;
and when the position deviation degree is less than or equal to a preset position value, fuse the observation lane line with the target high-precision map lane line.
26. The electronic device of claim 25,
the processor is specifically configured to determine the position deviation degree between the observation lane line and the target high-precision map lane line according to the matching degree between the feature points on the observation lane line and the feature points on the target high-precision map lane line.
27. The electronic device of any of claims 24-26,
the processor is specifically configured to merge the feature points on the observation lane line with the feature points on the target high-precision map lane line.
28. The electronic device of claim 27,
the processor is specifically configured to merge the feature points on the observation lane line and the feature points on the target high-precision map lane line according to the weights of the feature points on the observation lane line and the weights of the feature points on the target high-precision map lane line.
29. The electronic device according to claim 28, wherein, within the preset local range, the weight of the feature points on the observation lane line is greater than the weight of the feature points on the target high-precision map lane line; and/or, outside the preset local range, the weight of the feature points on the observation lane line is less than the weight of the feature points on the target high-precision map lane line.
30. The electronic device according to any one of claims 22 to 29, wherein the feature points on the observation lane line and the high-precision map lane line are acquired at a preset acquisition interval along the direction of the lane line.
31. A vehicle, characterized by comprising: a vehicle body and an electronic device as claimed in any one of claims 16-30 mounted on the vehicle body.
32. A vehicle, comprising: a vehicle body and an electronic device as claimed in any one of claims 16-30 mounted on the vehicle body.
33. A computer storage medium, characterized in that the storage medium has stored therein a computer program which, when executed, implements the lane line detection method according to any one of claims 1 to 15.
CN201980012354.9A 2019-05-22 2019-05-22 Lane line detection method, electronic device, and storage medium Pending CN111742326A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/087874 WO2020232648A1 (en) 2019-05-22 2019-05-22 Lane line detection method, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN111742326A (en) 2020-10-02

Family

ID=72646056

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980012354.9A Pending CN111742326A (en) 2019-05-22 2019-05-22 Lane line detection method, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN111742326A (en)
WO (1) WO2020232648A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114639079B (en) 2020-12-15 2023-06-30 北京百度网讯科技有限公司 Method, device, equipment and storage medium for matching lane line data
CN113009539A (en) * 2021-02-19 2021-06-22 恒大新能源汽车投资控股集团有限公司 Automatic lane changing processing method for vehicle, vehicle and equipment
CN113280822B (en) * 2021-04-30 2023-08-22 北京觉非科技有限公司 Vehicle positioning method and positioning device
CN113942522A (en) * 2021-05-31 2022-01-18 重庆工程职业技术学院 Intelligent driving safety protection system
CN113688935A (en) * 2021-09-03 2021-11-23 阿波罗智能技术(北京)有限公司 High-precision map detection method, device, equipment and storage medium
EP4257927A1 (en) * 2022-04-06 2023-10-11 Zenseact AB Vehicle pose assessment
CN114719872B (en) * 2022-05-13 2022-09-23 高德软件有限公司 Lane line processing method and device and electronic equipment
CN115143996B (en) * 2022-09-05 2023-01-17 北京智行者科技股份有限公司 Positioning information correction method, electronic device, and storage medium
CN115615444B (en) * 2022-12-02 2023-03-10 高德软件有限公司 Map data detection method, device and storage medium
CN116756264B (en) * 2023-08-18 2023-11-17 高德软件有限公司 Reconstruction data evaluation method and device, electronic equipment and storage medium
CN117490728B (en) * 2023-12-28 2024-04-02 合众新能源汽车股份有限公司 Lane line positioning fault diagnosis method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103954275B (en) * 2014-04-01 2017-02-08 西安交通大学 Lane line detection and GIS map information development-based vision navigation method
CN109323701A (en) * 2017-08-01 2019-02-12 郑州宇通客车股份有限公司 The localization method and system combined based on map with FUSION WITH MULTISENSOR DETECTION
CN109297500B (en) * 2018-09-03 2020-12-15 武汉中海庭数据技术有限公司 High-precision positioning device and method based on lane line feature matching
CN109186615A (en) * 2018-09-03 2019-01-11 武汉中海庭数据技术有限公司 Lane side linear distance detection method, device and storage medium based on high-precision map
CN109186616B (en) * 2018-09-20 2020-04-07 禾多科技(北京)有限公司 Lane line auxiliary positioning method based on high-precision map and scene retrieval

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114419590A (en) * 2022-01-17 2022-04-29 北京百度网讯科技有限公司 High-precision map verification method, device, equipment and storage medium
CN114419590B (en) * 2022-01-17 2024-03-19 北京百度网讯科技有限公司 Verification method, device, equipment and storage medium of high-precision map

Also Published As

Publication number Publication date
WO2020232648A1 (en) 2020-11-26

Similar Documents

Publication Publication Date Title
CN111742326A (en) Lane line detection method, electronic device, and storage medium
CN111947671B (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN109284348B (en) Electronic map updating method, device, equipment and storage medium
CN111161353B (en) Vehicle positioning method, device, readable storage medium and computer equipment
CN110415550B (en) Automatic parking method based on vision
CN112132896B (en) Method and system for detecting states of trackside equipment
US20200355513A1 (en) Systems and methods for updating a high-definition map
EP3842735B1 (en) Position coordinates estimation device, position coordinates estimation method, and program
CN112113574A (en) Method, apparatus, computing device and computer-readable storage medium for positioning
CN112292582A (en) Method and system for generating high definition map
CN111224710B (en) Virtual transponder capturing method and system based on satellite space distribution inspection
CN113223317A (en) Method, device and equipment for updating map
US20190360820A1 (en) Method and device for executing at least one measure for increasing the safety of a vehicle
CN110936959B (en) On-line diagnosis and prediction of vehicle perception system
CN114274972A (en) Scene recognition in an autonomous driving environment
CN111914691A (en) Rail transit vehicle positioning method and system
CN113405555B (en) Automatic driving positioning sensing method, system and device
CN113227713A (en) Method and system for generating environment model for positioning
US20230236020A1 (en) System and Method for Map Matching GNSS Positions of a Vehicle
EP4134623A1 (en) Drive device, vehicle, and method for automated driving and/or assisted driving
US20230204364A1 (en) Ascertaining a starting position of a vehicle for a localization
WO2021056185A1 (en) Systems and methods for partially updating high-definition map based on sensor data matching
CN116762094A (en) Data processing method and device
CN113345251A (en) Vehicle reverse running detection method and related device
JP6933069B2 (en) Pathfinding device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240522

Address after: Building 3, Xunmei Science and Technology Plaza, No. 8 Keyuan Road, Science and Technology Park Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518057, 1634

Applicant after: Shenzhen Zhuoyu Technology Co.,Ltd.

Country or region after: China

Address before: 518057 Shenzhen Nanshan High-tech Zone, Shenzhen, Guangdong Province, 6/F, Shenzhen Industry, Education and Research Building, Hong Kong University of Science and Technology, No. 9 Yuexingdao, South District, Nanshan District, Shenzhen City, Guangdong Province

Applicant before: SZ DJI TECHNOLOGY Co.,Ltd.

Country or region before: China