CN107480592B - Multi-lane detection method and tracking method - Google Patents
- Publication number
- CN107480592B (application CN201710568932.XA)
- Authority
- CN
- China
- Prior art date
- Legal status
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
Abstract
The invention provides a multi-lane detection method and a tracking method. The detection method comprises the following steps: determining a candidate area in the image, and then detecting the first lane line on each of the left and right sides of the vehicle in the candidate area using a single lane line detection method; selecting a point directly below the vanishing point and drawing a horizontal line through the selected point, the horizontal line intersecting the left and right first lane lines at a first intersection point and a second intersection point, respectively; judging whether adjacent lanes exist on the left and right sides of the vehicle, and if so, determining the candidate straight lines of the left and right second lane lines, calculating the distances from the first and second intersection points to these candidate straight lines, forming the candidate straight lines whose distance is within a preset range into straight line clusters of the left and right second lane lines, and fitting the left and right second lane lines from the straight lines in their clusters. The method locates lane lines simply and quickly, with low computational complexity and high effectiveness.
Description
Technical Field
The invention relates to an image detection method, in particular to a multi-lane detection method and a multi-lane tracking method.
Background
The basic method of lane line detection is to determine which edge points belong to the same straight line based on the result of image edge detection. For lane lines on a highway, the near-end segment can be characterized as a straight line. As part of intelligent traffic environment perception, multi-lane detection can also provide supporting data for local driving route planning in driving assistance, intelligent driving, and unmanned driving.
At present, methods for detecting the straight lane lines of the current lane include the Hough transform, straight-line approximation, curve detection based on simulated annealing, correlation detection, inverse perspective mapping, and the like. Among these, the Hough transform effectively compensates for missing feature points on a straight line, improving the accuracy of lane line detection, and is therefore widely applied.
ADAS (advanced driver assistance system) algorithms mainly focus on the current lane, and research on detecting the lane lines of multiple lanes is scarce. However, as autonomous driving technology matures and the related laws and regulations become more complete, multi-lane detection is increasingly important for the path planning of autonomous vehicles.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, an object of the present invention is to provide a forward-looking camera-based multi-lane detection method and tracking method for accurately detecting and tracking multiple lane lines and providing data support for unmanned driving.
In order to achieve the above object, an aspect of the present invention provides a multi-lane detection method, including:
step S1, shooting an image in front of the vehicle, determining a candidate area in the image, and then detecting a first lane line on the left side and the right side of the vehicle in the candidate area by using a single lane line detection method;
step S2, selecting a point directly below the vanishing point, and drawing a horizontal line through the selected point, wherein the horizontal line intersects the first lane lines on the left and right sides at a first intersection point and a second intersection point, respectively;
step S3, judging whether adjacent lanes exist on the left side and the right side of the vehicle, if so, determining each candidate straight line of the second lane line on the left side, calculating the distance between the first intersection point and each candidate straight line of the second lane line on the left side, forming each candidate straight line with the distance within a preset range into a straight line cluster of the second lane line on the left side, and fitting the second lane line on the left side according to each straight line in the straight line cluster; and if the adjacent lanes exist on the right side, determining each candidate straight line of the second lane line on the right side, calculating the distance between the second intersection point and each candidate straight line of the second lane line on the right side, forming a straight line cluster of the second lane line on the right side by using each candidate straight line with the distance within a preset range, and fitting the second lane line on the right side according to each straight line in the straight line cluster.
Further, the single lane line detection method in step S1 includes the steps of:
step S11, reading an image in front of the vehicle, and preprocessing the image;
step S12 of detecting a straight line in the image, and setting a straight line with a positive slope on the left side of the vehicle as a left-side straight line and a straight line with a negative slope on the right side of the vehicle as a right-side straight line;
step S13, a reference point directly below the vanishing point is selected, the distance between the reference point and each left-side straight line and each right-side straight line is calculated, the left-side straight lines whose distance is within a preset range form the straight line cluster of the left first lane line, and the right-side straight lines whose distance is within the preset range form the straight line cluster of the right first lane line;
and step S14, fitting the left and right first lane lines, taking the left and right straight lines closest to the reference point as the basis and the other straight lines in the respective straight line clusters as assistance.
Further, the step S12 detects straight lines in the image using the Hough transform.
Further, in the step S3, the candidate straight lines of the left second lane line are determined by: detecting, on the left side of the left first lane line, straight lines whose slope is positive and smaller than that of the left first lane line, and selecting from them the straight lines whose included angle with the left first lane line is larger than a predetermined angle as candidate lines for the left second lane line.
Further, in the step S3, the candidate straight lines of the right second lane line are determined by: detecting, on the right side of the right first lane line, straight lines whose slope is negative and larger than that of the right first lane line, and selecting from them the straight lines whose included angle with the right first lane line is larger than a predetermined angle as candidate lines for the right second lane line.
Further, in the step S3, the second lane line on the left side is fitted by: and selecting all straight lines which are intersected with the left and right first lane lines at vanishing points from the straight line cluster of the left second lane line, and fitting the left second lane line according to the selected straight lines.
Further, in the step S3, the right second lane line is fitted by: and selecting all straight lines which are intersected with the left and right first lane lines at vanishing points from the straight line cluster of the right second lane line, and fitting the right second lane line according to the selected straight lines.
Another aspect of the present invention provides a multi-lane tracking method, which tracks each lane line detected by the foregoing multi-lane detection method to predict the position of each lane line at the next moment.
Further, the multilane tracking method is characterized in that the method adopts a Kalman filter or a particle filter to track each lane line.
In conclusion, the multi-lane detection and tracking methods can locate lane lines simply and quickly, with low computational complexity, high effectiveness, and low resource occupancy; they can quickly locate and predict the range of each lane line, and are therefore suitable for vehicle-mounted data processing software with limited hardware resources and processing capacity.
Drawings
FIG. 1 is a schematic view of lane line fitting for a single lane;
FIG. 2 is a schematic diagram illustrating the determination of a left second lane and a right second lane according to the present invention;
FIGS. 3A-3C are schematic views of the vehicle in different lanes;
FIGS. 4A-4C show the update of the lane marker buffer in three cases.
Detailed Description
In order to make the invention more comprehensible, preferred embodiments are described in detail below with reference to the accompanying drawings.
The invention discloses a multilane detection method, which comprises the following steps:
In step S1, an image of the road ahead of the vehicle is captured, a candidate area in the image is determined, and the first lane lines on the left and right sides of the vehicle in the candidate area are then detected using a single lane line detection method. The single lane line detection method is realized by the following steps: step S11, reading the image in front of the vehicle and preprocessing it; step S12, detecting straight lines in the image using the Hough transform, taking the straight lines with positive slope on the left side of the vehicle as left-side straight lines and the straight lines with negative slope on the right side of the vehicle as right-side straight lines; step S13, determining the vanishing point, taking a reference point directly below the vanishing point, calculating the distance between the reference point and each left-side and right-side straight line, forming the left-side straight lines whose distance is within a preset range into the straight line cluster of the left first lane line, and forming the right-side straight lines whose distance is within the preset range into the straight line cluster of the right first lane line; step S14, fitting the left and right first lane lines (as shown in fig. 1), taking the left and right straight lines closest to the reference point as the basis and the other straight lines in the respective clusters as assistance.
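Steps S12-S13 can be sketched in pure Python. In practice the detected lines would come from a Hough transform (e.g. OpenCV's HoughLines); here `lines` is assumed to already hold (slope, intercept) pairs in image coordinates, and the distance threshold is an illustrative value:

```python
import math

def point_line_distance(slope, intercept, x0, y0):
    """Perpendicular distance from point (x0, y0) to the line y = slope*x + intercept."""
    return abs(slope * x0 - y0 + intercept) / math.sqrt(slope * slope + 1.0)

def cluster_first_lane_lines(lines, ref_point, max_dist=20.0):
    """Split detected lines by slope sign (step S12) and keep those whose
    distance to the reference point below the vanishing point is within
    the preset range (step S13)."""
    x0, y0 = ref_point
    left_cluster, right_cluster = [], []
    for slope, intercept in lines:
        if point_line_distance(slope, intercept, x0, y0) > max_dist:
            continue
        if slope > 0:    # positive slope: left-side straight line (patent's convention)
            left_cluster.append((slope, intercept))
        elif slope < 0:  # negative slope: right-side straight line
            right_cluster.append((slope, intercept))
    return left_cluster, right_cluster
```

The line closest to the reference point in each cluster would then serve as the basis for the fit of step S14.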
In step S2, as shown in fig. 2, a point is selected directly below the vanishing point, and a horizontal line is drawn through the selected point; the horizontal line intersects the left and right first lane lines at a first intersection point PL and a second intersection point PR, respectively.
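With the first lane lines in slope-intercept form, PL and PR follow directly from the line equations. A minimal sketch (the line parameters and the y-coordinate of the selected point are illustrative assumptions):

```python
def horizontal_intersection(slope, intercept, y0):
    """x-coordinate where the line y = slope*x + intercept meets the horizontal line y = y0."""
    if slope == 0:
        raise ValueError("a lane line cannot be horizontal")
    return (y0 - intercept) / slope

# Example: left first lane line y = 0.8x - 40, right first lane line y = -0.7x + 600,
# horizontal line through a selected point at y0 = 400 (all values assumed):
PL = (horizontal_intersection(0.8, -40.0, 400.0), 400.0)   # first intersection point
PR = (horizontal_intersection(-0.7, 600.0, 400.0), 400.0)  # second intersection point
```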
Since a vehicle driving on a road is most affected by the vehicles and road conditions in the current lane and the adjacent lanes, the multi-lane detection of the present invention considers at most four lane lines. When the current lane is a middle lane, the two lane lines on each side of the vehicle are detected, as shown in fig. 3A; when the vehicle is in the leftmost lane of the road, only the current lane and the lane adjacent on the right are detected, and when the vehicle is in the rightmost lane, only the current lane and the lane adjacent on the left are detected, as shown in figs. 3B and 3C, respectively.
Therefore, in step S3, it is first determined whether there are adjacent lanes on the left and right sides of the vehicle, if there are adjacent lanes on the left side, each candidate straight line of the second lane line on the left side is determined, the distance between the first intersection point PL and each candidate straight line of the second lane line on the left side is calculated, each candidate straight line with the distance within a predetermined range is formed into a straight line cluster of the second lane line on the left side, and then the second lane line on the left side is fitted according to each straight line in the straight line cluster; and if the adjacent lane exists on the right side, determining each candidate straight line of the second lane line on the right side, calculating the distance between the second intersection point PR and each candidate straight line of the second lane line on the right side, forming a straight line cluster of the second lane line on the right side by using each candidate straight line with the distance within a preset range, and fitting the second lane line on the right side according to each straight line in the straight line cluster.
In step S3, the candidate straight lines of the left second lane line are determined as follows: on the left side of the left first lane line, straight lines whose slope is positive and smaller than that of the left first lane line are detected, and from these, the straight lines whose included angle with the left first lane line is larger than a predetermined angle (e.g., 10 degrees) are selected as candidate lines for the left second lane line. Likewise, the candidate straight lines of the right second lane line are determined as follows: on the right side of the right first lane line, straight lines whose slope is negative and larger than that of the right first lane line are detected, and from these, the straight lines whose included angle with the right first lane line is larger than a predetermined angle (e.g., 10 degrees) are selected as candidate lines for the right second lane line.
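The candidate selection for the left second lane line can be sketched as below; the 10-degree threshold follows the example in the text, and the (slope, intercept) line representation is an assumption carried over from the description:

```python
import math

def left_second_candidates(lines, first_left, min_angle_deg=10.0):
    """Keep lines whose slope is positive and smaller than the left first lane
    line's slope, and whose angle to the first lane line exceeds min_angle_deg."""
    m1, _ = first_left
    candidates = []
    for m2, b2 in lines:
        if not (0 < m2 < m1):
            continue
        # included angle between two lines with slopes m1 and m2
        angle = math.degrees(math.atan(abs((m1 - m2) / (1 + m1 * m2))))
        if angle > min_angle_deg:
            candidates.append((m2, b2))
    return candidates
```

The right-side case is symmetric, with the slope condition replaced by "negative and larger than the right first lane line's slope".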
In addition, in step S3, the left second lane line is fitted by: selecting, from the straight line cluster of the left second lane line, all straight lines that intersect the left and right first lane lines at the vanishing point, and fitting the left second lane line from the selected straight lines. The right second lane line is fitted in the same way from its own straight line cluster.
On the other hand, the invention provides a multi-lane tracking method based on the multi-lane detection method. The method adopts a Kalman filter or a particle filter to track each lane line detected by the multi-lane detection method and predict the position of each lane line at the next moment. This facilitates determining the region of interest of the image at the next moment, in which straight lines are selected as lane candidates, providing a basis for the next round of lane line detection.
The principle of lane line tracking is explained below using a kalman filter as an example:
Xp=A*X (1)
In formula (1), A is the state transition matrix, whose parameters include the current vehicle's speed and steering angle; X is the current state vector, whose components include the slope and intercept of the lane line; Xp is the predicted value.
Pp = A * P * A^T + Q (2)
In equation (2), P is the current estimation error and Q is the process noise; Pp, the sum of the propagated error and the noise, is the predicted error, referred to below as the process error.
K=Pp/(Pp+R) (3)
Equation (3) calculates the Kalman gain: Pp is the process error, R is the measurement error, and K measures the proportion of the process error in the total error (process error + measurement error). The tracking result is given by equation (4):
X(k+1) = Xp + K * (Z - Xp) (4)
In formula (4), Z is the measurement; the tracking result is thus the sum of the predicted value and a correction value, where:
correction value = Kalman gain * (measured value - predicted value) (5)
After the tracking result is corrected, the process error is also corrected:
P(k+1) = (I - K) * Pp (6).
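Equations (1)-(6) amount to one predict/update cycle of a standard Kalman filter. The scalar sketch below tracks a single lane-line parameter (e.g. the slope); the constant transition a and the noise values q and r are illustrative assumptions, not values from the patent:

```python
def kalman_step(x, p, z, a=1.0, q=0.01, r=0.1):
    """One predict/update cycle following equations (1)-(6), scalar form.
    x: current state estimate, p: current process error, z: new measurement."""
    # Predict: equations (1) and (2)
    xp = a * x                   # Xp = A * X
    pp = a * p * a + q           # Pp = A * P * A^T + Q
    # Update: equations (3)-(6)
    k = pp / (pp + r)            # Kalman gain
    x_next = xp + k * (z - xp)   # tracking result = prediction + correction
    p_next = (1 - k) * pp        # corrected process error
    return x_next, p_next

# Usage: repeatedly feeding the measurement of a steady lane-line slope
# pulls the estimate toward it while the process error settles.
x, p = 0.0, 1.0
for _ in range(10):
    x, p = kalman_step(x, p, z=0.8)
```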
In addition, the invention designs a buffer for each lane line; the buffer is updated differently in different states, and figs. 4A-4C illustrate the update in three cases.
Fig. 4A shows the update of the buffer in the normal driving state. At time T(k-1), L(k-1) is output as the predicted value of the lane line parameters at time T(k); at time T(k), after lane detection, the lane tracking algorithm outputs L(k) as the tracking result, which is used as the predicted value of the lane line parameters at time T(k+1).
Fig. 4B shows the update of the buffer when no valid lane line is detected. At time T(k-1), L(k-1) is output as the predicted value of the lane line parameters at time T(k); at time T(k), no lane line is detected and lane tracking likewise produces no result, in which case L(k-1) is still used as the tracking result at time T(k) and as the predicted value of the lane line parameters at time T(k+1).
Fig. 4C shows the case where the buffer is cleared due to a lane change. If the vehicle is detected to be changing lanes, the buffer is emptied; after a lane line is again detected normally, the predicted value is written back into the buffer.
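The three buffer-update cases of figs. 4A-4C can be expressed as one small decision rule (the argument names are illustrative assumptions; a lane line is represented by its parameter tuple):

```python
def update_lane_buffer(buffer, detection, lane_change):
    """Return the value to carry forward as the next-frame prediction.
    buffer: previous tracking result (or None), detection: this frame's
    tracking output (or None if no lane line was detected),
    lane_change: whether a lane change is in progress."""
    if lane_change:
        return None          # fig. 4C: clear the buffer during a lane change
    if detection is not None:
        return detection     # fig. 4A: normal driving, store the new result
    return buffer            # fig. 4B: no detection, reuse the previous value
```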
The foregoing describes only some embodiments of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements shall also fall within the protection scope of the present invention.
Claims (9)
1. A multilane detection method, characterized in that it comprises the steps of:
step S1, shooting an image in front of the vehicle, determining a candidate area in the image, and then detecting a first lane line on the left side and the right side of the vehicle in the candidate area by using a single lane line detection method;
step S2, selecting a point directly below the vanishing point, and drawing a horizontal line through the selected point, wherein the horizontal line intersects the first lane lines on the left and right sides at a first intersection point and a second intersection point, respectively;
step S3, judging whether adjacent lanes exist on the left side and the right side of the vehicle, if so, determining each candidate straight line of the second lane line on the left side, calculating the distance between the first intersection point and each candidate straight line of the second lane line on the left side, forming each candidate straight line with the distance within a preset range into a straight line cluster of the second lane line on the left side, and fitting the second lane line on the left side according to each straight line in the straight line cluster; and if the adjacent lanes exist on the right side, determining each candidate straight line of the second lane line on the right side, calculating the distance between the second intersection point and each candidate straight line of the second lane line on the right side, forming a straight line cluster of the second lane line on the right side by using each candidate straight line with the distance within a preset range, and fitting the second lane line on the right side according to each straight line in the straight line cluster.
2. The multilane detection method according to claim 1, wherein said single lane line detection method in step S1 includes the steps of:
step S11, reading an image in front of the vehicle, and preprocessing the image;
step S12 of detecting a straight line in the image, and setting a straight line with a positive slope on the left side of the vehicle as a left-side straight line and a straight line with a negative slope on the right side of the vehicle as a right-side straight line;
step S13, a reference point directly below the vanishing point is selected, the distance between the reference point and each left-side straight line and each right-side straight line is calculated, the left-side straight lines whose distance is within a preset range form the straight line cluster of the left first lane line, and the right-side straight lines whose distance is within the preset range form the straight line cluster of the right first lane line;
and step S14, fitting the left and right first lane lines, taking the left and right straight lines closest to the reference point as the basis and the other straight lines in the respective straight line clusters as assistance.
3. The multilane detection method according to claim 2, wherein said step S12 detects straight lines in said image using the Hough transform.
4. The multilane detection method according to claim 1, wherein in said step S3, the candidate straight lines of the left second lane line are determined by: detecting, on the left side of the left first lane line, straight lines whose slope is positive and smaller than that of the left first lane line, and selecting from them the straight lines whose included angle with the left first lane line is larger than a predetermined angle as candidate lines for the left second lane line.
5. The multilane detection method according to claim 1, wherein in said step S3, the candidate straight lines of the right second lane line are determined by: detecting, on the right side of the right first lane line, straight lines whose slope is negative and larger than that of the right first lane line, and selecting from them the straight lines whose included angle with the right first lane line is larger than a predetermined angle as candidate lines for the right second lane line.
6. The multilane detection method according to claim 1, characterized in that in said step S3, a left second lane line is fitted by: and selecting all straight lines which are intersected with the left and right first lane lines at vanishing points from the straight line cluster of the left second lane line, and fitting the left second lane line according to the selected straight lines.
7. The multilane detection method according to claim 1, characterized in that in said step S3, a right second lane line is fitted by: and selecting all straight lines which are intersected with the left and right first lane lines at vanishing points from the straight line cluster of the right second lane line, and fitting the right second lane line according to the selected straight lines.
8. A multilane tracking method characterized in that it tracks each lane line detected by the multilane detection method of any of the preceding claims 1-7 to predict the position of each lane line at the next instant.
9. The multilane tracking method according to claim 8, characterized in that it employs a kalman filter or a particle filter for tracking each lane line.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710568932.XA CN107480592B (en) | 2017-07-13 | 2017-07-13 | Multi-lane detection method and tracking method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107480592A CN107480592A (en) | 2017-12-15 |
CN107480592B true CN107480592B (en) | 2020-06-12 |
Family
ID=60595584
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710568932.XA Active CN107480592B (en) | 2017-07-13 | 2017-07-13 | Multi-lane detection method and tracking method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107480592B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108961146B (en) * | 2018-07-19 | 2023-07-21 | 深圳地平线机器人科技有限公司 | Method and device for rendering perception map |
CN109284674B (en) | 2018-08-09 | 2020-12-08 | 浙江大华技术股份有限公司 | Method and device for determining lane line |
CN110967025B (en) * | 2018-09-30 | 2022-05-13 | 毫末智行科技有限公司 | Lane line screening method and system |
CN113254563B (en) * | 2021-06-18 | 2022-04-12 | 智道网联科技(北京)有限公司 | Road number generation method and related device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101837780A (en) * | 2009-03-18 | 2010-09-22 | 现代自动车株式会社 | A lane departure warning system using a virtual lane and a system according to the same |
CN102324017A (en) * | 2011-06-09 | 2012-01-18 | 中国人民解放军国防科学技术大学 | FPGA (Field Programmable Gate Array)-based lane line detection method |
CN103440649A (en) * | 2013-08-23 | 2013-12-11 | 安科智慧城市技术(中国)有限公司 | Detection method and device for lane boundary line |
CN106529443A (en) * | 2016-11-03 | 2017-03-22 | 温州大学 | Method for improving detection of lane based on Hough transform |
2017
- 2017-07-13 CN CN201710568932.XA patent/CN107480592B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN107480592A (en) | 2017-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6978491B2 (en) | Image processing methods for recognizing ground markings, and systems for detecting ground markings | |
CN107480592B (en) | Multi-lane detection method and tracking method | |
CN105270410B (en) | Exact curvature algorithm for estimating for the path planning of autonomous land vehicle | |
US10776634B2 (en) | Method for determining the course of the road for a motor vehicle | |
US6718259B1 (en) | Adaptive Kalman filter method for accurate estimation of forward path geometry of an automobile | |
WO2021042856A1 (en) | Method and device for determining lane centerline | |
CN111932901B (en) | Road vehicle tracking detection apparatus, method and storage medium | |
US10325163B2 (en) | Vehicle vision | |
JP6838285B2 (en) | Lane marker recognition device, own vehicle position estimation device | |
CN112394725B (en) | Prediction and reaction field of view based planning for autopilot | |
JP5742558B2 (en) | POSITION DETERMINING DEVICE, NAVIGATION DEVICE, POSITION DETERMINING METHOD, AND PROGRAM | |
WO2018212292A1 (en) | Information processing device, control method, program and storage medium | |
JPWO2017163366A1 (en) | Runway estimation method and runway estimation apparatus | |
CN112781599A (en) | Method for determining the position of a vehicle | |
EP3605500B1 (en) | Output device, control method, program, and storage medium | |
WO2022078342A1 (en) | Dynamic occupancy grid estimation method and apparatus | |
JP2018084960A (en) | Self-position estimation method and self-position estimation device | |
US20200062252A1 (en) | Method and apparatus for diagonal lane detection | |
EP3288260B1 (en) | Image processing device, imaging device, equipment control system, equipment, image processing method, and carrier means | |
JP5682302B2 (en) | Traveling road estimation device, method and program | |
JP5742559B2 (en) | POSITION DETERMINING DEVICE, NAVIGATION DEVICE, POSITION DETERMINING METHOD, AND PROGRAM | |
CN114817765A (en) | Map-based target course disambiguation | |
JP4225242B2 (en) | Travel path recognition device | |
WO2020095673A1 (en) | Vehicle-mounted control device | |
CN112633124A (en) | Target vehicle judgment method for automatic driving vehicle and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||