CN113793356B - Lane line detection method and device

Lane line detection method and device

Info

Publication number
CN113793356B
CN113793356B
Authority
CN
China
Prior art keywords
edge
parameters
video frame
lane
edges
Prior art date
Legal status
Active
Application number
CN202111105791.0A
Other languages
Chinese (zh)
Other versions
CN113793356A (en)
Inventor
李映辉
张丙林
周志鹏
李冰
廖瑞华
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202111105791.0A
Publication of CN113793356A
Application granted
Publication of CN113793356B

Classifications

    • G PHYSICS; G06 COMPUTING; CALCULATING OR COUNTING; G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; G06T7/00 Image analysis
    • G06T7/13 Edge detection
    • G06T7/181 Segmentation; Edge detection involving edge growing; involving edge linking
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The embodiment of the application discloses a lane line detection method and a lane line detection device, relating to the technical fields of automatic driving, internet of vehicles and intelligent cabins. One specific embodiment of the lane line detection method comprises the following steps: detecting edges in a current video frame; determining a candidate edge set based on the detected edges; fitting each edge in the candidate edge set with a lane imaging model whose parameters have been determined; calculating, for each edge in the candidate edge set, the error between the edge and its fitting result; selecting the edges whose calculated error is less than or equal to a predetermined error; and, in response to the number of selected edges being greater than or equal to 4, determining the lane lines based on the fitting results of the selected edges. Because the fitting result is obtained from the edges in the candidate edge set, the information of multiple edges is exploited effectively during fitting, which improves the stability of the lane lines determined from the fitting result.

Description

Lane line detection method and device
Case division information
This application is a divisional application of the Chinese patent application entitled "Lane line detection method and device", filed on September 30, 2018 under application number 201811159602.6.
Technical Field
The application relates to the field of computer technology, in particular to the field of electronic maps, and specifically to a lane line detection method and device.
Background
In lane line detection applications, it is necessary to fit the detected lane lines to obtain driving parameters of the current road.
At present, multiple lane lines are usually fitted individually with straight lines or polynomials, a RANSAC algorithm is used during fitting, and external calibration parameters of the camera are needed during filtering.
However, current lane line fitting methods have the following problems: (1) each lane line is fitted independently, so the information of the other lane lines cannot be exploited to improve fitting stability; (2) external calibration parameters of the camera are required during filtering, which limits use in settings without external calibration; (3) model parameters cannot be tracked effectively between frames; and (4) algorithms such as RANSAC consume considerable hardware performance.
Disclosure of Invention
The embodiment of the application provides a lane line detection method and a lane line detection device.
In a first aspect, an embodiment of the present application provides a lane line detection method, including: detecting edges in a current video frame; determining a candidate edge set based on the detected edges; fitting each edge in the candidate edge set with a lane imaging model whose parameters have been determined; calculating, for each edge in the candidate edge set, the error between the edge and its fitting result; selecting the edges whose calculated error is less than or equal to a predetermined error; and determining the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
In some embodiments, the parameters of the lane imaging model are determined as follows: in response to obtaining the parameters of the lane imaging model of the previous video frame from a database, determining those parameters as the parameters of the lane imaging model of the current video frame; in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, fitting each edge in the candidate edge set in a data-fitting parameter-determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the data-fitting parameter-determination step comprises: for every combination of two edges in the candidate edge set, determining a set of lane imaging model parameters by a data fitting method; for the lane imaging model given by each set of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and determining the set of parameters with the largest count, that count being greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, the data-fitting parameter-determination step further comprises: if no set of parameters both has the largest count and a count greater than 4, taking the next video frame as the current video frame and re-executing the steps of determining a candidate edge set based on the detected edges and determining the parameters of the lane imaging model of the current video frame by fitting each edge in the candidate edge set.
In some embodiments, determining a set of lane imaging model parameters by a data fitting method includes determining them using at least one of the following data fitting methods: least squares, Hough transform and maximum a posteriori estimation.
In some embodiments, in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, the data-fitting parameter-determination step comprises: determining the vanishing point parameters of the lane imaging model of the current video frame from the external parameters of the calibrated camera that captured the video frame; and using these vanishing point parameters when fitting each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, determining the candidate edge set based on the detected edges comprises: determining the candidate edge set based on the number of pixels included in each detected edge; or determining the candidate edge set based on both the number of pixels included in each detected edge and the blank area adjacent to each detected edge.
In some embodiments, determining the candidate edge set based on the number of pixels included in each detected edge and the blank area adjacent to each detected edge comprises: sorting the detected edges by the number of pixels they include, from most to fewest; adding a predetermined number of edges to the candidate edge set in that length order; sorting the detected edges by the size of their adjacent blank areas, from largest to smallest; and adding a preset number of edges to the candidate edge set in that blank-area order.
In some embodiments, the lane imaging model comprises: u − u₀ = A(v − v₀) + B/(v − v₀), where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; in the same frame of image, different lane lines differ only in the value of A.
In some embodiments, the lane imaging model comprises: u − u₀ = Σ aᵢ(v − v₀)^i, where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and aᵢ is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the method further comprises: in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4, taking the next video frame as the current video frame and performing the lane line detection method on the new current video frame.
In a second aspect, an embodiment of the present application provides a lane line detection apparatus, including: an edge detection unit configured to detect edges in a current video frame; a set determination unit configured to determine a candidate edge set based on the detected edges; an edge fitting unit configured to fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; an error calculation unit configured to calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; an edge selection unit configured to select the edges whose calculated error is less than or equal to a predetermined error; and a lane line determination unit configured to determine the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
In some embodiments, the parameters of the lane imaging model in the edge fitting unit are determined as follows: in response to obtaining the parameters of the lane imaging model of the previous video frame from a database, determining those parameters as the parameters of the lane imaging model of the current video frame; in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, fitting each edge in the candidate edge set in a data-fitting parameter-determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit further comprises: for every combination of two edges in the candidate edge set, determining a set of lane imaging model parameters by a data fitting method; for the lane imaging model given by each set of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and determining the set of parameters with the largest count, that count being greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit further comprises: if no set of parameters both has the largest count and a count greater than 4, taking the next video frame as the current video frame and re-executing the steps of determining a candidate edge set based on the detected edges and determining the parameters of the lane imaging model of the current video frame by fitting each edge in the candidate edge set.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit further comprises: determining a set of lane imaging model parameters using at least one of the following data fitting methods: least squares, Hough transform and maximum a posteriori estimation.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit further comprises: determining the vanishing point parameters of the lane imaging model of the current video frame from the external parameters of the calibrated camera that captured the video frame, in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database; and using these vanishing point parameters when fitting each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the set determination unit is further configured to: determine the candidate edge set based on the number of pixels included in each detected edge; or determine the candidate edge set based on both the number of pixels included in each detected edge and the blank area adjacent to each detected edge.
In some embodiments, the set determination unit is further configured to: sort the detected edges by the number of pixels they include, from most to fewest; add a predetermined number of edges to the candidate edge set in that length order; sort the detected edges by the size of their adjacent blank areas, from largest to smallest; and add a preset number of edges to the candidate edge set in that blank-area order.
In some embodiments, the lane imaging model in the edge fitting unit comprises: u − u₀ = A(v − v₀) + B/(v − v₀), where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; in the same frame of image, different lane lines differ only in the value of A.
In some embodiments, the lane imaging model in the edge fitting unit comprises: u − u₀ = Σ aᵢ(v − v₀)^i, where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and aᵢ is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the apparatus further comprises: a video frame updating unit configured to take the next video frame as the current video frame and perform the lane line detection method on the new current video frame, in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4.
In a third aspect, an embodiment of the present application provides an apparatus, including: one or more processors; and a storage device for storing one or more programs; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the above.
In a fourth aspect, embodiments of the present application provide a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method as described in any of the above.
The lane line detection method and device provided by the embodiments of the application first detect edges in a current video frame; then determine a candidate edge set based on the detected edges; then fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; then calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; then select the edges whose calculated error is less than or equal to the predetermined error; and finally, in response to the number of selected edges being greater than or equal to 4, determine the lane lines based on the fitting results of the selected edges. In this process, the fitting result is obtained from the edges in the candidate edge set, and the information of multiple edges is exploited effectively during fitting, which increases the stability of the lane lines determined from the fitting result.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading the detailed description of non-limiting embodiments, made with reference to the following drawings, in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart diagram of one embodiment of a lane line detection method according to an embodiment of the present application;
fig. 3a to 3f are schematic views of an application scenario according to an embodiment of the present application;
FIG. 4 is a flow diagram of one embodiment of a method of determining parameters of a lane imaging model of a current video frame, according to an embodiment of the present application;
FIG. 5 is a schematic structural view of one embodiment of a lane line detection apparatus of the present application;
FIG. 6 is a schematic diagram of a computer system suitable for use in implementing embodiments of the present application.
Detailed Description
The present application is described in further detail below with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the lane line detection method or lane line detection apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and servers 105, 106. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the servers 105, 106. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user 110 may interact with the servers 105, 106 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as an electronic map type application, a search engine type application, a shopping type application, an instant messaging tool, a mailbox client, social platform software, a video playing type application, and the like, can be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be hardware or software. When they are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablet computers, e-book readers, MP3 (Moving Picture Experts Group Audio Layer III) players, MP4 (Moving Picture Experts Group Audio Layer IV) players, laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they can be installed in the electronic devices listed above, and may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
The servers 105, 106 may be servers providing various services, such as background servers providing support for the terminal devices 101, 102, 103. The background server can analyze, store or calculate the data submitted by the terminal and push the analysis, storage or calculation result to the terminal equipment.
It should be noted that, in practice, the lane line detection method provided in the embodiment of the present application may be performed by the terminal devices 101, 102, 103 or the servers 105, 106, and the lane line detection apparatus may also be provided in the terminal devices 101, 102, 103 or the servers 105, 106.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, fig. 2 illustrates a flow 200 of one embodiment of a lane line detection method according to the present application. The lane line detection method comprises the following steps:
Step 201, detecting edges in a current video frame.
In this embodiment, the execution body on which the lane line detection method runs (for example, the terminal or the server shown in fig. 1) may read the video captured by a camera on a local or remote electronic device, and take the frame of the pulled video that currently needs lane line detection as the current video frame.
Thereafter, the execution body may detect edges in the current video frame. The purpose of detecting edges in the current video frame is to identify the points in the digital image where brightness changes significantly. Significant changes in image attributes typically reflect important events and property changes, including discontinuities in depth, discontinuities in surface orientation, changes in material properties, and changes in scene illumination.
The edges in the current video frame may be detected with any existing or future-developed edge detection method, which is not limited by this application. For example, search-based and zero-crossing-based edge detection methods may be employed.
A search-based edge detection method first computes the edge strength, typically expressed as a first-derivative quantity such as the gradient modulus, then estimates the local direction of the edge, typically the gradient direction, and searches along this direction for the local maximum of the gradient modulus.
A zero-crossing-based method locates edges at the zero crossings of a second-derivative quantity computed from the image, typically the zero crossings of the Laplacian or of a nonlinear differential expression.
Filtering is often necessary as pre-processing for edge detection; Gaussian filtering is typically used.
Edge detection methods apply a measure of boundary strength, which is essentially different from smoothing filtering. Just as many edge detection methods rely on computing image gradients, they use different kinds of filters to estimate the gradient in the x direction and in the y direction.
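As a concrete illustration of the pipeline described above (a hedged sketch, not an implementation prescribed by this application; OpenCV's Canny detector, the Gaussian kernel size, and the thresholds are assumptions), edge detection on a video frame might look like the following, with each connected contour returned as one candidate edge:

```python
import cv2

def detect_edges(frame_bgr, low_thresh=50, high_thresh=150):
    """Return one (N, 2) array of (u, v) pixel coordinates per detected edge.

    Gaussian filtering is applied first as pre-processing; the Canny
    detector then estimates gradients in the x and y directions, keeps
    points of locally maximal gradient modulus, and applies hysteresis
    thresholding; finally each connected contour of the edge map is
    extracted as a separate candidate edge.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), sigmaX=1.4)
    edge_map = cv2.Canny(blurred, low_thresh, high_thresh)
    contours, _ = cv2.findContours(edge_map, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_NONE)
    return [c.reshape(-1, 2) for c in contours]
```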
It should be appreciated that cameras used to capture the video are typically required to meet installation requirements imposed by the needs of the detection results. For example, the pitch and yaw angles of the camera should be within a certain range, so that the vanishing point in the image (the point where the extensions of parallel scene lines intersect) stays as close as possible to the image center, and the roll angle of the camera should not exceed 5 degrees.
Step 202, determining a candidate edge set based on the detected edges.
In this embodiment, based on the edges detected in step 201, the detected edges may be used directly as the candidate edge set, or they may be screened and the edges that pass the screening used as the candidate edge set.
In an alternative implementation of this embodiment, determining the candidate edge set based on the detected edges may include: determining the candidate edge set based on the number of pixels included in each detected edge; or determining the candidate edge set based on both the number of pixels included in each detected edge and the blank area adjacent to each detected edge.
In this implementation, since the number of pixels in an edge determines its length, the edges likely to be lane lines can be identified by their length and added to the candidate edge set.
Considering that in practical application scenarios the blank area adjacent to a lane line is generally larger than that adjacent to a non-lane-line edge, likely lane line edges can additionally be identified by the size of their adjacent blank areas, and the edges identified by the two criteria can both be added to the candidate edge set.
In some optional implementations of this embodiment, determining the candidate edge set based on the number of pixels included in each detected edge and the blank area adjacent to each detected edge includes: sorting the detected edges by the number of pixels they include, from most to fewest; adding a predetermined number of edges to the candidate edge set in that length order; sorting the detected edges by the size of their adjacent blank areas, from largest to smallest; and adding a preset number of edges to the candidate edge set in that blank-area order.
In this implementation, part of the candidate edges are added to the candidate edge set in order of edge length, and another part in order of adjacent blank-area size. The edges in the candidate edge set are the candidate lane line edges. The predetermined number and the preset number may each be set empirically or manually.
In a specific embodiment, the 8 longest edges may be taken as candidate edges, and the 8 edges with the largest adjacent blank areas may likewise be taken as candidate edges, yielding the candidate edge set. It should be appreciated that the 8 longest edges may coincide with the 8 edges having the largest adjacent blank areas, so the candidate edge set contains between 8 and 16 edges.
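A minimal sketch of this selection step follows; the helper blank_area_of is an assumed (hypothetical) callable that returns the adjacent blank-area size of an edge, and the edge arrays are like those produced by the detector sketched above:

```python
def build_candidate_set(edges, blank_area_of, top_n=8):
    """Union of the top_n longest edges and the top_n edges with the
    largest adjacent blank areas; the two groups may overlap, so the
    result contains between top_n and 2 * top_n edges.
    """
    by_length = sorted(edges, key=len, reverse=True)[:top_n]
    by_blank = sorted(edges, key=blank_area_of, reverse=True)[:top_n]
    # Deduplicate by object identity, since numpy arrays are unhashable.
    candidates = {id(e): e for e in by_length}
    for e in by_blank:
        candidates.setdefault(id(e), e)
    return list(candidates.values())
```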
Step 203, fitting each edge in the candidate edge set by using the lane imaging model with the determined parameters.
In this embodiment, the execution body may fit each edge in the candidate edge set using a lane imaging model whose parameters have been determined. The lane imaging model can generally be implemented with a function that models the imaging of lane lines, for example a linear equation or a polynomial.
Here, the parameters of the lane imaging model may be the parameters of the lane imaging model of a nearby video frame, or parameters determined from a data-fitting result on the current video frame.
In some alternative implementations of this embodiment, the parameters of the lane imaging model may be determined as follows: in response to obtaining the parameters of the lane imaging model of the previous video frame from a database, determining those parameters as the parameters of the lane imaging model of the current video frame; in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, fitting each edge in the candidate edge set in a data-fitting parameter-determination step to determine the parameters of the lane imaging model of the current video frame.
In this implementation, because the fitting result obtained from the edges in the candidate edge set draws on multiple edges, it adapts to a wide range of scenes; and because consecutive video frames are continuous, the parameters of the lane imaging model of the previous video frame can be reused when fitting the edges of the current video frame.
In some optional implementations of this embodiment, in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, the data-fitting parameter-determination step comprises: determining the vanishing point parameters of the lane imaging model of the current video frame from the external parameters of the calibrated camera that captured the video frame; and using these vanishing point parameters when fitting each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
In this implementation, if the external parameters of the calibrated camera that captured the video frame are available, the vanishing point parameters of the lane imaging model can be determined from them, which reduces the computation needed to determine the model parameters and improves the efficiency of determining them.
In some optional implementations of the present embodiment, the lane imaging model includes: u − u₀ = A(v − v₀) + B/(v − v₀), where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; in the same frame of image, different lane lines differ only in the value of A.
In this implementation, the lane imaging model can model straight lanes and curves on the image frame at the same time, which improves the accuracy of the lane imaging model, gives it wide adaptability, and facilitates inter-frame tracking. In a specific example, the candidate edge set contains k groups of lane line feature points; with the value of v₀ and the variances of the other parameters known, the lane line fitting can be converted into a weighted least squares fit, and the least squares method is an efficient data fitting method.
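Since the paragraph above reduces lane line fitting to least squares once v₀ is known, a hedged sketch of such a joint fit follows; it assumes, per the model above, that all edges of a frame share B while each edge has its own A, and it uses plain rather than weighted least squares for brevity:

```python
import numpy as np

def fit_lane_model(edges, u0, v0):
    """Jointly fit u - u0 = A_k (v - v0) + B / (v - v0) over k edges.

    Each edge k gets its own coefficient A_k, while B is shared by all
    edges of the frame; stacking all edges into one least-squares system
    is what lets a single fit exploit several lane lines at once.
    Returns (A, B) with A a length-k array and B a scalar.
    """
    k = len(edges)
    blocks, rhs = [], []
    for i, edge in enumerate(edges):
        u = edge[:, 0].astype(float)
        dv = edge[:, 1].astype(float) - v0
        keep = np.abs(dv) > 1e-6     # avoid the singularity at the horizon
        u, dv = u[keep], dv[keep]
        row = np.zeros((u.size, k + 1))
        row[:, i] = dv               # column for this edge's A_i
        row[:, k] = 1.0 / dv         # shared column for B
        blocks.append(row)
        rhs.append(u - u0)
    params, *_ = np.linalg.lstsq(np.vstack(blocks), np.concatenate(rhs),
                                 rcond=None)
    return params[:k], params[k]
```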
In some optional implementations of the present embodiment, the lane imaging model includes: u − u₀ = Σ aᵢ(v − v₀)^i, where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and aᵢ is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In this implementation, the hyperbolic model is u − u₀ = A(v − v₀) + B/(v − v₀). Adopting the Taylor series expansion of the hyperbolic model in its place removes the original first-order parameters while still allowing straight lanes and curves to be modeled simultaneously on the image frame, which improves the accuracy of the lane imaging model, gives it wide adaptability, and facilitates inter-frame tracking.
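Under this polynomial form, each edge can be fitted by ordinary polynomial regression; a minimal sketch (the expansion degree of 3 is an assumption):

```python
import numpy as np

def fit_polynomial_model(edge, u0, v0, degree=3):
    """Fit u - u0 = sum_i a_i (v - v0)^i for a single edge.

    np.polyfit solves the least-squares problem and returns the
    coefficients ordered from the highest power down to the constant.
    """
    dv = edge[:, 1].astype(float) - v0
    du = edge[:, 0].astype(float) - u0
    return np.polyfit(dv, du, degree)
```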
Step 204, for each edge in the candidate edge set, calculating an error between the fitting result of the edge and the edge.
In this embodiment, based on the fitting result that the lane imaging model with determined parameters produces for each edge in the candidate edge set, the execution body may calculate, for each edge, the error between the edge and its fitting result. Here, the error may be the sum of residuals, the sum of absolute values of residuals, or the sum of squares of residuals.
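Taking the sum of squares of residuals as the error, for instance, the per-edge error under the hyperbolic model could be computed as in this sketch:

```python
import numpy as np

def edge_fit_error(edge, u0, v0, A, B):
    """Sum of squared residuals between an edge and the model prediction
    u0 + A (v - v0) + B / (v - v0) at the edge's v coordinates."""
    u = edge[:, 0].astype(float)
    dv = edge[:, 1].astype(float) - v0
    keep = np.abs(dv) > 1e-6
    u, dv = u[keep], dv[keep]
    pred = u0 + A * dv + B / dv
    return float(np.sum((u - pred) ** 2))
```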
Step 205, selecting the edges whose calculated error is less than or equal to the predetermined error.
In this embodiment, a calculated error less than or equal to the predetermined error indicates that the fitting result matches an actual lane line, so the fitting result of that edge can serve as an estimated lane line edge from which the lane line is determined.
Step 206, in response to the number of selected edges being greater than or equal to 4, determining the lane lines based on the fitting results of the selected edges.
In this embodiment, considering that one lane has two lane lines and each lane line comprises two edges, a selection of 4 or more edges indicates that the current image frame contains at least one lane. At this point, the lane lines can be determined based on the fitting results of the selected edges. When determining the lane lines from the lane line edges, factors such as the lane width and the position of the lane center line can be taken into account to screen the selected edges and determine the final lane lines.
In some optional implementations of this embodiment, the lane line detection method further includes: in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4, taking the next video frame as the current video frame and performing the lane line detection method on the new current video frame.
In this implementation, if fewer than 4 edges have an error within the predetermined error, the edges detected in the current video frame do not cover a complete lane (for example, in an image captured while the vehicle is changing lanes), so the next video frame can be taken as the current video frame and the lane line detection method described above performed on it to determine the lane lines.
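Tying steps 201 to 206 together, one frame of the flow could be driven as in the sketch below; it reuses the helper functions sketched earlier, treats max_error and blank_area_of as assumed inputs, and, for simplicity, re-estimates the model parameters on the current frame instead of carrying them over from the previous frame:

```python
def process_frame(frame, u0, v0, max_error, blank_area_of):
    """One pass of flow 200: detect edges, build the candidate set, fit,
    keep edges within the error tolerance, and accept the frame only if
    at least 4 edges (two lane lines with two edges each) survive.

    Returns (kept_edges, A_kept, B) on success, or None to tell the
    caller to move on to the next video frame.
    """
    edges = detect_edges(frame)
    candidates = build_candidate_set(edges, blank_area_of)
    A, B = fit_lane_model(candidates, u0, v0)
    kept = [(edge, A[i]) for i, edge in enumerate(candidates)
            if edge_fit_error(edge, u0, v0, A[i], B) <= max_error]
    if len(kept) < 4:
        return None   # no complete lane in this frame; try the next one
    return [e for e, _ in kept], [a for _, a in kept], B
```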
An exemplary application scenario of the lane line detection method of the present application is described below with reference to figs. 3a to 3e.
Figs. 3a to 3e show schematic flowcharts of one application scenario of the lane line detection method according to the present application.
As shown in fig. 3a, the lane line detection method 300 running in the electronic device 310 may include:
first, detecting edges 302 in a current video frame 301, obtaining the edges in the current video frame as shown in fig. 3a;
then, selecting a predetermined number of edges 303, in order of the number of pixels each detected edge 302 includes from most to fewest, and adding them to the candidate edge set 305, obtaining the candidate edge set shown in fig. 3b;
then, selecting a preset number of edges 304, in order of the blank areas adjacent to the detected edges 302 from largest to smallest, and adding them to the candidate edge set 305, obtaining the candidate edge set shown in fig. 3c;
then, fitting each edge in the candidate edge set 305 using the lane imaging model 306 with determined parameters;
then, for each edge in the candidate edge set 305, calculating the error 307 between the edge and its fitting result;
then, selecting the edges whose calculated error 307 is less than or equal to the predetermined error 308, obtaining the selected edges 309;
thereafter, in response to the number of selected edges 309 being 4 or more, determining a lane line 311 based on the fitting result 310 of the selected edges (shown in fig. 3d), obtaining the lane line shown in fig. 3e.
It should be understood that the above application scenario of the lane line detection method shown in fig. 3 is merely an exemplary description of the lane line detection method, and does not represent a limitation of the method. For example, the steps illustrated in fig. 3 above may be further implemented in greater detail.
The lane line detection method of the above embodiment of the present application may first detect edges in a current video frame; then determine a candidate edge set based on the detected edges; then fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; then calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; then select the edges whose calculated error is less than or equal to the predetermined error; and finally, in response to the number of selected edges being greater than or equal to 4, determine the lane lines based on the fitting results of the selected edges. Because multiple edges in the candidate edge set are used for fitting, the stability of the fitting result is improved, the accuracy of the lane imaging model is improved, the model adapts to a wide range of scenes, and inter-frame tracking is facilitated. Moreover, the filtering process does not need the external calibration parameters of the camera, so the method's range of use is not restricted.
Referring to fig. 4, a flow chart of one embodiment of a method of determining parameters of a lane imaging model of a current video frame according to the present application is shown.
As shown in fig. 4, a flow 400 of the method for determining parameters of a lane imaging model of a current video frame of the present embodiment may include the following steps:
step 401, determining parameters of a set of lane imaging models by using a data fitting method for every two edge combinations in the candidate edge set.
In this implementation, for each combination of two edges in the candidate edge set, the edge points can be substituted into a lane imaging model with unknown parameters, and the unknowns solved to obtain one set of lane imaging model parameters.
Data fitting, also called curve fitting, is the mathematical process of substituting existing data into a numerical expression. In science and engineering, data are obtained by means such as sampling and experimentation, and one often wants a continuous function (that is, a curve) or a more densely spaced discrete equation that agrees with the known data; this process is called fitting.
In some alternative implementations of the present embodiment, determining a set of lane imaging model parameters by a data fitting method includes determining them using at least one of the following data fitting methods: least squares, Hough transform and maximum a posteriori estimation.
In this implementation, when a linear model is fitted to data, the amount of data is generally larger than the number of unknowns in the equation system, which yields an overdetermined system whose equations may be mutually inconsistent, so the system has no exact solution. The least squares method obtains the optimal solution of the overdetermined system under the minimum-squared-error criterion; the heteroscedasticity problem of least squares can be handled using weights. Solving model parameters by maximum likelihood is a search of the parameter space for the parameter point under which the observed feature point set is most probable.
Unlike the Hough transform, in which feature points vote into the parameter space, maximum a posteriori estimation is a matching process from the parameter space to the feature point set. For example, the data fitting method may include a maximum a posteriori estimation implemented on top of least squares.
Step 402, determining, for the lane imaging model given by each set of parameters, the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error.
In this embodiment, since each set of lane imaging model parameters is determined from a combination of two edges, the number of candidate edges to which each set of parameters applies differs. To determine the optimal parameters, the number of edges each set fits must be counted, and the counts then show which set of parameters has the widest applicability.
Step 403, determining the set of parameters with the largest count, that count being greater than 4, as the parameters of the lane imaging model of the current video frame.
In this embodiment, the largest count guarantees that the lane imaging model has the widest applicability, and a count greater than 4 guarantees that the parameters fit at least the 4 edges comprised by one lane.
In some optional implementations of this embodiment, the data-fitting parameter-determination step further comprises: if no set of parameters both has the largest count and a count greater than 4, taking the next video frame as the current video frame and re-executing the steps of determining a candidate edge set based on the detected edges and determining the parameters of the lane imaging model of the current video frame by fitting each edge in the candidate edge set.
In this implementation, if no set of parameters has the largest count greater than 4, no count covers the four edges comprised by one lane, and the current video frame contains no complete lane. The lane lines are therefore determined from the next video frame.
The method for determining the parameters of the lane imaging model of the current video frame described above determines a set of model parameters by data fitting for each combination of two edges in the candidate edge set; counts, for the lane imaging model given by each set of parameters, the number of edges in the candidate edge set whose fitting error is smaller than the predetermined error; and determines the set of parameters with the largest count, that count being greater than 4, as the parameters of the lane imaging model of the current video frame. In this process, the parameters of the lane imaging model that fit the most edges are selected, improving the applicability of the determined parameters.
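A hedged sketch of flow 400 follows; it sweeps every pair of candidate edges exhaustively (where RANSAC would sample randomly), and the closed-form per-edge A under a fixed shared B is an implementation assumption:

```python
from itertools import combinations
import numpy as np

def best_pairwise_params(candidates, u0, v0, max_error):
    """Fit a shared-B hypothesis to every pair of candidate edges, count
    how many candidate edges each hypothesis explains within max_error,
    and return the B of the most supported hypothesis, or None when no
    hypothesis is supported by more than 4 edges.
    """
    best_B, best_count = None, 0
    for i, j in combinations(range(len(candidates)), 2):
        _, B = fit_lane_model([candidates[i], candidates[j]], u0, v0)
        count = 0
        for edge in candidates:
            u = edge[:, 0].astype(float)
            dv = edge[:, 1].astype(float) - v0
            keep = np.abs(dv) > 1e-6
            u, dv = u[keep], dv[keep]
            if dv.size == 0:
                continue
            du = u - u0
            # Closed-form least-squares A for this edge given the fixed B.
            a = np.sum(dv * (du - B / dv)) / np.sum(dv * dv)
            if np.sum((du - a * dv - B / dv) ** 2) <= max_error:
                count += 1
        if count > best_count:
            best_B, best_count = B, count
    return best_B if best_count > 4 else None
```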
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a lane line detection apparatus. This apparatus embodiment corresponds to the method embodiments shown in figs. 2 to 4, and the apparatus is particularly applicable to various electronic devices.
As shown in fig. 5, the lane line detection apparatus 500 of this embodiment may include: an edge detection unit 510 configured to detect edges in a current video frame; a set determination unit 520 configured to determine a candidate edge set based on the detected edges; an edge fitting unit 530 configured to fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; an error calculation unit 540 configured to calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; an edge selection unit 550 configured to select the edges whose calculated error is less than or equal to a predetermined error; and a lane line determination unit 560 configured to determine the lane lines based on the fitting results of the selected edges in response to the number of selected edges being 4 or more.
In some embodiments, the parameters of the lane imaging model in the edge fitting unit 530 are determined as follows: in response to obtaining the parameters of the lane imaging model of the previous video frame from a database, determining those parameters as the parameters of the lane imaging model of the current video frame; in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database, fitting each edge in the candidate edge set in a data-fitting parameter-determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit 530 further comprises: for every combination of two edges in the candidate edge set, determining a set of lane imaging model parameters by a data fitting method; for the lane imaging model given by each set of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and determining the set of parameters with the largest count, that count being greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit 530 further comprises: if no set of parameters both has the largest count and a count greater than 4, taking the next video frame as the current video frame and re-executing the steps of determining a candidate edge set based on the detected edges and determining the parameters of the lane imaging model of the current video frame by fitting each edge in the candidate edge set.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit 530 further comprises: determining a set of lane imaging model parameters using at least one of the following data fitting methods: least squares, Hough transform and maximum a posteriori estimation.
In some embodiments, the determination step for the parameters of the lane imaging model in the edge fitting unit 530 further comprises: determining the vanishing point parameters of the lane imaging model of the current video frame from the external parameters of the calibrated camera that captured the video frame, in response to not obtaining the parameters of the lane imaging model of the previous video frame from the database; and using these vanishing point parameters when fitting each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the set determination unit 520 is further configured to: determine the candidate edge set based on the number of pixels included in each detected edge; or determine the candidate edge set based on both the number of pixels included in each detected edge and the blank area adjacent to each detected edge.
In some embodiments, the set determination unit 520 is further configured to: sort the detected edges by the number of pixels they include, from most to fewest; add a predetermined number of edges to the candidate edge set in that length order; sort the detected edges by the size of their adjacent blank areas, from largest to smallest; and add a preset number of edges to the candidate edge set in that blank-area order.
In some embodiments, the lane imaging model in the edge fitting unit 530 comprises: u − u₀ = A(v − v₀) + B/(v − v₀), where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; in the same frame of image, different lane lines differ only in the value of A.
In some embodiments, the lane imaging model in the edge fitting unit 530 comprises: u − u₀ = Σ aᵢ(v − v₀)^i, where (u₀, v₀) is the image vanishing point position (v = v₀ is the horizon), (u, v) is a coordinate point of an edge in the current video frame, and aᵢ is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the apparatus further comprises: a video frame updating unit 570 configured to take the next video frame as the current video frame and perform the lane line detection method on the new current video frame, in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4.
It should be understood that the elements recited in apparatus 500 may correspond to the various steps in the methods described with reference to fig. 2-4. Thus, the operations and features described above with respect to the method are equally applicable to the apparatus 500 and the units contained therein, and are not described in detail herein.
Referring now to FIG. 6, there is illustrated a schematic diagram of a computer system 600 suitable for use in implementing a server of an embodiment of the present application. The terminal device or server illustrated in fig. 6 is merely an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data required for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, and the like; a storage portion 608 including a hard disk and the like; and a communication portion 609 including a network interface card such as a LAN card or a modem. The communication portion 609 performs communication processing via a network such as the internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read from it can be installed into the storage portion 608 as needed.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 609, and/or installed from the removable medium 611. The above-described functions defined in the method of the present application are performed when the computer program is executed by a Central Processing Unit (CPU) 601. It should be noted that, the computer readable medium described in the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor including an edge detection unit, a set determination unit, an edge fitting unit, an error calculation unit, an edge selection unit, and a lane line determination unit. The names of these units do not, in some cases, limit the units themselves; for example, the edge detection unit may also be described as "a unit that detects edges in the current video frame".
As another aspect, the present application also provides a computer-readable medium, which may be included in the apparatus described in the above embodiments or may exist alone without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: detect edges in a current video frame; determine a candidate edge set based on the detected edges; fit each edge in the candidate edge set using a lane imaging model with determined parameters; calculate, for each edge in the candidate edge set, the error between the fitting result of the edge and the edge; select the edges whose calculated error is less than or equal to a predetermined error; and determine the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
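For concreteness, the processing steps carried by such a program can be sketched as follows. This is a minimal illustrative sketch in Python, not the patented implementation: it assumes OpenCV Canny edge detection and connected-component labeling for the edge and candidate-set steps, the hyperbolic lane imaging model of claim 8 with parameters (u0, v0, B) already determined, made-up thresholds, and edge points lying below the vanishing point; only the per-line coefficient A is fitted for each edge.

```python
import cv2
import numpy as np

def detect_lane_lines(frame, params, err_thresh=2.0):
    """One pass of the pipeline: detect edges, take them directly as the
    candidate set, fit each edge with the already-determined lane imaging
    model, keep the low-error edges, and accept if at least 4 survive."""
    u0, v0, B = params  # vanishing point and shared coefficient B (claim 8)

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edge_img = cv2.Canny(gray, 50, 150)            # step 1: edge detection
    n, labels = cv2.connectedComponents(edge_img)  # split into individual edges

    selected = []
    for i in range(1, n):                          # step 2: candidate edge set
        vs, us = np.nonzero(labels == i)
        keep = vs > v0 + 1                         # model is used below the vanishing point
        if keep.sum() < 2:
            continue
        u = us[keep].astype(float)
        dv = (vs[keep] - v0).astype(float)

        # step 3: per-edge fit; only A varies between lane lines in one
        # frame, so a one-parameter least-squares fit suffices here.
        A = np.dot(dv, u - u0 - B / dv) / np.dot(dv, dv)
        # step 4: error between the fitting result and the edge itself.
        err = np.mean(np.abs(u0 + A * dv + B / dv - u))
        if err <= err_thresh:                      # step 5: keep low-error edges
            selected.append((A, np.column_stack([u, dv + v0])))

    # step 6: the method requires at least 4 selected edges.
    return selected if len(selected) >= 4 else None
```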
The foregoing description is only of the preferred embodiments of the present application and an explanation of the technical principles employed. Persons skilled in the art will appreciate that the scope of the invention referred to in this application is not limited to the specific combinations of the features described above, and is intended to cover other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the invention, for example, embodiments formed by interchanging the above features with (but not limited to) technical features having similar functions disclosed in the present application.

Claims (20)

1. A lane line detection method, comprising:
detecting edges in a current video frame;
determining a candidate edge set based on the detected edges, wherein the determining a candidate edge set based on the detected edges comprises: directly taking the detected edges as the candidate edge set;
fitting each edge in the candidate edge set using a lane imaging model with determined parameters;
calculating, for each edge in the candidate edge set, the error between the fitting result of the edge and the edge;
selecting the edges whose calculated error is less than or equal to a predetermined error;
determining a lane line based on a fitting result of the selected edges in response to the number of the selected edges being greater than or equal to 4;
in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4, taking the next video frame as the current video frame and performing the lane line detection method on the new current video frame; wherein the parameters of the lane imaging model are determined based on the following steps:
in response to obtaining parameters of the lane imaging model of the previous video frame from the database, determining the parameters of the lane imaging model of the previous video frame as parameters of the lane imaging model of the current video frame;
and in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, fitting each edge in the candidate edge set based on a data-fitting parameter determination step and determining the parameters of the lane imaging model of the current video frame.
2. The method of claim 1, wherein the fitting each edge in the candidate edge set based on a data-fitting parameter determination step and determining the parameters of the lane imaging model of the current video frame comprises:
determining a set of lane imaging model parameters using a data fitting method for every combination of two edges in the candidate edge set;
determining, for the lane imaging model given by each set of lane imaging model parameters, the number of edges in the candidate edge set whose error between the fitting result and the edge is smaller than the predetermined error;
and determining, as the parameters of the lane imaging model of the current video frame, the set of lane imaging model parameters for which the determined number of edges is the largest and greater than 4.
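As an illustration only, the pairwise search of claims 2 and 3 can be read as an exhaustive, RANSAC-like hypothesise-and-verify loop. The sketch below assumes the hyperbolic model of claim 8 with a known vanishing point (u0, v0), so each two-edge combination yields a linear least-squares problem in (A_1, A_2, B); it also assumes all edge points lie below the vanishing point so v - v0 never vanishes. The function names and the error threshold are hypothetical.

```python
from itertools import combinations
import numpy as np

def fit_B_pair(edge1, edge2, u0, v0):
    """Estimate the shared coefficient B from two edges. With x = v - v0,
    each edge i satisfies u - u0 = A_i * x + B / x, so (A_1, A_2, B)
    enter linearly and can be solved in closed form by least squares."""
    rows, rhs = [], []
    for k, (u, v) in enumerate((edge1, edge2)):
        dv = v - v0
        a_cols = np.zeros((len(u), 2))
        a_cols[:, k] = dv                        # column multiplying this edge's A_k
        rows.append(np.column_stack([a_cols, 1.0 / dv]))
        rhs.append(u - u0)
    sol, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)
    return sol[2]                                # the shared B

def edge_error(u, v, u0, v0, B):
    """Refit A for one edge under the hypothesis (u0, v0, B) and return
    the mean absolute residual between the fitted curve and the edge."""
    dv = v - v0
    A = np.dot(dv, u - u0 - B / dv) / np.dot(dv, dv)
    return np.mean(np.abs(u0 + A * dv + B / dv - u))

def search_model_params(candidates, u0, v0, err_thresh=2.0):
    """Hypothesise parameters from every two-edge combination, count the
    candidate edges each hypothesis explains, and keep the best set only
    if it explains more than 4 edges (claims 2 and 3)."""
    best_B, best_count = None, 0
    for e1, e2 in combinations(candidates, 2):
        B = fit_B_pair(e1, e2, u0, v0)
        count = sum(edge_error(u, v, u0, v0, B) <= err_thresh
                    for u, v in candidates)
        if count > best_count:
            best_B, best_count = B, count
    # Claim 3: with no hypothesis explaining more than 4 edges, the
    # method gives up on this frame and moves on to the next one.
    return best_B if best_count > 4 else None
```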
3. The method of claim 2, wherein the fitting each edge in the candidate edge set based on the data-fitting parameter determination step and determining the parameters of the lane imaging model of the current video frame further comprises:
and if no set of lane imaging model parameters exists for which the number of edges is the largest and greater than 4, taking the next video frame as the current video frame, and performing, on the new current video frame, the steps of determining a candidate edge set based on the detected edges, fitting each edge in the candidate edge set, and determining the parameters of the lane imaging model of the current video frame.
4. The method of claim 2, wherein the determining a set of lane imaging model parameters using a data fitting method comprises:
determining the set of lane imaging model parameters using at least one of the following data fitting methods: the least squares method, the Hough transform, and maximum a posteriori estimation.
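Claim 4 leaves the choice of fitting machinery open. The least-squares route appears in the sketch above; for the Hough-transform route, one common (and here purely illustrative) use is to extract dominant straight segments from the edge image and take their least-squares intersection as an initial vanishing point estimate, since lane edges converge there. The cv2.HoughLinesP parameters below are assumptions, not the patent's prescription, and the estimate degenerates if all detected segments are parallel.

```python
import cv2
import numpy as np

def vanishing_point_from_hough(edge_img):
    """Estimate (u0, v0) as the point closest, in the least-squares sense,
    to all dominant straight segments found by a probabilistic Hough
    transform. Returns None if no segments are found."""
    segs = cv2.HoughLinesP(edge_img, 1, np.pi / 180, 60,
                           minLineLength=40, maxLineGap=10)
    if segs is None:
        return None
    normals, offsets = [], []
    for x1, y1, x2, y2 in segs[:, 0]:
        d = np.array([x2 - x1, y2 - y1], dtype=float)
        n = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit normal of the segment
        normals.append(n)
        offsets.append(n @ np.array([x1, y1], dtype=float))
    # Solve N p = c: each row asks the point p to lie on one line n . p = c.
    p, *_ = np.linalg.lstsq(np.vstack(normals), np.array(offsets), rcond=None)
    return p  # (u0, v0) in pixel coordinates
```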
5. The method of claim 2, wherein, in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, the fitting each edge in the candidate edge set based on the data-fitting parameter determination step and determining the parameters of the lane imaging model of the current video frame comprises:
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, determining vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame;
and, when fitting each edge in the candidate edge set based on the data-fitting parameter determination step, determining the parameters of the lane imaging model of the current video frame using the vanishing point parameters.
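Claims 5 and 14 obtain the vanishing point from the calibrated camera rather than from the image. A minimal sketch of that computation, assuming a pinhole model with intrinsic matrix K, a world-to-camera rotation R from the extrinsic calibration, and a road direction along the world +z axis (all of which are assumptions for illustration):

```python
import numpy as np

def vanishing_point_from_extrinsics(K, R, road_dir=(0.0, 0.0, 1.0)):
    """The vanishing point of lines parallel to the road is the image of
    the road direction at infinity: K @ R @ d, dehomogenised."""
    d = R @ np.asarray(road_dir, dtype=float)  # road direction in camera coordinates
    p = K @ d                                  # homogeneous image point
    return p[:2] / p[2]                        # (u0, v0)

# A camera looking straight down the road sees the vanishing point at
# the principal point; pitch and yaw in R move it away from there.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])                # plausible 1280x720 intrinsics
print(vanishing_point_from_extrinsics(K, np.eye(3)))  # -> [640. 360.]
```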
6. The method of claim 1, wherein the determining a candidate edge set based on the detected edges comprises:
determining a candidate edge set based on the number of pixels included in each of the detected edges; or
determining a candidate edge set based on the number of pixels included in each of the detected edges and the blank area adjacent to each of the detected edges.
7. The method of claim 6, wherein the determining the candidate edge set based on the number of pixels included in each of the detected edges and the blank area adjacent to each of the detected edges comprises:
sorting the detected edges by length, from the largest to the smallest number of included pixels, to obtain the length-sorted edges;
selecting a preset number of edges in the length-sorted order and adding them to the candidate edge set;
sorting the detected edges by the size of the blank area adjacent to each edge, from largest to smallest, to obtain the edges sorted by adjacent blank area;
and selecting a preset number of edges in the order sorted by adjacent blank area and adding them to the candidate edge set.
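Claims 7 and 16 can be read as two independent rankings merged into one candidate set, as illustrated below. Edges are taken as arrays of pixel coordinates so that their length is the pixel count; blank_area is a hypothetical callable measuring the empty region next to an edge, which the claims do not pin down further here.

```python
def select_candidates(edges, blank_area, k=8):
    """Take the k longest edges and the k edges with the largest adjacent
    blank area, and merge the two selections without duplicates."""
    by_length = sorted(edges, key=len, reverse=True)[:k]
    by_blank = sorted(edges, key=blank_area, reverse=True)[:k]
    seen, candidates = set(), []
    for e in by_length + by_blank:
        if id(e) not in seen:     # an edge picked by both criteria is added once
            seen.add(id(e))
            candidates.append(e)
    return candidates
```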
8. The method of claim 1, wherein the lane imaging model comprises:
u - u_0 = A(v - v_0) + B/(v - v_0), where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; within the same frame of image, only the value of A differs between different lane lines.
9. The method of claim 1, wherein the lane imaging model comprises: u - u_0 = Σ_i a_i (v - v_0)^i, where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and a_i is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
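The relationship between the two claimed model families can be made concrete numerically: fitting the polynomial model of claim 9 to points generated by the hyperbolic model of claim 8 approximates it over the region below the vanishing point. All coefficients below are made up for illustration.

```python
import numpy as np

u0, v0 = 640.0, 360.0          # illustrative vanishing point
A, B = 0.45, 9000.0            # illustrative hyperbolic coefficients

v = np.linspace(v0 + 40.0, v0 + 340.0, 31)
hyper = u0 + A * (v - v0) + B / (v - v0)       # claim 8: u - u0 = A(v-v0) + B/(v-v0)

# Claim 9: u - u0 = sum_i a_i (v - v0)^i. A low-order polynomial fitted
# to the hyperbolic curve approximates it away from the vanishing point.
coeffs = np.polyfit(v - v0, hyper - u0, deg=5)
poly = u0 + np.polyval(coeffs, v - v0)

print(np.max(np.abs(poly - hyper)))            # maximum deviation (a few pixels here)
```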
10. A lane line detection apparatus comprising:
an edge detection unit configured to detect an edge in a current video frame;
a set determination unit configured to determine a candidate edge set based on the detected edges, the set determination unit being further configured to directly take the detected edges as the candidate edge set;
an edge fitting unit configured to fit each edge in the candidate edge set using a lane imaging model with determined parameters;
an error calculation unit configured to calculate, for each edge in the candidate edge set, the error between the fitting result of the edge and the edge;
an edge selection unit configured to select the edges whose calculated error is less than or equal to a predetermined error;
a lane line determination unit configured to determine lane lines based on the fitting results of the selected edges in response to the number of the selected edges being greater than or equal to 4;
a video frame updating unit configured to take the next video frame as the current video frame and perform the lane line detection method on the new current video frame in response to the number of edges whose calculated error is less than or equal to the predetermined error being less than 4; wherein the parameters of the lane imaging model in the edge fitting unit are determined based on the following determination steps:
in response to obtaining parameters of the lane imaging model of the previous video frame from the database, determining the parameters of the lane imaging model of the previous video frame as parameters of the lane imaging model of the current video frame;
and in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, fitting each edge in the candidate edge set based on a data-fitting parameter determination step and determining the parameters of the lane imaging model of the current video frame.
11. The apparatus of claim 10, wherein the determination steps for the parameters of the lane imaging model in the edge fitting unit further comprise:
determining a set of lane imaging model parameters using a data fitting method for every combination of two edges in the candidate edge set;
determining, for the lane imaging model given by each set of lane imaging model parameters, the number of edges in the candidate edge set whose error between the fitting result and the edge is smaller than the predetermined error;
and determining, as the parameters of the lane imaging model of the current video frame, the set of lane imaging model parameters for which the determined number of edges is the largest and greater than 4.
12. The apparatus of claim 11, wherein the determination steps for the parameters of the lane imaging model in the edge fitting unit further comprise:
and if no set of lane imaging model parameters exists for which the number of edges is the largest and greater than 4, taking the next video frame as the current video frame, and performing, on the new current video frame, the steps of determining a candidate edge set based on the detected edges, fitting each edge in the candidate edge set, and determining the parameters of the lane imaging model of the current video frame.
13. The apparatus of claim 11, wherein the determination steps for the parameters of the lane imaging model in the edge fitting unit further comprise:
determining the set of lane imaging model parameters using at least one of the following data fitting methods: the least squares method, the Hough transform, and maximum a posteriori estimation.
14. The apparatus of claim 11, wherein the determination steps for the parameters of the lane imaging model in the edge fitting unit further comprise:
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, determining vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame;
and, when fitting each edge in the candidate edge set based on the data-fitting parameter determination step, determining the parameters of the lane imaging model of the current video frame using the vanishing point parameters.
15. The apparatus of claim 10, wherein the set determination unit is further configured to:
determining a candidate edge set based on the number of pixels included in each of the detected edges; or
determining a candidate edge set based on the number of pixels included in each of the detected edges and the blank area adjacent to each of the detected edges.
16. The apparatus of claim 15, wherein the set determination unit is further configured to:
sorting the detected edges by length, from the largest to the smallest number of included pixels, to obtain the length-sorted edges;
selecting a preset number of edges in the length-sorted order and adding them to the candidate edge set;
sorting the detected edges by the size of the blank area adjacent to each edge, from largest to smallest, to obtain the edges sorted by adjacent blank area;
and selecting a preset number of edges in the order sorted by adjacent blank area and adding them to the candidate edge set.
17. The apparatus of claim 10, wherein the lane imaging model in the edge fitting unit comprises:
u - u_0 = A(v - v_0) + B/(v - v_0), where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; within the same frame of image, only the value of A differs between different lane lines.
18. The apparatus of claim 10, wherein the lane imaging model in the edge fitting unit comprises: u - u_0 = Σ_i a_i (v - v_0)^i, where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and a_i is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
19. A server, comprising:
one or more processors;
a storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-9.
20. A computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method of any of claims 1-9.
CN202111105791.0A 2018-09-30 2018-09-30 Lane line detection method and device Active CN113793356B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111105791.0A CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111105791.0A CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN201811159602.6A CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811159602.6A Division CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Publications (2)

Publication Number Publication Date
CN113793356A CN113793356A (en) 2021-12-14
CN113793356B true CN113793356B (en) 2023-06-23

Family

ID=65161420

Family Applications (3)

Application Number Title Priority Date Filing Date
CN202111106274.5A Active CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN201811159602.6A Active CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111105791.0A Active CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202111106274.5A Active CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN201811159602.6A Active CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Country Status (1)

Country Link
CN (3) CN113792690B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113792690B (en) * 2018-09-30 2023-06-23 百度在线网络技术(北京)有限公司 Lane line detection method and device
CN109934169A (en) * 2019-03-13 2019-06-25 东软睿驰汽车技术(沈阳)有限公司 A kind of Lane detection method and device
CN112050821B (en) * 2020-09-11 2021-08-20 湖北亿咖通科技有限公司 Lane line polymerization method
CN112560680A (en) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 Lane line processing method and device, electronic device and storage medium

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2735033B2 (en) * 1995-06-05 1998-04-02 日本電気株式会社 Lane change detection apparatus and method
US6819779B1 (en) * 2000-11-22 2004-11-16 Cognex Corporation Lane detection system and apparatus
US7409092B2 (en) * 2002-06-20 2008-08-05 Hrl Laboratories, Llc Method and apparatus for the surveillance of objects in images
CN101470801B (en) * 2007-12-24 2011-06-01 财团法人车辆研究测试中心 Vehicle shift inspection method
TWI452540B (en) * 2010-12-09 2014-09-11 Ind Tech Res Inst Image based detecting system and method for traffic parameters and computer program product thereof
CN102208019B (en) * 2011-06-03 2013-01-09 东南大学 Method for detecting lane change of vehicle based on vehicle-mounted camera
CN102314599A (en) * 2011-10-11 2012-01-11 东华大学 Identification and deviation-detection method for lane
KR101295077B1 (en) * 2011-12-28 2013-08-08 전자부품연구원 Lane Departure Warning System
CN102663744B (en) * 2012-03-22 2015-07-08 杭州电子科技大学 Complex road detection method under gradient point pair constraint
CN104008387B (en) * 2014-05-19 2017-02-15 山东科技大学 Lane line detection method based on feature point piecewise linear fitting
CN104008645B (en) * 2014-06-12 2015-12-09 湖南大学 One is applicable to the prediction of urban road lane line and method for early warning
CN104268860B (en) * 2014-09-17 2017-10-17 电子科技大学 A kind of method for detecting lane lines
CN105320927B (en) * 2015-03-25 2018-11-23 中科院微电子研究所昆山分所 Method for detecting lane lines and system
CN105069415B (en) * 2015-07-24 2018-09-11 深圳市佳信捷技术股份有限公司 Method for detecting lane lines and device
CN105760812B (en) * 2016-01-15 2019-06-07 北京工业大学 A kind of method for detecting lane lines based on Hough transform
CN106384085A (en) * 2016-08-31 2017-02-08 浙江众泰汽车制造有限公司 Calculation method for yaw angle of unmanned vehicle
CN106774328A (en) * 2016-12-26 2017-05-31 广州大学 A kind of automated driving system and method based on road Identification
CN107909007B (en) * 2017-10-27 2019-12-13 上海识加电子科技有限公司 lane line detection method and device
CN107832732B (en) * 2017-11-24 2021-02-26 河南理工大学 Lane line detection method based on treble traversal
CN108052880B (en) * 2017-11-29 2021-09-28 南京大学 Virtual and real lane line detection method for traffic monitoring scene
CN108009524B (en) * 2017-12-25 2021-07-09 西北工业大学 Lane line detection method based on full convolution network
CN108280450B (en) * 2017-12-29 2020-12-29 安徽农业大学 Expressway pavement detection method based on lane lines
CN108519605B (en) * 2018-04-09 2021-09-07 重庆邮电大学 Road edge detection method based on laser radar and camera
CN113792690B (en) * 2018-09-30 2023-06-23 百度在线网络技术(北京)有限公司 Lane line detection method and device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106326822A (en) * 2015-07-07 2017-01-11 北京易车互联信息技术有限公司 Method and device for detecting lane line
CN105741559A (en) * 2016-02-03 2016-07-06 安徽清新互联信息科技有限公司 Emergency vehicle lane illegal occupation detection method based on lane line model

Also Published As

Publication number Publication date
CN113793356A (en) 2021-12-14
CN113792690A (en) 2021-12-14
CN109300139A (en) 2019-02-01
CN113792690B (en) 2023-06-23
CN109300139B (en) 2021-10-15

Similar Documents

Publication Publication Date Title
CN113793356B (en) Lane line detection method and device
CN108710885B (en) Target object detection method and device
US20150294490A1 (en) System and method for relating corresponding points in images with different viewing angles
CN111598913B (en) Image segmentation method and system based on robot vision
CN108492284B (en) Method and apparatus for determining perspective shape of image
CN110909620A (en) Vehicle detection method and device, electronic equipment and storage medium
CN114550117A (en) Image detection method and device
CN110852250B (en) Vehicle weight removing method and device based on maximum area method and storage medium
CN110634155A (en) Target detection method and device based on deep learning
CN110363847B (en) Map model construction method and device based on point cloud data
CN112991388B (en) Line segment feature tracking method based on optical flow tracking prediction and convex geometric distance
CN110634159A (en) Target detection method and device
CN111765892B (en) Positioning method, positioning device, electronic equipment and computer readable storage medium
CN115713560A (en) Camera and vehicle external parameter calibration method and device, electronic equipment and storage medium
CN110852252B (en) Vehicle weight-removing method and device based on minimum distance and maximum length-width ratio
CN110688873A (en) Multi-target tracking method and face recognition method
CN114399532A (en) Camera position and posture determining method and device
CN112785651B (en) Method and apparatus for determining relative pose parameters
CN112868049B (en) Efficient self-motion estimation using patch-based projection correlation
Wang et al. An airlight estimation method for image dehazing based on gray projection
CN111383337B (en) Method and device for identifying objects
CN104700396B (en) The method and system of the parameter for estimating the volume of traffic is determined from image
CN110796698B (en) Vehicle weight removing method and device with maximum area and minimum length-width ratio
CN110826497B (en) Vehicle weight removing method and device based on minimum distance method and storage medium
CN113542800B (en) Video picture scaling method, device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant