CN109300139B - Lane line detection method and device


Info

Publication number: CN109300139B (grant of application CN201811159602.6A; first published as CN109300139A)
Authority: CN (China)
Prior art keywords: edge, parameters, video frame, lane, edges
Legal status: Active (granted)
Other languages: Chinese (zh)
Inventors: 李映辉, 张丙林, 周志鹏, 李冰, 廖瑞华
Current assignee: Apollo Zhilian Beijing Technology Co Ltd
Original assignee: Baidu Online Network Technology Beijing Co Ltd (application filed by Baidu Online Network Technology Beijing Co Ltd)
Related priority applications: CN202111106274.5A (CN113792690B), CN202111105791.0A (CN113793356B), CN201811159602.6A (CN109300139B)

Classifications

    • G06T7/13: Image analysis; Segmentation; Edge detection
    • G06T7/181: Image analysis; Segmentation; Edge detection involving edge growing; involving edge linking
    • G06T7/80: Image analysis; Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The embodiments of the application disclose a lane line detection method and a lane line detection device. One specific embodiment of the lane line detection method includes: detecting edges in a current video frame; determining a set of candidate edges based on the detected edges; fitting each edge in the candidate edge set with a lane imaging model whose parameters have been determined; for each edge in the candidate edge set, calculating the error between the edge and its fitting result; selecting the edges whose calculated error is less than or equal to a predetermined error; and determining the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4. Because the fitting results are obtained from the edges in the candidate edge set, information from multiple edges is used effectively during fitting, which improves the stability of the lane lines determined from the fitting results.

Description

Lane line detection method and device
Technical Field
The present application relates to the field of computer technology, in particular to the field of electronic maps, and more particularly to a lane line detection method and device.
Background
In lane line detection applications, the detected lane lines need to be fitted to obtain the driving parameters of the current road.
At present, multiple lane lines are usually fitted independently with straight lines or polynomials, algorithms such as RANSAC (random sample consensus) are used in the fitting process, and extrinsic camera calibration parameters are required during filtering.
However, current lane line fitting methods have the following problems: (1) each lane line is fitted independently, so information from the other lane lines cannot be used effectively to improve fitting stability; (2) extrinsic camera calibration parameters are needed during filtering, which limits use in settings without extrinsic calibration; (3) inter-frame parameter tracking cannot be carried out effectively; (4) algorithms such as RANSAC consume considerable hardware performance.
Disclosure of Invention
The embodiment of the application provides a lane line detection method and a lane line detection device.
In a first aspect, an embodiment of the present application provides a lane line detection method, including: detecting edges in a current video frame; determining a candidate edge set based on the detected edges; fitting each edge in the candidate edge set with a lane imaging model whose parameters have been determined; for each edge in the candidate edge set, calculating the error between the edge and its fitting result; selecting the edges whose calculated error is less than or equal to a predetermined error; and determining the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
In some embodiments, the parameters of the lane imaging model are determined as follows: in response to the parameters of the lane imaging model of the previous video frame being obtained from a database, the parameters of the lane imaging model of the previous video frame are taken as the parameters of the lane imaging model of the current video frame; in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, each edge in the candidate edge set is fitted in a data-fitting-based parameter determination step, and the parameters of the lane imaging model of the current video frame are determined from that fit.
In some embodiments, fitting the edges in the candidate edge set in the data-fitting-based parameter determination step and determining the parameters of the lane imaging model of the current video frame comprises: for every combination of two edges in the candidate edge set, determining the parameters of one group of lane imaging models using a data fitting method; for the lane imaging model determined by each group of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and taking the group of parameters with the largest count, where that count is greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, fitting each edge in the candidate edge set in the data-fitting-based parameter determination step and determining the parameters of the lane imaging model of the current video frame further comprises: if no group of parameters has the largest count greater than 4, taking the next video frame as the current video frame, determining its candidate edge set based on the detected edges, and determining the parameters of the lane imaging model of the current video frame based on fitting the edges in that candidate edge set.
In some embodiments, determining the parameters of a group of lane imaging models using a data fitting method comprises determining them with at least one of the following data fitting methods: least squares, Hough transform, and maximum a posteriori estimation.
In some embodiments, in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, fitting each edge in the candidate edge set in the data-fitting-based parameter determination step and determining the parameters of the lane imaging model of the current video frame includes: determining the vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame; and using the vanishing point parameters when fitting each edge in the candidate edge set in the data-fitting-based parameter determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, determining the candidate edge set based on the detected edges comprises: determining the candidate edge set based on the number of pixel points included in each detected edge; or determining the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge.
In some embodiments, determining the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge includes: sorting the detected edges by length, from the largest number of included pixel points to the smallest; selecting, in that length order, a predetermined number of edges to add to the candidate edge set; sorting the detected edges by the size of their adjacent blank areas; and selecting, in that blank-area order, a preset number of edges to add to the candidate edge set.
In some embodiments, the lane imaging model comprises: u - u0 = A(v - v0) + B/(v - v0), where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and A, B are model coefficients; in the same frame image, only the value of A differs between different lane lines.
In some embodiments, the lane imaging model comprises: u - u0 = Σ ai(v - v0)^i, where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and ai is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the method further comprises: in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4, taking the next video frame as the current video frame and executing the lane line detection method on the new current video frame.
In a second aspect, an embodiment of the present application provides a lane line detection apparatus, including: an edge detection unit configured to detect edges in a current video frame; a set determination unit configured to determine a candidate edge set based on the detected edges; an edge fitting unit configured to fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; an error calculation unit configured to calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; an edge selection unit configured to select the edges whose calculated error is less than or equal to a predetermined error; and a lane line determination unit configured to determine the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
In some embodiments, the parameters of the lane imaging model in the edge fitting unit are determined as follows: in response to the parameters of the lane imaging model of the previous video frame being obtained from a database, the parameters of the lane imaging model of the previous video frame are taken as the parameters of the lane imaging model of the current video frame; in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, each edge in the candidate edge set is fitted in a data-fitting-based parameter determination step, and the parameters of the lane imaging model of the current video frame are determined from that fit.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit are based further comprises: for every combination of two edges in the candidate edge set, determining the parameters of one group of lane imaging models using a data fitting method; for the lane imaging model determined by each group of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and taking the group of parameters with the largest count, where that count is greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit are based further comprises: if no group of parameters has the largest count greater than 4, taking the next video frame as the current video frame, determining its candidate edge set based on the detected edges, and determining the parameters of the lane imaging model of the current video frame based on fitting the edges in that candidate edge set.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit are based further comprises: determining the parameters of a group of lane imaging models with at least one of the following data fitting methods: least squares, Hough transform, and maximum a posteriori estimation.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit are based further comprises: in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, determining the vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame; and using the vanishing point parameters when fitting each edge in the candidate edge set in the data-fitting-based parameter determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the set determination unit is further configured to: determine the candidate edge set based on the number of pixel points included in each detected edge; or determine the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge.
In some embodiments, the set determination unit is further configured to: sort the detected edges by length, from the largest number of included pixel points to the smallest; select, in that length order, a predetermined number of edges to add to the candidate edge set; sort the detected edges by the size of their adjacent blank areas; and select, in that blank-area order, a preset number of edges to add to the candidate edge set.
In some embodiments, the lane imaging model in the edge fitting unit comprises: u - u0 = A(v - v0) + B/(v - v0), where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and A, B are model coefficients; in the same frame image, only the value of A differs between different lane lines.
In some embodiments, the lane imaging model in the edge fitting unit comprises: u - u0 = Σ ai(v - v0)^i, where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and ai is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the apparatus further comprises: a video frame updating unit configured to take the next video frame as the current video frame and execute the lane line detection method on the new current video frame in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4.
In a third aspect, an embodiment of the present application provides an apparatus, including: one or more processors; and a storage device for storing one or more programs; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the above.
In a fourth aspect, embodiments of the present application provide a computer-readable medium on which a computer program is stored, the program, when executed by a processor, implementing the method as described above.
According to the lane line detection method and device provided by the embodiments of the application, a current video frame is first obtained; a candidate edge set is determined from the edges detected in the current video frame; each edge in the candidate edge set is then fitted with a lane imaging model whose parameters have been determined; the error between each edge and its fitting result is then calculated; finally, for the edges whose calculated error is less than the predetermined error, the lane lines are determined based on their fitting results. In this process, the fitting results are obtained from the edges in the candidate edge set, information from multiple edges is used effectively during fitting, and the stability of the lane lines determined from the fitting results is improved.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a schematic flow chart diagram illustrating one embodiment of a lane line detection method according to an embodiment of the present application;
fig. 3a to 3f are schematic diagrams of an application scenario according to an embodiment of the present application;
FIG. 4 is a schematic flow chart diagram illustrating one embodiment of a method for determining parameters of a lane imaging model for a current video frame in accordance with an embodiment of the present application;
fig. 5 is a schematic structural view of an embodiment of the lane line detecting device of the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the lane line detection method or apparatus of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and servers 105, 106. The network 104 is used to provide a medium for communication links between the terminal devices 101, 102, 103 and the servers 105, 106. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user 110 may use the terminal devices 101, 102, 103 to interact with the servers 105, 106 via the network 104 to receive or send messages or the like. Various communication client applications, such as an electronic map application, a search engine application, a shopping application, an instant messaging tool, a mailbox client, social platform software, a video playing application, etc., may be installed on the terminal devices 101, 102, 103.
The terminal apparatuses 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, e-book readers, MP3 players (Moving Picture Experts Group Audio Layer III, mpeg compression standard Audio Layer 3), MP4 players (Moving Picture Experts Group Audio Layer IV, mpeg compression standard Audio Layer 4), laptop portable computers, desktop computers, and the like. When the terminal apparatuses 101, 102, 103 are software, they can be installed in the electronic apparatuses listed above. It may be implemented as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. And is not particularly limited herein.
The servers 105, 106 may be servers providing various services, such as background servers providing support for the terminal devices 101, 102, 103. The background server can analyze, store or calculate the data submitted by the terminal and push the analysis, storage or calculation result to the terminal device.
It should be noted that, in practice, the lane line detection method provided in the embodiment of the present application may be executed by the terminal devices 101, 102, and 103 or the servers 105 and 106, and the lane line detection apparatus may also be disposed in the terminal devices 101, 102, and 103 or the servers 105 and 106.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, fig. 2 illustrates a flow 200 of one embodiment of a lane line detection method according to the present application. The lane line detection method comprises the following steps:
step 201, detecting an edge in a current video frame.
In this embodiment, the execution subject on which the lane line detection method runs (for example, the terminal or the server shown in fig. 1) may read video captured by a camera of a local or remote electronic device, and take the video frame in the pulled video that needs to be processed for lane line determination as the current video frame.
The execution subject may then detect edges in the current video frame. The purpose of detecting edges in the current video frame is to identify the points in the digital image at which the brightness changes significantly. Significant changes in image attributes typically reflect important events and changes in scene properties, including discontinuities in depth, discontinuities in surface orientation, changes in material properties, and changes in scene illumination.
The method for detecting edges in the current video frame may be any edge detection method from the prior art or from technology developed in the future; the application is not limited in this respect. For example, search-based and zero-crossing-based edge detection methods may be employed.
Search-based edge detection methods first compute a measure of edge strength, usually a first-derivative expression such as the gradient magnitude, then estimate the local orientation of the edge, usually the gradient direction, and use this direction to find the local maximum of the gradient magnitude.
Zero-crossing-based methods locate edges at the zero crossings of a second-derivative expression computed from the image, usually the zero crossings of the Laplacian or of a non-linear differential expression.
Filtering is usually necessary as pre-processing for edge detection, and Gaussian filtering is typically employed.
Edge detection methods apply a metric that computes boundary strength, which is essentially different from smoothing filtering. Since many edge detection methods rely on computing image gradients, they differ mainly in the kinds of filters used to estimate the gradients in the x- and y-directions.
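As an illustration, the following is a minimal Python/OpenCV sketch of such a pipeline: Gaussian filtering as pre-processing, gradient-based (Canny) edge detection, and grouping of edge pixels into individual edges. The kernel size and hysteresis thresholds are assumptions chosen for illustration; the disclosure does not fix specific values.

```python
import cv2

def detect_edges(frame_bgr):
    """Gaussian pre-filtering followed by gradient-based (Canny) edge detection.

    Kernel size and hysteresis thresholds are illustrative assumptions,
    not values specified by this disclosure.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Smoothing as pre-processing suppresses noise before differentiation.
    blurred = cv2.GaussianBlur(gray, (5, 5), 1.5)
    # Canny estimates gradients in the x- and y-directions and keeps local
    # maxima of the gradient magnitude along the gradient direction.
    edge_map = cv2.Canny(blurred, 50, 150)
    # Treat each connected run of edge pixels as one candidate edge.
    contours, _ = cv2.findContours(edge_map, cv2.RETR_LIST,
                                   cv2.CHAIN_APPROX_NONE)
    return [c.reshape(-1, 2) for c in contours]  # arrays of (u, v) pixels
```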
It will be appreciated that cameras used to capture the video generally need to meet installation requirements for the detection to work. For example, the pitch and yaw angles of the camera should be within a range that keeps the vanishing point (the intersection point of the extension lines of parallel edges in the scene) as close to the center of the image as possible, and the roll angle of the camera should not exceed 5 degrees.
Based on the detected edges, a set of candidate edges is determined, step 202.
In this embodiment, based on the edges detected in step 201, the detected edges may be used directly as the candidate edge set, or the detected edges may be filtered and the filtered edges used as the candidate edge set.
In an optional implementation of this embodiment, determining the candidate edge set based on the detected edges may include: determining the candidate edge set based on the number of pixel points included in each detected edge; or determining the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge.
In this implementation, since the number of pixel points included in an edge determines its length, the edges that may be lane lines can be identified by their lengths, and the identified edges added to the candidate edge set.
Considering that, in practical application scenarios, the blank area adjacent to a lane line is generally larger than the blank area adjacent to a non-lane-line edge, the edges likely to be lane lines can additionally be identified by the size of their adjacent blank areas, on top of the identification by length, and the edges identified by the two criteria are both added to the candidate edge set.
In some optional implementations of this embodiment, determining the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge includes: sorting the detected edges by length, from the largest number of included pixel points to the smallest; selecting, in that length order, a predetermined number of edges to add to the candidate edge set; sorting the detected edges by the size of their adjacent blank areas; and selecting, in that blank-area order, a preset number of edges to add to the candidate edge set.
In this implementation, part of the candidate edges are added to the candidate edge set based on the length ordering of the edges, and another part based on the ordering of their adjacent blank areas. The edges in the candidate edge set are the candidate lane line edges. The predetermined number and the preset number may each be set empirically or manually.
In one specific example, the 8 longest edges may be taken as candidate edges, and the 8 edges with the largest adjacent blank areas may also be taken as candidate edges, yielding the candidate edge set; a sketch of this selection follows below. It should be understood that the 8 longest edges may partly coincide with the 8 edges with the largest adjacent blank areas, so the number of edges in the candidate edge set is at least 8 and at most 16.
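A minimal sketch of this selection step, continuing the Python example above: `edges` is a list of pixel-coordinate arrays, and `blank_area` is a caller-supplied scoring function for the blank region adjacent to an edge; that helper is hypothetical, since the disclosure does not fix how the adjacent blank area is measured.

```python
def select_candidates(edges, blank_area, k_longest=8, k_blank=8):
    """Candidate edge set built from the two rankings described above."""
    # Rank by length (number of included pixel points), largest first.
    by_length = sorted(range(len(edges)),
                       key=lambda i: len(edges[i]), reverse=True)
    # Rank by the size of the adjacent blank area, largest first.
    by_blank = sorted(range(len(edges)),
                      key=lambda i: blank_area(edges[i]), reverse=True)
    # The union de-duplicates edges picked by both rankings, so the set
    # holds at least k_longest and at most k_longest + k_blank edges.
    chosen = set(by_length[:k_longest]) | set(by_blank[:k_blank])
    return [edges[i] for i in chosen]
```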
And step 203, fitting each edge in the candidate edge set by using the lane imaging model with the determined parameters.
In this embodiment, the execution subject may fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined. The lane imaging model can generally be implemented as a function that models a lane line, for example a straight-line equation or a polynomial.
Here, the parameters of the lane imaging model may be the parameters of the lane imaging model of a preceding video frame, or parameters determined from a data fit on the current video frame.
In some optional implementations of this embodiment, the parameters of the lane imaging model may be determined as follows: in response to the parameters of the lane imaging model of the previous video frame being obtained from a database, the parameters of the lane imaging model of the previous video frame are taken as the parameters of the lane imaging model of the current video frame; in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, each edge in the candidate edge set is fitted in a data-fitting-based parameter determination step, and the parameters of the lane imaging model of the current video frame are determined from that fit.
In this implementation, because the fitting result is obtained from the multiple edges in the candidate edge set, the method adapts widely; and because consecutive video frames are continuous, the parameters of the lane imaging model of the previous video frame can be reused when fitting the edges of the current video frame.
In some optional implementations of this embodiment, in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, fitting each edge in the candidate edge set in the data-fitting-based parameter determination step and determining the parameters of the lane imaging model of the current video frame includes: determining the vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame; and using the vanishing point parameters when fitting each edge in the candidate edge set in the data-fitting-based parameter determination step.
In this implementation, if extrinsic parameters of the calibrated camera that captured the video frame are available, the vanishing point parameters of the lane imaging model can be determined from them, which reduces the computation needed to determine the parameters of the lane imaging model and improves the efficiency of determining them.
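The disclosure does not spell out this computation. One standard construction, assumed here purely for illustration, projects the road direction through the calibrated camera: the vanishing point of 3-D lines with direction d is K·R·d up to scale, where K is the intrinsic matrix and R rotates world (road) coordinates into camera coordinates.

```python
import numpy as np

def vanishing_point(K, R):
    """Vanishing point of lines parallel to the road direction (a sketch).

    Standard projective geometry, assumed as the intended computation:
    the image of the point at infinity in direction d is K @ R @ d.
    """
    d_road = np.array([0.0, 0.0, 1.0])  # road direction in world coordinates
    p = K @ (R @ d_road)
    return p[0] / p[2], p[1] / p[2]     # (u0, v0)
```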
In some optional implementations of this embodiment, the lane imaging model includes: u - u0 = A(v - v0) + B/(v - v0), where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and A, B are model coefficients; in the same frame image, only the value of A differs between different lane lines.
In this implementation, the lane imaging model can model straight lanes and curves simultaneously on the image frame, which improves the accuracy of the lane imaging model, gives it wide adaptability, and facilitates inter-frame tracking. In one specific example, the candidate edge set includes k groups of lane line feature points; when the value of v0 and the variances of the other parameters are known, lane line fitting reduces to weighted least squares fitting, and least squares is an efficient data fitting method.
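For instance, with the vanishing point (u0, v0) fixed, each edge contributes one linear least-squares problem in A and B. A minimal sketch follows (unweighted for brevity; a weighted variant would only rescale the rows):

```python
import numpy as np

def fit_hyperbolic(edge_uv, u0, v0):
    """Least-squares fit of u - u0 = A*(v - v0) + B/(v - v0) for one edge.

    Rows too close to the horizon v = v0 are dropped to avoid the
    singularity of the 1/(v - v0) term.
    """
    u = edge_uv[:, 0].astype(float)
    v = edge_uv[:, 1].astype(float)
    dv = v - v0
    keep = np.abs(dv) > 1e-3
    u, dv = u[keep], dv[keep]
    X = np.column_stack([dv, 1.0 / dv])   # columns: (v - v0), 1/(v - v0)
    coef, *_ = np.linalg.lstsq(X, u - u0, rcond=None)
    return coef[0], coef[1]               # A, B
```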
In some optional implementations of this embodiment, the lane imaging model includes: u - u0 = Σ ai(v - v0)^i, where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and ai is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In this implementation, the hyperbolic model is u - u0 = A(v - v0) + B/(v - v0). Adopting the Taylor series expansion of the hyperbolic model removes the first-order parameter; straight lanes and curves can still be modeled simultaneously on the image frame, the accuracy of the lane imaging model is likewise improved, the model adapts widely, and inter-frame tracking is facilitated.
Step 204, for each edge in the candidate edge set, calculating the error between the fitting result of the edge and the edge.
In this embodiment, for each edge in the candidate edge set, the execution subject may calculate the error between the edge and the result of fitting it with the lane imaging model whose parameters have been determined. Here, the error may be the sum of residuals, the sum of absolute residuals, or the sum of squared residuals.
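Continuing the sketch above, the sum of squared residuals for one edge against its fitted curve could be computed as follows:

```python
import numpy as np

def fit_error(edge_uv, u0, v0, A, B):
    """Sum of squared residuals between an edge and its fitted curve."""
    u = edge_uv[:, 0].astype(float)
    v = edge_uv[:, 1].astype(float)
    dv = v - v0
    keep = np.abs(dv) > 1e-3
    res = (u[keep] - u0) - (A * dv[keep] + B / dv[keep])
    return float(np.sum(res ** 2))
```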
In step 205, edges with calculated errors less than or equal to the predetermined error are selected.
In this embodiment, a calculated error less than or equal to the predetermined error indicates that the fitting result conforms to an actual lane line, so the fitting result of the edge can be taken as an estimated lane line edge, from which the lane line is then determined.
In response to the number of the selected edges being greater than or equal to 4, a lane line is determined based on the fitting result of the selected edges, step 206.
In this embodiment, considering that one lane has two lane lines and each lane line includes two edges, a selected-edge count greater than or equal to 4 indicates that the current image frame contains at least one lane. The lane lines can then be determined from the fitting results of the selected edges; when determining the lane lines from the lane line edges, the selected edges can be weighed against the lane width, the position of the lane center line, and the like to determine the final lane lines.
In some optional implementations of this embodiment, the lane line detection method further includes: in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4, taking the next video frame as the current video frame and executing the lane line detection method on the new current video frame.
In this implementation, if the number of edges whose calculated error is less than or equal to the predetermined error is smaller than 4, the edges detected in the current video frame do not include a complete lane (for example, an image captured while vehicles merge), so the next video frame can be taken as the current video frame and the lane line detection method described above executed on the new current video frame to determine the lane lines.
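Putting steps 201 to 206 together with this frame-advance branch, a sketch of the overall loop, composing the helper functions from the earlier sketches (the error threshold `max_err` is an illustrative assumption, not a value from the disclosure):

```python
def detect_lane_lines(frames, u0, v0, blank_area, max_err=4.0):
    """Overall detection loop (steps 201-206), composing the sketches above."""
    for frame in frames:
        candidates = select_candidates(detect_edges(frame), blank_area)
        fits = [(edge, fit_hyperbolic(edge, u0, v0)) for edge in candidates]
        selected = [(edge, (A, B)) for edge, (A, B) in fits
                    if fit_error(edge, u0, v0, A, B) <= max_err]
        if len(selected) >= 4:
            # Lane lines are determined from these fitting results,
            # e.g. by pairing edges using lane width and center line.
            yield frame, selected
        # Fewer than 4 selected edges: no complete lane in this frame;
        # the loop simply advances to the next frame.
```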
An exemplary application scenario of the lane line detection method of the present application is described below with reference to figs. 3a to 3e, which show a schematic flow of the scenario.
As shown in fig. 3a, the lane line detection method 300 is executed in the electronic device 310, and may include:
firstly, edges 302 are detected in the current video frame 301, giving the edges in the current video frame shown in fig. 3a;
then, a predetermined number of edges 303, taken in order from the largest number of included pixel points to the smallest among the detected edges 302, are selected and added to the candidate edge set 305, giving the candidate edge set shown in fig. 3b;
then, a preset number of edges 304, taken in order from the largest adjacent blank area to the smallest among the detected edges 302, are selected and added to the candidate edge set 305, giving the candidate edge set shown in fig. 3c;
then, each edge in the candidate edge set 305 is fitted with the lane imaging model 306 whose parameters have been determined;
then, for each edge in the candidate edge set 305, the error 307 between the edge and its fitting result is calculated;
then, the edges whose calculated error 307 is less than or equal to the predetermined error 308 are selected, giving the selected edges 309;
then, in response to the number of selected edges 309 being greater than or equal to 4, the lane line 311 is determined based on the fitting results 310 of the selected edges (shown in fig. 3d), giving the lane line shown in fig. 3e.
It should be understood that the application scenario of the lane line detection method shown in figs. 3a to 3e is only an exemplary description of the method and does not limit it. For example, the steps shown above may be implemented in further detail.
In the lane line detection method according to the embodiment of the present application, edges in the current video frame are detected first; a candidate edge set is then determined from the detected edges; each edge in the candidate edge set is fitted with a lane imaging model whose parameters have been determined; the error between each edge and its fitting result is calculated; the edges whose calculated error is less than or equal to the predetermined error are selected; finally, in response to the number of selected edges being greater than or equal to 4, the lane lines are determined based on the fitting results of the selected edges. Because multiple edges in the candidate edge set are used for fitting, the stability of the fitting result is improved, the accuracy of the lane imaging model is improved, the model adapts widely, and inter-frame tracking is facilitated. In addition, extrinsic camera calibration parameters need not be considered during filtering, so the usage scenarios are not limited.
Referring to FIG. 4, a flow diagram of one embodiment of a method of determining parameters of a lane imaging model for a current video frame is shown, in accordance with the present application.
As shown in fig. 4, a flowchart 400 of the method for determining parameters of a lane imaging model of a current video frame according to the present embodiment may include the following steps:
step 401, for each two edge combinations in the candidate edge set, determining parameters of a set of lane imaging models by using a data fitting method.
In this implementation, each combination of two edges in the candidate edge set may be substituted into the lane imaging model with unknown parameters, the unknown parameters solved for, and the parameters of one group of lane imaging models obtained.
Data fitting, also called curve fitting, is the substitution of existing data into a mathematical expression by mathematical methods. In science and engineering, several discrete data points are obtained by sampling, experiment, and similar means, and from these one often wants a continuous function (that is, a curve), or a denser discrete equation, that fits the known data; this process is called fitting.
In some optional implementations of this embodiment, determining the parameters of a group of lane imaging models using a data fitting method comprises determining them with at least one of the following data fitting methods: least squares, Hough transform, and maximum a posteriori estimation.
In this implementation, when a linear model is used to fit data, the amount of data is generally larger than the number of unknowns in the system of equations, yielding an overdetermined system whose equations may be mutually incompatible, so that the system has no exact solution. The least squares method solves for the optimal solution of the overdetermined system under the constraint of minimum squared error; weighting can address the heteroscedasticity of least squares. Solving for the model parameters by maximum likelihood searches the parameter space for the parameter point under which the feature point set is most probable.
Unlike the Hough transform, which votes from the feature points into the parameter space, maximum a posteriori estimation is a matching process from the parameter space to the feature point set. Illustratively, the data fitting method may include maximum a posteriori estimation realized on the basis of least squares.
And step 402, determining the number of lines of which the error between the fitting result of each edge line in the candidate edge set and the edge line is smaller than the preset error based on the lane imaging model determined by the parameters of each group of lane imaging models.
In this embodiment, since each group of lane imaging model parameters is determined from a combination of two edges, different groups of parameters apply to different numbers of the candidate edges. To determine the optimal parameters of the lane imaging model, the number of edges to which each group applies is counted, and the count then shows which group of parameters is the more widely applicable.
And step 403, determining the parameters of the lane imaging model with the maximum number of the determined lines and the number of the determined lines larger than 4 as the parameters of the lane imaging model of the current video frame.
In this embodiment, requiring the largest count ensures the widest applicability of the lane imaging model, and requiring the count to be greater than 4 ensures that the parameters fit at least the 4 edges included in one lane.
In some optional implementations of this embodiment, fitting each edge in the candidate edge set in the data-fitting-based parameter determination step and determining the parameters of the lane imaging model of the current video frame further includes: if no group of parameters has the largest count greater than 4, taking the next video frame as the current video frame, determining its candidate edge set based on the detected edges, and determining the parameters of the lane imaging model of the current video frame based on fitting the edges in that candidate edge set.
In this implementation, if no group of parameters has a count greater than 4, no group of parameters applies to the four edges included in one lane, and no complete lane exists in the current video frame. The lane lines are therefore determined from the next video frame.
In the method for determining the parameters of the lane imaging model of the current video frame according to this embodiment, a group of model parameters is determined by data fitting for every combination of two edges in the candidate edge set; for the lane imaging model determined by each group of parameters, the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error is counted; and the group of parameters with the largest count, greater than 4, is taken as the parameters of the lane imaging model of the current video frame. In this process, the parameters of the lane imaging model that fit the most edges are selected, improving the applicability of the determined parameters. A sketch of this pairwise search follows below.
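A minimal sketch of steps 401 to 403, using nonlinear least squares as the data fitting step (one possible choice among those named above). For every pair of candidate edges it jointly estimates the shared parameters (u0, v0, B) plus one slope A per edge, then counts how many candidate edges those shared parameters explain within `max_err`. Both `max_err` and the image-center initial guess are illustrative assumptions.

```python
from itertools import combinations

import numpy as np
from scipy.optimize import least_squares

def search_model_params(candidates, max_err=4.0, image_size=(1280, 720)):
    """Pairwise parameter search of steps 401-403 (a sketch)."""

    def pair_residuals(x, pair):
        u0, v0, B, A1, A2 = x
        res = []
        for A, edge in zip((A1, A2), pair):
            u = edge[:, 0].astype(float)
            dv = edge[:, 1].astype(float) - v0
            dv = np.where(np.abs(dv) < 1e-3, 1e-3, dv)  # guard the horizon
            res.append((u - u0) - (A * dv + B / dv))
        return np.concatenate(res)

    def edge_error(edge, u0, v0, B):
        # With (u0, v0, B) fixed, each edge's A has a closed-form 1-D fit.
        u = edge[:, 0].astype(float)
        dv = edge[:, 1].astype(float) - v0
        keep = np.abs(dv) > 1e-3
        u, dv = u[keep], dv[keep]
        if dv.size == 0:
            return float("inf")
        target = (u - u0) - B / dv
        A = dv.dot(target) / dv.dot(dv)
        return float(np.sum((target - A * dv) ** 2))

    best, best_count = None, 0
    for pair in combinations(candidates, 2):
        x0 = np.array([image_size[0] / 2, image_size[1] / 2, 0.0, 0.0, 0.0])
        sol = least_squares(pair_residuals, x0, args=(pair,))
        u0, v0, B = sol.x[:3]
        count = sum(edge_error(e, u0, v0, B) <= max_err for e in candidates)
        if count > 4 and count > best_count:
            best, best_count = (u0, v0, B), count
    return best  # None when no group explains more than 4 edges
```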
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, an embodiment of the present application provides an embodiment of a lane line detection apparatus, which corresponds to the method embodiments shown in fig. 2 to fig. 4, and which can be applied to various electronic devices.
As shown in fig. 5, the lane line detection apparatus 500 of this embodiment may include: an edge detection unit 510 configured to detect edges in a current video frame; a set determination unit 520 configured to determine a candidate edge set based on the detected edges; an edge fitting unit 530 configured to fit each edge in the candidate edge set with a lane imaging model whose parameters have been determined; an error calculation unit 540 configured to calculate, for each edge in the candidate edge set, the error between the edge and its fitting result; an edge selection unit 550 configured to select the edges whose calculated error is less than or equal to a predetermined error; and a lane line determination unit 560 configured to determine the lane lines based on the fitting results of the selected edges in response to the number of selected edges being greater than or equal to 4.
In some embodiments, the parameters of the lane imaging model in the edge fitting unit 530 are determined as follows: in response to the parameters of the lane imaging model of the previous video frame being obtained from a database, the parameters of the lane imaging model of the previous video frame are taken as the parameters of the lane imaging model of the current video frame; in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, each edge in the candidate edge set is fitted in a data-fitting-based parameter determination step, and the parameters of the lane imaging model of the current video frame are determined from that fit.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit 530 are based further comprises: for every combination of two edges in the candidate edge set, determining the parameters of one group of lane imaging models using a data fitting method; for the lane imaging model determined by each group of parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error; and taking the group of parameters with the largest count, where that count is greater than 4, as the parameters of the lane imaging model of the current video frame.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit 530 are based further comprises: if no group of parameters has the largest count greater than 4, taking the next video frame as the current video frame, determining its candidate edge set based on the detected edges, and determining the parameters of the lane imaging model of the current video frame based on fitting the edges in that candidate edge set.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit 530 are based further comprises: determining the parameters of a group of lane imaging models with at least one of the following data fitting methods: least squares, Hough transform, and maximum a posteriori estimation.
In some embodiments, the determination step on which the parameters of the lane imaging model in the edge fitting unit 530 are based further comprises: in response to the parameters of the lane imaging model of the previous video frame not being obtained from the database, determining the vanishing point parameters of the lane imaging model of the current video frame based on the extrinsic parameters of the calibrated camera that captured the video frame; and using the vanishing point parameters when fitting each edge in the candidate edge set in the data-fitting-based parameter determination step to determine the parameters of the lane imaging model of the current video frame.
In some embodiments, the set determination unit 520 is further configured to: determine the candidate edge set based on the number of pixel points included in each detected edge; or determine the candidate edge set based on the number of pixel points included in each detected edge and on the blank area adjacent to each detected edge.
In some embodiments, the set determination unit 520 is further configured to: sort the detected edges by length, from the largest number of included pixel points to the smallest; select, in that length order, a predetermined number of edges to add to the candidate edge set; sort the detected edges by the size of their adjacent blank areas; and select, in that blank-area order, a preset number of edges to add to the candidate edge set.
In some embodiments, the lane imaging model in the edge fitting unit 530 includes: u - u0 = A(v - v0) + B/(v - v0), where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and A, B are model coefficients; in the same frame image, only the value of A differs between different lane lines.
In some embodiments, the lane imaging model in the edge fitting unit 530 includes: u - u0 = Σ ai(v - v0)^i, where (u0, v0) is the vanishing point position of the image (v = v0 is the horizon line), (u, v) is a coordinate point of an edge in the current video frame, and ai is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
In some embodiments, the apparatus further comprises: a video frame updating unit 570 configured to take the next video frame as the current video frame and execute the lane line detection method on the new current video frame in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4.
It should be understood that the elements recited in apparatus 500 may correspond to various steps in the methods described with reference to fig. 2-4. Thus, the operations and features described above for the method are equally applicable to the apparatus 500 and the units included therein, and are not described in detail here.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a server according to embodiments of the present application. The terminal device or the server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the internet. The driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 610 as necessary, so that a computer program read out therefrom is mounted in the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising an edge detection unit, a set determination unit, an edge fitting unit, an error calculation unit, an edge selection unit, and a lane line determination unit. The names of these units do not, in some cases, limit the units themselves; for example, the edge detection unit may also be described as "a unit that detects an edge in the current video frame".
As another aspect, the present application further provides a computer readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: detect an edge in a current video frame; determine a candidate edge set based on the detected edges; fit each edge in the candidate edge set using a lane imaging model with determined parameters; for each edge in the candidate edge set, calculate the error between the edge and the fitting result of the edge; select edges whose calculated error is less than or equal to a predetermined error; and, in response to the number of the selected edges being greater than or equal to 4, determine lane lines based on the fitting results of the selected edges.
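For illustration only (and not as part of the claimed subject matter), the flow just enumerated can be sketched as a short Python routine. The callables detect_edges, select_candidates, fit_edge, and fit_error are hypothetical stand-ins for the edge detection, candidate set determination, model fitting, and error calculation steps; none of these names appear in the original disclosure.

```python
def detect_lane_lines(frame, model_params, detect_edges, select_candidates,
                      fit_edge, fit_error, max_error=2.0):
    """One pass of the detection flow described above (illustrative sketch only)."""
    candidates = select_candidates(detect_edges(frame))   # detect edges, build candidate set
    fits = [fit_edge(edge, model_params) for edge in candidates]  # fit every candidate edge
    # keep the fitting results whose error against their edge is within the predetermined error
    selected = [fit for edge, fit in zip(candidates, fits)
                if fit_error(edge, fit) <= max_error]
    # at least 4 well-fitted edges are required before lane lines are determined
    return selected if len(selected) >= 4 else None       # None: move on to the next frame
```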
The above description is only a preferred embodiment of the present application and an illustration of the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention disclosed herein is not limited to the particular combination of the above features, and also covers other arrangements formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, arrangements in which the above features are replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (22)

1. A lane line detection method, comprising:
detecting an edge in a current video frame;
determining a set of candidate edges based on the detected edges;
fitting each edge in the candidate edge set using a lane imaging model with determined parameters;
wherein the parameters of the lane imaging model are determined by the following steps:
in response to the parameters of the lane imaging model of a previous video frame being acquired from a database, determining the parameters of the lane imaging model of the previous video frame as the parameters of the lane imaging model of the current video frame;
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, fitting each edge in the candidate edge set by a data-fitting-based parameter determination step, to determine the parameters of the lane imaging model of the current video frame;
for each edge in the candidate edge set, calculating the error between the edge and the fitting result of the edge;
selecting edges whose calculated error is less than or equal to a predetermined error;
and, in response to the number of the selected edges being greater than or equal to 4, determining lane lines based on the fitting results of the selected edges.
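By way of a non-limiting sketch, the parameter determination recited above can be pictured with a plain dictionary standing in for the database and a hypothetical fit_from_edges callable performing the data-fitting-based step; neither detail is prescribed by the claim.

```python
def model_params_for_frame(frame_index, param_db, candidate_edges, fit_from_edges):
    """Reuse the previous frame's lane imaging model parameters when available;
    otherwise determine them by data fitting over the candidate edge set."""
    params = param_db.get(frame_index - 1)      # parameters of the previous video frame
    if params is None:                          # not acquired from the database
        params = fit_from_edges(candidate_edges)
    param_db[frame_index] = params              # cache for the following frame
    return params
```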
2. The method of claim 1, wherein the fitting, by the data-fitting-based parameter determination step, of each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame comprises:
for every combination of two edges in the candidate edge set, determining a set of lane imaging model parameters by a data fitting method;
for the lane imaging model determined by each set of lane imaging model parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error;
and determining, as the parameters of the lane imaging model of the current video frame, the set of parameters for which the counted number of edges is the largest and greater than 4.
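The pairwise search of claim 2 resembles a RANSAC-style vote: each two-edge combination proposes a parameter set, and the proposal supported by the most edges wins. The sketch below assumes hypothetical helpers fit_pair (fitting one parameter set to two edges) and edge_error (error of one edge under a parameter set); neither name comes from the disclosure.

```python
from itertools import combinations

def params_by_pairwise_fitting(candidate_edges, fit_pair, edge_error, max_error=2.0):
    """Fit a parameter set to every two-edge combination, count the edges that
    fit it within max_error, and keep the best-supported parameter set."""
    best_params, best_count = None, 0
    for edge_a, edge_b in combinations(candidate_edges, 2):
        params = fit_pair(edge_a, edge_b)
        count = sum(1 for edge in candidate_edges
                    if edge_error(edge, params) < max_error)
        if count > best_count:
            best_params, best_count = params, count
    return best_params if best_count > 4 else None   # claim 2 requires more than 4 edges
```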
3. The method of claim 2, wherein the fitting, by the data-fitting-based parameter determination step, of each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame further comprises:
if no set of lane imaging model parameters exists for which the counted number of edges is the largest and greater than 4, taking the next video frame as the current video frame, performing the step of determining a candidate edge set based on the detected edges on the new current video frame, and determining the parameters of the lane imaging model of the current video frame based on fitting each edge in that candidate edge set.
4. The method of claim 2, wherein the determining a set of lane imaging model parameters by a data fitting method comprises:
determining the set of lane imaging model parameters by at least one of the following data fitting methods: least squares, the Hough transform, and maximum a posteriori estimation.
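As one concrete possibility among the listed methods, a least-squares fit of the A and B coefficients of the hyperbolic model of claim 8 (assuming the vanishing point (u_0, v_0) is already known) might look like the following sketch; it is not code from the disclosure.

```python
import numpy as np

def fit_hyperbolic_least_squares(points, u0, v0):
    """points: (N, 2) array of (u, v) edge pixels. Solves
    u - u0 = A*(v - v0) + B/(v - v0) for A and B in the least-squares sense."""
    du = points[:, 0] - u0
    dv = points[:, 1] - v0                    # rows at v0 itself must be excluded
    design = np.column_stack([dv, 1.0 / dv])  # design matrix [v - v0, 1/(v - v0)]
    (A, B), *_ = np.linalg.lstsq(design, du, rcond=None)
    return A, B
```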
5. The method of claim 2, wherein the fitting, by the data-fitting-based parameter determination step, of each edge in the candidate edge set in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, to determine the parameters of the lane imaging model of the current video frame, comprises:
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, determining the vanishing point parameters of the lane imaging model of the current video frame based on the calibrated extrinsic parameters of the camera that captured the video frame;
and using the determined vanishing point parameters when the data-fitting-based parameter determination step fits each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
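Claim 5 ties the vanishing point to the calibrated camera. Under the common assumption that lane lines run along the world forward axis, the vanishing point is the image of that direction (a point at infinity) under the intrinsic matrix K and the world-to-camera rotation R; the sketch below makes exactly that assumption and is not taken from the disclosure.

```python
import numpy as np

def vanishing_point_from_extrinsics(K, R, lane_dir=(0.0, 0.0, 1.0)):
    """K: 3x3 intrinsic matrix; R: 3x3 world-to-camera rotation.
    Projects the lane direction, a point at infinity, to pixel coordinates."""
    p = K @ (R @ np.asarray(lane_dir))   # homogeneous image coordinates of the direction
    return p[0] / p[2], p[1] / p[2]      # (u0, v0)
```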
6. The method of claim 1, wherein the determining a candidate edge set based on the detected edges comprises:
determining the candidate edge set based on the number of pixel points included in each of the detected edges; or
determining the candidate edge set based on the number of pixel points included in each of the detected edges and the blank area adjacent to each of the detected edges.
7. The method of claim 6, wherein the determining the candidate edge set based on the number of pixel points included in each of the detected edges and the blank area adjacent to each of the detected edges comprises:
sorting the detected edges by length, from the largest number of included pixel points to the smallest, to obtain length-sorted edges;
selecting a preset number of edges, in the order of the length-sorted edges, to add to the candidate edge set;
sorting the detected edges by the size of the blank area adjacent to each edge, to obtain blank-area-sorted edges;
and selecting a preset number of edges, in the order of the blank-area-sorted edges, to add to the candidate edge set.
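A compact sketch of the two selection passes of claim 7, offered purely as illustration: here edges is taken to be a list of pixel-point lists and blank_areas a parallel list of adjacent-blank-area sizes, both hypothetical representations.

```python
def select_candidate_edges(edges, blank_areas, preset_count=8):
    """Union of the longest edges and the edges with the largest adjacent blank area."""
    by_length = sorted(range(len(edges)), key=lambda i: len(edges[i]), reverse=True)
    by_blank = sorted(range(len(edges)), key=lambda i: blank_areas[i], reverse=True)
    chosen = set(by_length[:preset_count]) | set(by_blank[:preset_count])
    return [edges[i] for i in sorted(chosen)]
```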
8. The method of claim 1, wherein the lane imaging model comprises:
u - u_0 = A(v - v_0) + B/(v - v_0), where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; within the same frame image, different lane lines differ only in the value of A.
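Evaluating this model for a column of row coordinates is direct; the helper below is an illustrative sketch, not code from the disclosure.

```python
import numpy as np

def hyperbolic_lane_u(v, A, B, u0, v0):
    """u coordinate predicted by u - u0 = A*(v - v0) + B/(v - v0);
    valid for rows v strictly below the vanishing point row v0."""
    dv = np.asarray(v, dtype=float) - v0
    return u0 + A * dv + B / dv
```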
9. The method of claim 1, wherein the lane imaging model comprises: u - u_0 = Σ_i a_i (v - v_0)^i, where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and a_i is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
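The truncated-series form of claim 9 evaluates the same way with a polynomial in (v - v_0); again a sketch, under the assumption that coeffs[i] stores the coefficient a_i:

```python
import numpy as np

def series_lane_u(v, coeffs, u0, v0):
    """u coordinate predicted by u - u0 = sum_i a_i * (v - v0)**i,
    with coeffs ordered from the 0th-order term upward."""
    dv = np.asarray(v, dtype=float) - v0
    return u0 + sum(a * dv**i for i, a in enumerate(coeffs))
```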
10. The method of claim 1, wherein the method further comprises:
and in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4, taking the next video frame as the current video frame, and performing the lane line detection method on the new current video frame.
11. A lane line detection apparatus comprising:
an edge detection unit configured to detect an edge in a current video frame;
a set determination unit configured to determine a candidate edge set based on the detected edges;
an edge fitting unit configured to fit each edge in the candidate edge set using a lane imaging model with determined parameters;
wherein the parameters of the lane imaging model in the edge fitting unit are determined by the following steps:
in response to the parameters of the lane imaging model of a previous video frame being acquired from a database, determining the parameters of the lane imaging model of the previous video frame as the parameters of the lane imaging model of the current video frame;
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, fitting each edge in the candidate edge set by a data-fitting-based parameter determination step, to determine the parameters of the lane imaging model of the current video frame;
an error calculation unit configured to calculate, for each edge in the candidate edge set, the error between the edge and the fitting result of the edge;
an edge selection unit configured to select edges whose calculated error is less than or equal to a predetermined error;
and a lane line determination unit configured to determine lane lines based on the fitting results of the selected edges, in response to the number of the selected edges being greater than or equal to 4.
12. The apparatus of claim 11, wherein the determining steps on which the parameters of the lane imaging model in the edge fitting unit are based further comprise:
for every combination of two edges in the candidate edge set, determining a set of lane imaging model parameters by a data fitting method;
for the lane imaging model determined by each set of lane imaging model parameters, counting the number of edges in the candidate edge set whose error between the edge and its fitting result is smaller than the predetermined error;
and determining, as the parameters of the lane imaging model of the current video frame, the set of parameters for which the counted number of edges is the largest and greater than 4.
13. The apparatus of claim 12, wherein the determining steps on which the parameters of the lane imaging model in the edge fitting unit are based further comprise:
if no set of lane imaging model parameters exists for which the counted number of edges is the largest and greater than 4, taking the next video frame as the current video frame, performing the step of determining a candidate edge set based on the detected edges on the new current video frame, and determining the parameters of the lane imaging model of the current video frame based on fitting each edge in that candidate edge set.
14. The apparatus of claim 12, wherein the determining steps on which the parameters of the lane imaging model in the edge fitting unit are based further comprise:
determining the set of lane imaging model parameters by at least one of the following data fitting methods: least squares, the Hough transform, and maximum a posteriori estimation.
15. The apparatus of claim 12, wherein the determining steps on which the parameters of the lane imaging model in the edge fitting unit are based further comprise:
in response to the parameters of the lane imaging model of the previous video frame not being acquired from the database, determining the vanishing point parameters of the lane imaging model of the current video frame based on the calibrated extrinsic parameters of the camera that captured the video frame;
and using the determined vanishing point parameters when the data-fitting-based parameter determination step fits each edge in the candidate edge set to determine the parameters of the lane imaging model of the current video frame.
16. The apparatus of claim 11, wherein the set determination unit is further configured to:
determining the candidate edge set based on the number of pixel points included in each of the detected edges; or
determining the candidate edge set based on the number of pixel points included in each of the detected edges and the blank area adjacent to each of the detected edges.
17. The apparatus of claim 16, wherein the set determination unit is further configured to:
sorting the detected edges by length, from the largest number of included pixel points to the smallest, to obtain length-sorted edges;
selecting a preset number of edges, in the order of the length-sorted edges, to add to the candidate edge set;
sorting the detected edges by the size of the blank area adjacent to each edge, to obtain blank-area-sorted edges;
and selecting a preset number of edges, in the order of the blank-area-sorted edges, to add to the candidate edge set.
18. The apparatus of claim 11, wherein the lane imaging model in the edge fitting unit comprises:
u - u_0 = A(v - v_0) + B/(v - v_0), where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and A and B are model coefficients; within the same frame image, different lane lines differ only in the value of A.
19. The apparatus of claim 11, wherein the lane imaging model in the edge fitting unit comprises: u - u_0 = Σ_i a_i (v - v_0)^i, where (u_0, v_0) is the vanishing point position of the image, (u, v) is a coordinate point of an edge in the current video frame, and a_i is the coefficient of the i-th term of the Taylor series expansion of the hyperbolic model.
20. The apparatus of claim 11, wherein the apparatus further comprises:
and a video frame updating unit configured to, in response to the number of edges whose calculated error is less than or equal to the predetermined error being smaller than 4, take the next video frame as the current video frame and perform the lane line detection method on the new current video frame.
21. A server, comprising:
one or more processors;
storage means for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-10.
22. A computer-readable medium, on which a computer program is stored which, when executed by a processor, carries out the method according to any one of claims 1-10.
CN201811159602.6A 2018-09-30 2018-09-30 Lane line detection method and device Active CN109300139B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202111106274.5A CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111105791.0A CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN201811159602.6A CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811159602.6A CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202111105791.0A Division CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111106274.5A Division CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Publications (2)

Publication Number Publication Date
CN109300139A CN109300139A (en) 2019-02-01
CN109300139B true CN109300139B (en) 2021-10-15

Family

ID=65161420

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201811159602.6A Active CN109300139B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111106274.5A Active CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111105791.0A Active CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202111106274.5A Active CN113792690B (en) 2018-09-30 2018-09-30 Lane line detection method and device
CN202111105791.0A Active CN113793356B (en) 2018-09-30 2018-09-30 Lane line detection method and device

Country Status (1)

Country Link
CN (3) CN109300139B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109300139B (en) * 2018-09-30 2021-10-15 百度在线网络技术(北京)有限公司 Lane line detection method and device
CN109934169A (en) * 2019-03-13 2019-06-25 东软睿驰汽车技术(沈阳)有限公司 A kind of Lane detection method and device
CN112050821B (en) * 2020-09-11 2021-08-20 湖北亿咖通科技有限公司 Lane line polymerization method
CN112560680A (en) * 2020-12-16 2021-03-26 北京百度网讯科技有限公司 Lane line processing method and device, electronic device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104008387A (en) * 2014-05-19 2014-08-27 山东科技大学 Lane line detection method based on feature point piecewise linear fitting
CN105069415A (en) * 2015-07-24 2015-11-18 深圳市佳信捷技术股份有限公司 Lane line detection method and device
CN105760812A (en) * 2016-01-15 2016-07-13 北京工业大学 Hough transform-based lane line detection method
CN106326822A (en) * 2015-07-07 2017-01-11 北京易车互联信息技术有限公司 Method and device for detecting lane line
CN106384085A (en) * 2016-08-31 2017-02-08 浙江众泰汽车制造有限公司 Calculation method for yaw angle of unmanned vehicle
CN107832732A (en) * 2017-11-24 2018-03-23 河南理工大学 Method for detecting lane lines based on ternary tree traversal

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2735033B2 (en) * 1995-06-05 1998-04-02 日本電気株式会社 Lane change detection apparatus and method
US6819779B1 (en) * 2000-11-22 2004-11-16 Cognex Corporation Lane detection system and apparatus
US7409092B2 (en) * 2002-06-20 2008-08-05 Hrl Laboratories, Llc Method and apparatus for the surveillance of objects in images
CN101470801B (en) * 2007-12-24 2011-06-01 财团法人车辆研究测试中心 Vehicle shift inspection method
TWI452540B (en) * 2010-12-09 2014-09-11 Ind Tech Res Inst Image based detecting system and method for traffic parameters and computer program product thereof
CN102208019B (en) * 2011-06-03 2013-01-09 东南大学 Method for detecting lane change of vehicle based on vehicle-mounted camera
CN102314599A (en) * 2011-10-11 2012-01-11 东华大学 Identification and deviation-detection method for lane
KR101295077B1 (en) * 2011-12-28 2013-08-08 전자부품연구원 Lane Departure Warning System
CN102663744B (en) * 2012-03-22 2015-07-08 杭州电子科技大学 Complex road detection method under gradient point pair constraint
CN104008645B (en) * 2014-06-12 2015-12-09 湖南大学 One is applicable to the prediction of urban road lane line and method for early warning
CN104268860B (en) * 2014-09-17 2017-10-17 电子科技大学 A kind of method for detecting lane lines
CN105320927B (en) * 2015-03-25 2018-11-23 中科院微电子研究所昆山分所 Method for detecting lane lines and system
CN105741559B (en) * 2016-02-03 2018-08-31 安徽清新互联信息科技有限公司 A kind of illegal occupancy Emergency Vehicle Lane detection method based on track line model
CN106774328A (en) * 2016-12-26 2017-05-31 广州大学 A kind of automated driving system and method based on road Identification
CN107909007B (en) * 2017-10-27 2019-12-13 上海识加电子科技有限公司 lane line detection method and device
CN108052880B (en) * 2017-11-29 2021-09-28 南京大学 Virtual and real lane line detection method for traffic monitoring scene
CN108009524B (en) * 2017-12-25 2021-07-09 西北工业大学 Lane line detection method based on full convolution network
CN108280450B (en) * 2017-12-29 2020-12-29 安徽农业大学 Expressway pavement detection method based on lane lines
CN108519605B (en) * 2018-04-09 2021-09-07 重庆邮电大学 Road edge detection method based on laser radar and camera
CN109300139B (en) * 2018-09-30 2021-10-15 百度在线网络技术(北京)有限公司 Lane line detection method and device


Also Published As

Publication number Publication date
CN113792690B (en) 2023-06-23
CN113792690A (en) 2021-12-14
CN113793356B (en) 2023-06-23
CN109300139A (en) 2019-02-01
CN113793356A (en) 2021-12-14

Similar Documents

Publication Publication Date Title
CN109300139B (en) Lane line detection method and device
CN108960090B (en) Video image processing method and device, computer readable medium and electronic equipment
CN108710885B (en) Target object detection method and device
US20150294490A1 (en) System and method for relating corresponding points in images with different viewing angles
CN113869293B (en) Lane line recognition method and device, electronic equipment and computer readable medium
CN109255337B (en) Face key point detection method and device
US20140133746A1 (en) Background understanding in video data
US11756224B2 (en) Circle center detection in imagery
CN109118456B (en) Image processing method and device
CN113607185B (en) Lane line information display method, lane line information display device, electronic device, and computer-readable medium
CN108492284B (en) Method and apparatus for determining perspective shape of image
CN111598913B (en) Image segmentation method and system based on robot vision
Jog et al. Automated computation of the fundamental matrix for vision based construction site applications
CN112001357B (en) Target identification detection method and system
CN112785651B (en) Method and apparatus for determining relative pose parameters
CN110852250B (en) Vehicle weight removing method and device based on maximum area method and storage medium
CN111754467A (en) Hough transform-based parking space detection method and device, computer equipment and storage medium
CN109523564B (en) Method and apparatus for processing image
CN110634155A (en) Target detection method and device based on deep learning
WO2015151553A1 (en) Change detection assistance device, change detection assistance method, and computer-readable recording medium
CN110634159A (en) Target detection method and device
CN112868049B (en) Efficient self-motion estimation using patch-based projection correlation
CN110852252B (en) Vehicle weight-removing method and device based on minimum distance and maximum length-width ratio
CN111383337B (en) Method and device for identifying objects
CN112991179B (en) Method, apparatus, device and storage medium for outputting information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TA01 Transfer of patent application right

Effective date of registration: 20211011

Address after: 100176 Room 101, 1st floor, building 1, yard 7, Ruihe West 2nd Road, economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Zhilian (Beijing) Technology Co.,Ltd.

Address before: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Applicant before: BAIDU ONLINE NETWORK TECHNOLOGY (BEIJING) Co.,Ltd.