CN109583365B - Lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting - Google Patents


Info

Publication number
CN109583365B
CN109583365B (application CN201811427546.XA)
Authority
CN
China
Prior art keywords
image
lane line
line
coordinate system
lane
Prior art date
Legal status
Active
Application number
CN201811427546.XA
Other languages
Chinese (zh)
Other versions
CN109583365A (en)
Inventor
穆柯楠 (Mu Kenan)
赵祥模 (Zhao Xiangmo)
王会峰 (Wang Huifeng)
惠飞 (Hui Fei)
卢勇 (Lu Yong)
杨澜 (Yang Lan)
景首才 (Jing Shoucai)
Current Assignee
Changan University
Original Assignee
Changan University
Application filed by Changan University
Priority to CN201811427546.XA
Publication of CN109583365A
Application granted
Publication of CN109583365B

Classifications

    • G06V20/588 - Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
    • G06T5/40 - Image enhancement or restoration by the use of histogram techniques
    • G06T5/70
    • G06T7/13 - Edge detection
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/20032 - Median filtering
    • G06T2207/20192 - Edge enhancement; edge preservation
    • G06T2207/30256 - Lane; road marking

Abstract

A lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting. First, median filtering and histogram equalization are applied to the image to obtain an enhanced lane line image; second, edge detection is performed with the Canny operator to obtain a lane line edge image; Hough transform line detection is then applied to the edge image, which improves edge continuity while reducing background interference edges. On the basis of the geometric camera imaging model, and under the assumptions that the camera optical axis is parallel to the road plane and that the left and right lane lines are parallel, a control point estimation model constrained by the lane line-camera imaging model is derived. Finally, the parameters of the non-uniform B-spline curve model are solved from the position information of the lane line edge pixels to fit the lane lines. The method effectively improves control point positioning precision and lane line detection accuracy, and improves the robustness of curve-fitting-based lane line detection against background interference.

Description

Lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting
Technical Field
The invention belongs to the field of traffic video detection, and particularly relates to a method for detecting a lane line based on imaging model constrained non-uniform B-spline curve fitting.
Background
Unmanned driving is currently a research hotspot in the field of intelligent transportation, and heavy research investment by many research institutes and enterprises in China and abroad has greatly accelerated its development. Lane lines are essential information for vehicle driving behaviors such as lane keeping and lane changing, and are important environmental data in the environment perception of an unmanned vehicle. The performance of the lane line detection method therefore has a non-negligible influence on the performance of the unmanned vehicle's environment perception system and on the safety of the entire autonomous driving system.
The main purpose of lane line detection is to extract the position information of the lane lines from a video image. The lane line detection methods commonly used in the art can be roughly classified into region-based, feature-based and model-based methods, of which model-based methods are the most common. These methods rest on the idea that the course of a lane line on a structured road can be approximated by a specific mathematical model, so lane lines of different shapes (straight lines, parabolas, hyperbolas, spline curves and the like) are fitted with the corresponding linear, parabolic, serpentine or other mathematical models, greatly reducing detection cost while preserving detection accuracy. A spline curve, expressed as a piecewise polynomial, can accurately fit curves of arbitrary shape, and is therefore widely used in lane line detection. Analysis of the related research shows that the determination of the control points is the key to fitting lane lines with B-spline curves. However, interference such as vehicle occlusion, tree shadows, building shadows and road surface damage makes extracting the model control points considerably harder, which degrades the accuracy of the lane line fit or even causes the fit to fail. How to effectively improve the robustness of control point extraction against background interference (vehicle occlusion, tree shadows, building shadows, other road markings, road damage and the like) and improve control point positioning accuracy while keeping the time cost of the algorithm acceptable has therefore become a key problem in improving the efficiency of lane line detection methods based on spline curve models.
Disclosure of Invention
The invention aims to overcome the problem that traditional lane line detection methods based on spline curve models are easily disturbed by the background, making control point positioning inaccurate or causing it to fail, and provides a lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting.
To achieve this aim, the invention adopts the following technical scheme:
The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting comprises the following steps:
Step one: image preprocessing;
An original lane line image I is acquired from the lane line standard image library in the Carnegie Mellon image database; median filtering is applied to I to remove salt-and-pepper noise, and histogram equalization is then applied to enhance the brightness and contrast of the image so that edge features stand out, yielding the enhanced lane line image $I_1$.
Step two: edge detection;
Canny edge detection is applied to the enhanced lane line image $I_1$ to obtain the initial lane line edge image $I_2$.
Step three: Hough line detection;
Hough line detection is applied to the initial lane line edge image $I_2$ obtained in step two; edges containing line detection results are retained and other interference edges are removed, yielding the edge image $I_3$.
Step four: derivation of the imaging model constraints;
The derived imaging model constraints are as follows:
The length $\Delta v$ of the line segment in the $u$-th column of the image coordinate system corresponding to a lane line segment of length $\Delta Y$ in the world coordinate system is:

[Equation (11), given as an image in the original]

The spacing width $\Delta u$ in the $v$-th row of the image coordinate system corresponding to a left-right lane line spacing of width $\Delta X$ in the world coordinate system is:

[Equation (12), given as an image in the original]
step five: extracting control points of the non-uniform B-spline curve;
scanning lines are set in the image according to the length of delta Y in the corresponding world coordinate system and the constraint of an imaging model, and the intersection point of each scanning line and the edges of the left lane and the right lane is a pair of control points;
step six: fitting a lane line;
and fifthly, obtaining the non-uniform B-spline curve control point information, and fitting the left lane line and the right lane line by using a NUBS interpolation method to complete the detection of the lane lines.
In a further improvement of the invention, the median filtering in step one uses the median filter function $\hat{f}(x,y)$:

$$\hat{f}(x,y)=\operatorname*{median}_{(a,b)\in S_{xy}}\{f(a,b)\} \qquad (1)$$

where $\hat{f}(x,y)$ is the median filter output, $S_{xy}$ is the set of coordinates of a rectangular sub-image window of size $M \times N$ centered at $(x,y)$, and $f(a,b)$ is the pixel gray value at coordinates $(a,b)$.
In a further improvement of the invention, the histogram equalization in step one uses the histogram equalization function $s_k$:

$$s_k=\sum_{i=0}^{k}\frac{n_i}{n}=\sum_{i=0}^{k}p_r(r_i) \qquad (2)$$

where $s_k$ is the histogram-equalized output, $r_k$ denotes the discrete gray levels with $0 \le r_k \le 255$ and $k=0,1,2,\dots,n-1$, $n_i$ is the number of pixels in the image with gray level $r_i$, $n$ is the total number of pixels in the image, and $p_r(r_i)=n_i/n$ is the frequency in the sense of probability theory.
In a further improvement of the invention, step two comprises the following specific sub-steps:
(1) smooth the image $I_1$ with a Gaussian filter;

The Gaussian smoothing function $G(x,y)$ is:

$$G(x,y)=\frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{x^2+y^2}{2\sigma^2}\right) \qquad (3)$$

Convolving $G(x,y)$ with the enhanced lane line image $I_1$ gives the smoothed image $f_1$:

$$f_1(x,y)=I_1(x,y)*G(x,y) \qquad (4)$$
(2) compute the gradient magnitude and direction using finite differences of the first-order partial derivatives, obtaining the gradient image $f_2$;

[First-order difference convolution templates, given as images in the original]
(3) apply non-maximum suppression to the gradient magnitude, obtaining the non-maximum-suppressed image $f_3$;

At each point of the gradient image $f_2$, the central pixel $S$ of the 8-neighborhood is compared with the two pixels along the gradient line; if the gradient value of $S$ is not greater than the gradient values of the two adjacent pixels along the gradient line, $S$ is set to 0;
(4) detect and connect edges with the double-threshold algorithm;

For the non-maximum-suppressed image $f_3$, two thresholds $T_1$ and $T_2$ are set with $T_1 = 0.4T_2$. The gray values of pixels whose gradient value is less than $T_1$ are set to 0, giving image $f_4$; then the gray values of pixels whose gradient value is less than $T_2$ are set to 0, giving image $f_5$. Taking image $f_5$ as the basis and image $f_4$ as the supplement, the image edges are connected to obtain the initial lane line edge image $I_2$.
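A minimal sketch of step two follows, assuming OpenCV's `cv2.Canny`, which performs the gradient computation, non-maximum suppression and double-threshold hysteresis of sub-steps (2)-(4) internally; the value of $T_2$ is an illustrative assumption, and only the relation $T_1 = 0.4T_2$ comes from the method.

```python
import cv2

def detect_edges(I1, T2=120):
    """Step two sketch: Canny edge detection with T1 = 0.4*T2 (T2 illustrative)."""
    f1 = cv2.GaussianBlur(I1, (5, 5), 1)   # sub-step (1): Gaussian smoothing, sigma = 1
    T1 = int(0.4 * T2)                     # double-threshold relation from the method
    I2 = cv2.Canny(f1, T1, T2)             # sub-steps (2)-(4): gradient, NMS, hysteresis
    return I2                              # initial lane line edge image I_2
```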
In a further improvement of the invention, step three comprises the following specific sub-steps:
(1) Hough line detection;

For any point $A(x_0, y_0)$ in the rectangular coordinate system, a straight line through $A$ satisfies:

$$y=kx+l \qquad (5)$$

where $k$ is the slope and $l$ is the intercept. The family of lines through $A(x_0, y_0)$ in the X-Y plane can thus be expressed by equation (5), except that a line perpendicular to the X axis has infinite slope and cannot be expressed; the rectangular coordinate representation is therefore converted to polar form.

In polar form a straight line is represented by:

$$\rho=x\cos\theta+y\sin\theta \qquad (6)$$

where $\rho$ is the normal distance from the origin to the line and $\theta$ is the angle between the normal and the positive X axis. A point in image space corresponds to a sinusoid in the $\rho$-$\theta$ parameter space, so straight lines in image space are detected by detecting intersection points in $\rho$-$\theta$ space. $\rho$ and $\theta$ are discretized; for each value of $\theta$, the corresponding value of $\rho$ is computed from equation (6) and the corresponding accumulator cell is incremented by 1. Finally, the value of each accumulator cell is examined: if it is greater than a preset threshold $H$, the parameter pair is taken to be the parameters of a straight line in image space, and the line is marked in the image;
(2) interference edge removal;

For each edge pixel on a line marked in (1), the complete edge containing that pixel is found and retained, and edges sharing no pixels with the marked lines are eliminated, yielding the edge image $I_3$.
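A minimal sketch of the interference-edge removal follows, assuming the marked lines have been rasterized into a binary `line_mask`; treating "the whole edge containing the pixel" as a connected component is one way to realize this step.

```python
import cv2
import numpy as np

def keep_line_edges(I2, line_mask):
    """Retain only the connected edges that share pixels with the marked lines."""
    n, labels = cv2.connectedComponents((I2 > 0).astype(np.uint8))
    keep = np.unique(labels[(line_mask > 0) & (labels > 0)])  # components touching a marked line
    I3 = np.where(np.isin(labels, keep), I2, 0)               # zero out interference edges
    return I3.astype(I2.dtype)                                # edge image I_3
```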
In a further improvement of the invention, the specific process of step four is as follows:
It is assumed that the optical axis of the camera is parallel to the road plane on which the vehicle travels, and that the left and right lane lines are parallel.
Given the world coordinate system $(X, Y, Z)$ and the image coordinate system $(U, V)$, the maximum horizontal viewing angle of the camera is $\alpha$, the maximum vertical viewing angle is $\beta$, and the camera mounting position in the world coordinate system is $C(d, 0, h)$, where $h$ is the camera mounting height, i.e. the camera's value on the Z axis of the world coordinate system, and $d$ is the camera's horizontal mounting offset, i.e. its value on the X axis of the world coordinate system. The camera optical axis is parallel to the road plane, and the angle between the optical axis and the lane line is $\gamma$. According to the geometric camera imaging model, the mapping between a point $P(x, y, 0)$ on the road surface in the world coordinate system and its corresponding point $Q(u, v)$ in the image coordinate system is:
[Equations (7)-(10), given as images in the original]
where $H_I$ and $W_I$ are the horizontal and vertical resolution, respectively, of the image produced by the camera;
According to the camera imaging principle, the imaged length of a lane line segment shortens as the distance between the segment and the camera in the world coordinate system increases; likewise, for the same left-right lane line spacing on the road surface in the world coordinate system, the imaged spacing is wider in the near field of view and narrower in the far field of view. Combining this with the geometric camera imaging model, the length $\Delta v$ of the line segment in the $u$-th column of the image coordinate system corresponding to a lane line segment of length $\Delta Y$ in the world coordinate system is derived as:

[Equation (11), given as an image in the original]

The spacing width $\Delta u$ in the $v$-th row of the image coordinate system corresponding to a left-right lane line spacing of width $\Delta X$ in the world coordinate system is:

[Equation (12), given as an image in the original]
the invention has the further improvement that the concrete process of the step five is as follows:
Starting from the bottom of the lane line edge image, horizontal scan lines $\mathrm{Line}\,i$ are placed at rows $v_i$, with $m \le i \le n$; the intersections of $\mathrm{Line}\,i$ with the left and right lane lines give the control point pair $(L_i, R_i)$, where $L_i$ has coordinates $(u_i, v_i)$ and $R_i$ has coordinates $(u_i', v_i)$. From the imaging model constraint, $v_i$ is defined by:

[Equation (13), given as an image in the original]

From equations (8), (10) and (11) the following is derived:

[Equation (14), given as an image in the original]

where $v_1$ and $\Delta v_1$ are preset values. Substituting equation (14) into equation (13) gives the value of $v_i$; the ordinate of the control points determined by the $i$-th scan line $\mathrm{Line}\,i$ is thus $v_i$. Edge points are searched from the midpoint of the scan line toward the left and right sides; the first intersections of the scan line with the left and right lane lines are taken as control points, determining the control point pair coordinates $(u_i, v_i)$ and $(u_i', v_i)$.
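A minimal sketch of this scan-line intersection search follows; since equations (13)-(14) are given only as images in the patent, the rows $v_i$ are taken as a precomputed input here.

```python
import numpy as np

def control_points(I3, rows):
    """Step five sketch: intersect scan lines with the left/right lane edges."""
    pairs = []
    mid = I3.shape[1] // 2                      # midpoint of each scan line
    for v in rows:                              # rows = the precomputed v_i
        line = I3[v]
        left = np.nonzero(line[:mid])[0]        # edge pixels left of the midpoint
        right = np.nonzero(line[mid:])[0]       # edge pixels right of the midpoint
        if left.size and right.size:
            u_l = left[-1]                      # first edge met searching leftward
            u_r = mid + right[0]                # first edge met searching rightward
            pairs.append(((u_l, v), (u_r, v)))  # control point pair (L_i, R_i)
    return pairs
```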
In a further improvement of the invention, missing or mislocated control points in step five are handled as follows:
Assuming the left and right lane lines are parallel, the abscissas $u_i$ and $u_i'$ of the control points $L_i$, $R_i$ determined by the $i$-th scan line $\mathrm{Line}\,i$ are solved. From the imaging model constraint and equation (14), the relation between $\Delta u_{i+1}$ and $\Delta u_i$ is derived as:

[Equation (15), given as an image in the original]

$$u_i'=u_i+\Delta u_i \qquad (16)$$

The angle $\gamma$ between the camera optical axis and the lane line is calculated as:

[Equation (17), given as an image in the original]

For the case where a control point is lost because the lane line edge is missing, when the adjacent control points $L_1$, $L_2$ are known the abscissa $u_2$ of the missing control point is calculated according to equations (15)-(17). For the case of control point mislocation caused by false edges, it is verified whether all adjacent control point pairs satisfy the spacing-width ratio of equation (15), so that erroneous control point coordinates are detected and repositioned according to equations (15)-(17).
In a further improvement of the invention, the specific process of step six is as follows:
Suppose the B-spline curve $S$ is defined by the set of $n+1$ control points $\{P_0, P_1, \dots, P_n\}$; then each point on the curve $S$ satisfies:

$$S(u)=\sum_{i=0}^{n}P_iB_{i,m}(u) \qquad (18)$$

where $B_{i,m}(u)$ is the B-spline basis function, $2 \le m \le n+1$, $t_{\min} \le u \le t_{\max}$, and the $t_j$ ($j = 0, \dots, i+m$) are knots; when the knots $t_j$ are equally spaced the curve is called a uniform B-spline, otherwise a non-uniform B-spline. According to the NUBS interpolation method, if $m \ge 3$ pairs of control points are known, the lane line is fitted with a polynomial function of order $m-1$: if 4 pairs of control points can be determined, NUBS interpolation uses a third-order polynomial function to fit the lane line; if only 3 pairs of control points are determined, a second-order polynomial function is used.
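A minimal sketch of the fitting step follows, using SciPy's interpolating B-spline as a stand-in for the NUBS interpolation described here; the order selection mirrors the rule above, and at least 3 control points per lane line are assumed.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_lane(points):
    """Step six sketch: interpolating B-spline through one lane line's control points."""
    pts = np.asarray(points, dtype=float)
    k = 3 if len(pts) >= 4 else 2                        # polynomial order m-1
    tck, _ = splprep([pts[:, 0], pts[:, 1]], k=k, s=0)   # s=0 forces interpolation
    u = np.linspace(0.0, 1.0, 200)
    x, y = splev(u, tck)                                 # sample the curve S(u)
    return np.stack([x, y], axis=1)
```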
Compared with the prior art, the invention has the following beneficial effects: first, median filtering and histogram equalization are applied to the image to obtain an enhanced lane line image; second, Canny edge detection is applied to obtain a lane line edge image; Hough transform line detection is then performed on the edge image, improving edge continuity while reducing background interference edges; finally, the parameters of the non-uniform B-spline curve model are solved from the position information of the lane line edge pixels to fit the lane lines. The method effectively improves control point positioning precision and lane line detection accuracy, and improves the robustness of curve-fitting-based lane line detection against background interference.
Further, on the basis of the geometric camera imaging model and under the assumptions that the camera optical axis is parallel to the road plane and that the left and right lane lines are parallel, a control point estimation model constrained by the lane line-camera imaging model is derived. This reduces the interference of vehicle occlusion, tree shadows, building shadows, road surface damage and various non-lane-line surface markings on the determination of the control points, improves the robustness of the non-uniform B-spline control point extraction method, and improves lane line detection accuracy.
Drawings
FIG. 1 is a schematic diagram of the position of a camera in a world coordinate system;
FIG. 2 is a schematic diagram of the control point determination process when the lane line is continuous;
FIG. 3 shows the first lane line detection result;
FIG. 4 is a schematic diagram of the control point determination process when the lane line is discontinuous;
FIG. 5 shows the second lane line detection result;
FIG. 6 is a flow chart of a non-uniform B-spline curve fitting lane line detection algorithm based on imaging model constraints.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings.
The invention provides a lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting, comprising the following steps:
Step one: image preprocessing;
An original lane line image I is acquired from the lane line standard image library in the Carnegie Mellon image database; median filtering is applied to I to remove salt-and-pepper noise, and histogram equalization is then applied to enhance the brightness and contrast of the image so that edge features stand out, yielding the enhanced lane line image $I_1$.
The median filter function $\hat{f}(x,y)$ is:

$$\hat{f}(x,y)=\operatorname*{median}_{(a,b)\in S_{xy}}\{f(a,b)\} \qquad (1)$$

where $\hat{f}(x,y)$ is the median filter output, $S_{xy}$ is the set of coordinates of a rectangular sub-image window of size $M \times N$ centered at $(x,y)$, and $f(a,b)$ is the pixel gray value at coordinates $(a,b)$.
The histogram equalization function $s_k$ is:

$$s_k=\sum_{i=0}^{k}\frac{n_i}{n}=\sum_{i=0}^{k}p_r(r_i) \qquad (2)$$

where $s_k$ is the histogram-equalized output, $r_k$ denotes the discrete gray levels ($0 \le r_k \le 255$, $k=0,1,2,\dots,n-1$), $n_i$ is the number of pixels in the image with gray level $r_i$, $n$ is the total number of pixels in the image, and $p_r(r_i)=n_i/n$ is the frequency in the sense of probability theory.
Step two: edge detection;
Canny edge detection is applied to the enhanced lane line image $I_1$ to obtain the initial lane line edge image $I_2$.
The specific steps are as follows:
(1) smooth the image $I_1$ with a Gaussian filter;

The Gaussian smoothing function $G(x,y)$ is:

$$G(x,y)=\frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{x^2+y^2}{2\sigma^2}\right) \qquad (3)$$

Convolving $G(x,y)$ with image $I_1$ gives the smoothed image $f_1$:

$$f_1(x,y)=I_1(x,y)*G(x,y) \qquad (4)$$
(2) compute the gradient magnitude and direction using finite differences of the first-order partial derivatives, obtaining the gradient image $f_2$;

[First-order difference convolution templates, given as images in the original]
(3) apply non-maximum suppression to the gradient magnitude, obtaining the non-maximum-suppressed image $f_3$;

At each point of the gradient image $f_2$, the central pixel $S$ of the 8-neighborhood is compared with the two pixels along the gradient line. If the gradient value of $S$ is not greater than the gradient values of the two adjacent pixels along the gradient line, $S$ is set to 0.
(4) detect and connect edges using the double-threshold algorithm.

For the non-maximum-suppressed image $f_3$, two thresholds $T_1$ and $T_2$ are set with $T_1 = 0.4T_2$. The gray values of pixels whose gradient value is less than $T_1$ are set to 0, giving image $f_4$; then the gray values of pixels whose gradient value is less than $T_2$ are set to 0, giving image $f_5$. Taking image $f_5$ as the basis and image $f_4$ as the supplement, the image edges are connected to obtain the edge image $I_2$.
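A minimal sketch of this double-threshold linking follows, assuming a gradient-magnitude image `f3` after non-maximum suppression; SciPy's connected-component labeling stands in for the edge-connection step.

```python
import numpy as np
from scipy import ndimage

def hysteresis(f3, T2):
    """Sub-step (4) sketch: double-threshold edge linking with T1 = 0.4*T2."""
    T1 = 0.4 * T2
    f4 = f3 >= T1                                             # weak + strong edges (image f_4)
    f5 = f3 >= T2                                             # strong edges only (image f_5)
    labels, _ = ndimage.label(f4, structure=np.ones((3, 3)))  # 8-connected components of f_4
    strong = np.unique(labels[f5])                            # components touching a strong pixel
    strong = strong[strong > 0]                               # drop the background label
    return np.isin(labels, strong)                            # linked edge map I_2
```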
Step three: Hough line detection;
Hough line detection is applied to the lane line edge image $I_2$ obtained in step two; only edges containing line detection results are kept and other interference edges are removed, yielding the edge image $I_3$. The specific steps are as follows:
(1) Hough line detection;

For any point $A(x_0, y_0)$ in the rectangular coordinate system, a straight line through $A$ satisfies:

$$y=kx+l \qquad (5)$$

where $k$ is the slope and $l$ is the intercept. The family of lines through $A(x_0, y_0)$ in the X-Y plane can thus be expressed by equation (5), except that a line perpendicular to the X axis has infinite slope and cannot be expressed. This special case is handled by converting the rectangular coordinate representation to polar form.

In polar form a straight line is represented by:

$$\rho=x\cos\theta+y\sin\theta \qquad (6)$$

where $\rho$ is the normal distance from the origin to the line and $\theta$ is the angle between the normal and the positive X axis. A point in image space then corresponds to a sinusoid in the $\rho$-$\theta$ parameter space, and straight lines in image space are detected by detecting intersection points in $\rho$-$\theta$ space. $\rho$ and $\theta$ are discretized; for each value of $\theta$, the corresponding value of $\rho$ is computed from equation (6) and the corresponding accumulator cell is incremented by 1. Finally, the value of each accumulator cell is examined: if it is greater than a preset threshold $H$, the parameter pair is taken to be the parameters of a straight line in image space, and the line is marked in the image;
(2) interference edge removal;

For each edge pixel on a line marked in (1), the complete edge containing that pixel is found and retained, and edges sharing no pixels with the marked lines are eliminated, yielding the edge image $I_3$.
Step four: derivation of the imaging model constraints;
On the basis of the geometric camera imaging model, the imaging model constraints are derived under the assumptions that the camera optical axis is parallel to the road plane on which the vehicle travels and that the left and right lane lines are parallel.
Given the world coordinate system $(X, Y, Z)$ and the image coordinate system $(U, V)$, the maximum horizontal viewing angle of the camera is $\alpha$, the maximum vertical viewing angle is $\beta$, and the camera mounting position in the world coordinate system is $C(d, 0, h)$, where $h$ is the camera mounting height, i.e. the camera's value on the Z axis of the world coordinate system, and $d$ is the camera's horizontal mounting offset, i.e. its value on the X axis of the world coordinate system. The camera optical axis is parallel to the road plane, and the angle between the optical axis and the lane line is $\gamma$. According to the geometric camera imaging model, the mapping between a point $P(x, y, 0)$ on the road surface in the world coordinate system and its corresponding point $Q(u, v)$ in the image coordinate system is:
[Equations (7)-(10), given as images in the original]
where $H_I$ and $W_I$ are the horizontal and vertical resolution, respectively, of the image produced by the camera.
According to the camera imaging principle, the imaged length of a lane line segment shortens as the distance between the segment and the camera in the world coordinate system increases; that is, lane line segments of the same length on the road in the world coordinate system appear longer in the near field of view and shorter in the far field of view. Similarly, the same left-right lane line spacing on the road surface in the world coordinate system appears wider in the near field of view and narrower in the far field of view. From this imaging fact, combined with the camera imaging model, the length $\Delta v$ of the line segment in the $u$-th column of the image coordinate system corresponding to a lane line segment of length $\Delta Y$ in the world coordinate system can be derived as:

[Equation (11), given as an image in the original]

and the spacing width $\Delta u$ in the $v$-th row of the image coordinate system corresponding to a left-right lane line spacing of width $\Delta X$ in the world coordinate system as:

[Equation (12), given as an image in the original]
step five: extracting control points of the non-uniform B-spline curve;
in order to solve the parameters of the NUBS curve model, a proper control point needs to be determined firstly, and the method comprises the following steps: and (3) from the bottom of the image of the lane line edge, setting scanning lines in the image according to the imaging model constraint by using the length corresponding to delta Y in the world coordinate system, wherein the intersection point of each scanning line and the left and right lane edges is a pair of control points. The mathematical description of the process is as follows: starting from the bottom of the lane line edge image at v i Horizontal scanning lines Line i are arranged in rows (i is more than or equal to m and less than or equal to n), and control point pairs (Li, R) are obtained at the intersection points of the Line i and the left and right lane lines i ) Wherein L is i Has the coordinates of (u) i ,v i ),R i The coordinate is (u) i ',v i ). The lane line is in a longitudinal extending trend in the image, the ordinate of each control point pair is reduced in sequence, the trend is not influenced by the included angle gamma between the optical axis of the camera and the lane line, the value of the gamma is ignored for simplifying the calculation process, and v is defined according to the constraint of the imaging model i The calculation formula is as follows:
Figure BDA0001881976450000122
derived from equations (8), (10), (11):
Figure BDA0001881976450000123
wherein v is 1 ,Δv 1 Is a preset value. By substituting formula (14) for formula (13), v can be determined sequentially i The value of (c). It can be determined from this that the control point ordinate determined by the ith scanning Line i is equal to v i . From the midpoint of the scan lineSearching edge points to the left and right sides, obtaining the intersection point of the first scanning line pair and the left and right lane lines as a control point, and then obtaining the abscissa of the control point pair so as to determine the coordinate (u) i ,v i ) And (u' i ,v i )。
In the above control point determination process it is assumed that the left and right lane lines are not occluded, i.e. that the control points determined by each scan line are the true intersections of the scan line with the left and right lane lines. In practice, however, the detected lane lines often contain interference edges or discontinuous edges caused by vehicle occlusion, tree shadows, building shadows and road surface damage; in addition, the edges of dashed lane lines are themselves discontinuous. As a result, the intersection of a scan line with the left or right lane line may not be the actual correct intersection, or may not exist.
To address these problems, on the basis of the above control point determination method and combined with the assumption that the left and right lane lines are parallel, the abscissas $u_i$ and $u_i'$ of the control points $L_i$, $R_i$ determined by the $i$-th scan line $\mathrm{Line}\,i$ are solved. From the imaging model constraint and equation (12), the relation between $\Delta u_{i+1}$ and $\Delta u_i$ is derived as:

[Equation (15), given as an image in the original]

$$u_i'=u_i+\Delta u_i \qquad (16)$$

The angle $\gamma$ between the camera optical axis and the lane line is calculated as:

[Equation (17), given as an image in the original]

For the case where a control point is lost because the lane line edge is missing, when the adjacent control points $L_1$, $L_2$ are known the abscissa $u_2$ of the missing control point can be calculated according to equations (15)-(17); for the case of control point mislocation caused by false edges, it can be verified whether all adjacent control point pairs satisfy the spacing-width ratio of equation (15), so that erroneous control point coordinates are detected and repositioned according to equations (15)-(17).
Step six: lane line fitting.
The control point determination method of step five yields the control point information of the lane line edges, and the NUBS interpolation method can then be used to fit the left and right lane lines. The mathematical model of the B-spline curve is as follows:
Suppose the B-spline curve $S$ is defined by the set of $n+1$ control points $\{P_0, P_1, \dots, P_n\}$; then each point on the curve $S$ satisfies:

$$S(u)=\sum_{i=0}^{n}P_iB_{i,m}(u) \qquad (18)$$

where $B_{i,m}(u)$ is the B-spline basis function, $2 \le m \le n+1$, $t_{\min} \le u \le t_{\max}$, and the $t_j$ ($j = 0, \dots, i+m$) are knots; when the knots $t_j$ are equally spaced the curve is called a uniform B-spline, otherwise a non-uniform B-spline. According to the NUBS interpolation method: if $m$ ($m \ge 3$) pairs of control points are known, the lane line can be fitted with a polynomial function of order $m-1$; if 4 pairs of control points can be determined, NUBS interpolation can use a third-order polynomial function to fit the lane line; if only 3 pairs of control points are determined, a second-order polynomial function may be used.
Substituting the control point coordinates determined in step five into equation (18), the spline curve $S(u)$ is solved and displayed in the original lane line image I, completing the lane line detection.
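For reference, an end-to-end sketch chaining the step sketches above follows; the default `rows` are the $v_i$ values reported in the embodiment below, whereas the method itself derives them from equations (13)-(14).

```python
import cv2
import numpy as np

def detect_lane_lines(path, rows=(167, 117, 78, 46, 21)):
    """End-to-end sketch using preprocess, detect_edges, hough_lines,
    keep_line_edges, control_points and fit_lane defined above."""
    I1 = preprocess(path)                        # step one
    I2 = detect_edges(I1)                        # step two
    mask = np.zeros_like(I2)
    for rho, theta in hough_lines(I2):           # step three: mark detected lines
        a, b = np.cos(theta), np.sin(theta)
        p0 = (int(a * rho - 2000 * b), int(b * rho + 2000 * a))
        p1 = (int(a * rho + 2000 * b), int(b * rho - 2000 * a))
        cv2.line(mask, p0, p1, 255, 1)
    I3 = keep_line_edges(I2, mask)               # step three: remove interference edges
    pairs = control_points(I3, rows)             # step five: control point pairs
    left = fit_lane([p[0] for p in pairs])       # step six: fit left lane line
    right = fit_lane([p[1] for p in pairs])      # step six: fit right lane line
    return left, right
```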
This is illustrated below by means of a specific example.
Referring to FIG. 6, the method of the invention performs edge detection on a lane line image of size $W_I \times H_I$ (240 × 256), and determines the control points by combining the imaging model constraint with the lane line edge position information, thereby solving the non-uniform B-spline curve parameters and realizing lane line detection.
The method is realized by the following steps:
the method comprises the following steps: obtaining an original lane line image I from a lane line standard image library in an intra-card Chiilong image database, carrying out median filtering on the image I to remove salt-pepper noise, then carrying out histogram equalization to enhance the brightness and contrast of the image so as to enable edge features to be prominent, and obtaining an enhanced lane line image I 1 . In the invention, a 3 multiplied by 3 rectangular sub-image window is taken during median filtering, and a gray discrete parameter r is taken during histogram equalization k In the range of 0 to r k ≤255。
Step two: Canny edge detection is applied to the enhanced lane line image $I_1$ to obtain the initial lane line edge image $I_2$. The smoothing parameter of the Gaussian smoothing function used for the edge detection is $\sigma = 1$; the first-order difference convolution templates are given as images in the original; and the double thresholds $T_1$ and $T_2$ take default values satisfying $T_1 = 0.4T_2$.
Step three: Hough line detection is applied with the Hough algorithm to the initial lane line edge image $I_2$ obtained in step two; for each edge pixel on a marked line, the complete edge containing that pixel is found and retained, and edges sharing no pixels with the marked lines are eliminated, yielding the edge image $I_3$. The parameters of the Hough algorithm all take default values;
step four: from lane line edge image I 3 Starting from the bottom, arranging horizontal scanning lines Line i (i is more than or equal to 2 and less than or equal to 6) in the vi rows, and obtaining control point pairs (Li, R) at the intersection points of the Line i and the left and right lane lines i ) Wherein L is i Has the coordinates of (u) i ,v i ),R i The coordinate is (u) i ',v i ). V in the invention 1 =0,Δv 1 20 is a preset value. By substituting formula (14) for formula (13), v can be determined sequentially i The value of (c). It can be determined from this that the control point ordinate determined by the ith scanning Line i is equal to v i . Searching edge points from the middle point of the scanning line to the left and right sides respectively to obtain the intersection point of the first pair of scanning lines and the left and right lane lines as the intersection pointThe control point can know the abscissa of the control point pair, and determine its coordinate (u) i ,v i ) And (u' i ,v i )。
And (3) actually detecting the interference edge or discontinuous edge of the obtained lane line caused by vehicle occlusion, tree shadow, building shadow, road surface damage or a virtual lane line. For the case that the control point is lost due to the missing of the lane line edge, L can be paired at the adjacent control point 1 、L 2 If known, the abscissa u of the control point is calculated according to equations (15) to (17) 2 (ii) a In the case of control point mislocation caused by false edges, it is possible to verify whether all the neighboring control point pair pitch width ratios satisfy equation (15), thereby detecting the wrong control point coordinates and relocating them according to equations (15) - (17).
Step five: the control point determination of step four yields the control point information of the lane line edges, and the NUBS interpolation method can be used to fit the left and right lane lines. If 4 pairs of control points can be determined, NUBS interpolation uses a third-order polynomial function to fit the lane line; if only 3 pairs of control points are determined, a second-order polynomial function may be used. Substituting the control point coordinates determined in step four into equation (18), the spline curve $S(u)$ is solved and displayed in the original lane line image I, completing the lane line detection.
A lane line image of size 240 × 256 is subjected to median filtering and histogram equalization, and the Canny edge detection operator is used to extract the lane line edge features. Hough line detection is then applied to the lane line edge image: exploiting the fact that lane line edges in the near field of view are straight, only edges containing line detection results are kept, further eliminating interference edges formed by background buildings, tree shadows, obstacles, road surface holes, cracks and the like.
FIG. 1 is a schematic diagram of the camera position in the world coordinate system; the imaging model, expressed by equations (11) and (12), is derived from the camera imaging principle.
FIG. 2 is a schematic diagram of the control point determination process when the lane line edges are continuous. With the preset values $v_1 = 0$ and $\Delta v_1 = 20$, substituting equation (14) into equation (13) gives in turn $v_2 = 167$, $v_3 = 117$, $v_4 = 78$, $v_5 = 46$, $v_6 = 21$. Starting from the bottom of the lane line edge image, horizontal scan lines $\mathrm{Line}\,i$ ($2 \le i \le 6$) are placed at rows $v_i$, and the intersections of $\mathrm{Line}\,i$ with the left and right lane lines give the control point pairs $(L_i, R_i)$, where $L_i$ has coordinates $(u_i, v_i)$ and $R_i$ has coordinates $(u_i', v_i)$; the ordinate of the control points determined by $\mathrm{Line}\,i$ is $v_i$. Searching for edge points from the midpoint of each scan line toward the left and right sides, the first intersections with the left and right lane lines are taken as control points, whose abscissas determine the coordinates $(u_i, v_i)$ and $(u_i', v_i)$. This yields 5 pairs of control point coordinates: {(92,21),(117,21)}, {(80,46),(145,46)}, {(67,78),(175,78)}, {(51,117),(212,117)}, {(33,167),(254,167)}. NUBS interpolation is then performed with a fourth-order polynomial function according to the NUBS interpolation algorithm, and the resulting fitted curve is displayed in the lane line image as shown in FIG. 3.
FIG. 4 is a schematic diagram of the control point determination process when the lane line edges are discontinuous. With the same preset values $v_1 = 0$ and $\Delta v_1 = 20$, substituting equation (14) into equation (13) gives in turn $v_2 = 167$, $v_3 = 117$, $v_4 = 78$, $v_5 = 46$, $v_6 = 21$. Starting from the bottom of the lane line edge image, horizontal scan lines $\mathrm{Line}\,i$ ($2 \le i \le 6$) are placed at rows $v_i$; the intersections of $\mathrm{Line}\,5$ and $\mathrm{Line}\,6$ with the left lane line are missing, so only 3 left lane line control points are initially determined: {(15,167), (39,117), (59,78)}; the 5 right lane line control points are {(253,167), (213,117), (181,78), (155,46), (134,21)}. From equations (10)-(12), $\Delta u_5 = 80$, $u_5 = 75$ and $\Delta u_6 = 46$, $u_6 = 88$ are estimated, which determines the coordinates of the two missing left lane line control points as (75,46) and (88,21). NUBS interpolation is then performed with a fourth-order polynomial function according to the NUBS interpolation algorithm, and the resulting fitted curve is displayed in the lane line image as shown in FIG. 5.
As can be seen from FIG. 3 and FIG. 5, lane line detection performed according to the above method achieves good detection results. The embodiment shows that the scheme of the invention effectively improves the positioning accuracy and success rate of the non-uniform B-spline curve model control points while keeping the computation acceptable, and offers good detection accuracy and real-time performance.
While specific embodiments of the present invention have been described, it should be understood that the invention is not limited to the specific embodiments described above, and all equivalent modifications based on the technical solutions of the present invention are intended to fall within the scope of the invention.

Claims (7)

1. A lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting, characterized by comprising the following steps:
Step one: image preprocessing;
an original lane line image I is acquired from the lane line standard image library in the Carnegie Mellon image database; median filtering is applied to I to remove salt-and-pepper noise, and histogram equalization is then applied to enhance the brightness and contrast of the image so that edge features stand out, yielding the enhanced lane line image $I_1$;
Step two: edge detection;
Canny edge detection is applied to the enhanced lane line image $I_1$ to obtain the initial lane line edge image $I_2$;
Step three: Hough line detection;
Hough line detection is applied to the initial lane line edge image $I_2$ obtained in step two; edges containing line detection results are retained and other interference edges are removed, yielding the edge image $I_3$;
Step four: derivation of the imaging model constraints;
the derived imaging model constraints are as follows:
the length $\Delta v$ of the line segment in the $u$-th column of the image coordinate system corresponding to a lane line segment of length $\Delta Y$ in the world coordinate system is:

[Equation (11), given as an image in the original]

the spacing width $\Delta u$ in the $v$-th row of the image coordinate system corresponding to a left-right lane line spacing of width $\Delta X$ in the world coordinate system is:

[Equation (12), given as an image in the original]
the specific process of step four is as follows:
it is assumed that the optical axis of the camera is parallel to the road plane on which the vehicle travels and that the left and right lane lines are parallel;
given the world coordinate system $(X, Y, Z)$ and the image coordinate system $(U, V)$, the maximum horizontal viewing angle of the camera is $\alpha$, the maximum vertical viewing angle is $\beta$, and the camera mounting position in the world coordinate system is $C(d, 0, h)$, where $h$ is the camera mounting height, i.e. the camera's value on the Z axis of the world coordinate system, and $d$ is the camera's horizontal mounting offset, i.e. its value on the X axis of the world coordinate system; the camera optical axis is parallel to the road plane, and the angle between the optical axis and the lane line is $\gamma$; according to the geometric camera imaging model, the mapping between a point $P(x, y, 0)$ on the road surface in the world coordinate system and its corresponding point $Q(u, v)$ in the image coordinate system is:
[Equations (7)-(10), given as images in the original]
where $H_I$ and $W_I$ are the horizontal and vertical resolution, respectively, of the image produced by the camera;
according to the camera imaging principle, the imaged length of a lane line segment shortens as the distance between the segment and the camera in the world coordinate system increases; likewise, for the same left-right lane line spacing on the road surface in the world coordinate system, the imaged spacing is wider in the near field of view and narrower in the far field of view; combining this with the geometric camera imaging model, the length $\Delta v$ of the line segment in the $u$-th column of the image coordinate system corresponding to a lane line segment of length $\Delta Y$ in the world coordinate system is derived as:

[Equation (11), given as an image in the original]

the spacing width $\Delta u$ in the $v$-th row of the image coordinate system corresponding to a left-right lane line spacing of width $\Delta X$ in the world coordinate system is:

[Equation (12), given as an image in the original]
step five: extracting control points of the non-uniform B-spline curve;
setting scanning lines in an image according to the length of delta Y in a corresponding world coordinate system and the constraint of an imaging model, wherein the intersection point of each scanning line and the edges of the left lane and the right lane is a pair of control points; the specific process is as follows:
starting from the bottom of the lane line edge image at v i Horizontal scanning lines Line i are arranged in rows, m is less than or equal to i and less than or equal to n, and control point pairs (Li, R) are obtained at the intersection points of the Line i and the left and right lane lines i ) Wherein L is i Has the coordinates of (u) i ,v i ),R i The coordinate is (u) i ',v i ) (ii) a According to the imaging modelConstraint definition v i The calculation formula is as follows:
Figure FDA0003640946150000031
derived from equations (8), (10), (11):
Figure FDA0003640946150000032
wherein v is 1 ,Δv 1 Is a preset value; v is obtained by substituting formula (14) for formula (13) i A value of (d); thereby finding that the vertical coordinate of the control point determined by the ith scanning Line i is equal to v i (ii) a Searching edge points from the middle point of the scanning line to the left and right sides respectively, and determining the coordinates (u) of a pair of control points by using the intersection points of the first pair of scanning lines and the left and right lane lines as the control points i ,v i ) And (u) i ',v i );
Step six: lane line fitting;
using the non-uniform B-spline curve control point information obtained in step five, the left and right lane lines are fitted by the NUBS interpolation method, completing the lane line detection.
2. The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting according to claim 1, wherein in step one the median filtering uses the median filter function $\hat{f}(x,y)$:

$$\hat{f}(x,y)=\operatorname*{median}_{(a,b)\in S_{xy}}\{f(a,b)\} \qquad (1)$$

where $\hat{f}(x,y)$ is the median filter output, $S_{xy}$ is the set of coordinates of a rectangular sub-image window of size $M \times N$ centered at $(x,y)$, and $f(a,b)$ is the pixel gray value at coordinates $(a,b)$.
3. The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting according to claim 1, wherein in step one the histogram equalization uses the histogram equalization function $s_k$:

$$s_k=\sum_{i=0}^{k}\frac{n_i}{n}=\sum_{i=0}^{k}p_r(r_i) \qquad (2)$$

where $s_k$ is the histogram-equalized output, $r_k$ denotes the discrete gray levels with $0 \le r_k \le 255$ and $k=0,1,2,\dots,n-1$, $n_i$ is the number of pixels in the image with gray level $r_i$, $n$ is the total number of pixels in the image, and $p_r(r_i)=n_i/n$ is the frequency in the sense of probability theory.
4. The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting according to claim 1, wherein the specific sub-steps of step two are as follows:

(1) smooth the image $I_1$ with a Gaussian filter;

The Gaussian smoothing function $G(x,y)$ is:

$$G(x,y)=\frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{x^2+y^2}{2\sigma^2}\right) \qquad (3)$$

Convolving $G(x,y)$ with the enhanced lane line image $I_1$ gives the smoothed image $f_1$:

$$f_1(x,y)=I_1(x,y)*G(x,y) \qquad (4)$$

(2) compute the gradient magnitude and direction using finite differences of the first-order partial derivatives, obtaining the gradient image $f_2$;

[First-order difference convolution templates, given as images in the original]

(3) apply non-maximum suppression to the gradient magnitude, obtaining the non-maximum-suppressed image $f_3$;

At each point of the gradient image $f_2$, the central pixel $S$ of the 8-neighborhood is compared with the two pixels along the gradient line; if the gradient value of $S$ is not greater than the gradient values of the two adjacent pixels along the gradient line, $S$ is set to 0;

(4) detect and connect edges with the double-threshold algorithm;

For the non-maximum-suppressed image $f_3$, two thresholds $T_1$ and $T_2$ are set with $T_1 = 0.4T_2$. The gray values of pixels whose gradient value is less than $T_1$ are set to 0, giving image $f_4$; then the gray values of pixels whose gradient value is less than $T_2$ are set to 0, giving image $f_5$. Taking image $f_5$ as the basis and image $f_4$ as the supplement, the image edges are connected to obtain the initial lane line edge image $I_2$.
5. The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting according to claim 1, wherein the specific sub-steps of step three are as follows:

(1) Hough line detection;

For any point $A(x_0, y_0)$ in the rectangular coordinate system, a straight line through $A$ satisfies:

$$y=kx+l \qquad (5)$$

where $k$ is the slope and $l$ is the intercept. The family of lines through $A(x_0, y_0)$ in the X-Y plane can thus be expressed by equation (5), except that a line perpendicular to the X axis has infinite slope and cannot be expressed; the rectangular coordinate representation is therefore converted to polar form.

In polar form a straight line is represented by:

$$\rho=x\cos\theta+y\sin\theta \qquad (6)$$

where $\rho$ is the normal distance from the origin to the line and $\theta$ is the angle between the normal and the positive X axis. A point in image space corresponds to a sinusoid in the $\rho$-$\theta$ parameter space, so straight lines in image space are detected by detecting intersection points in $\rho$-$\theta$ space. $\rho$ and $\theta$ are discretized; for each value of $\theta$, the corresponding value of $\rho$ is computed from equation (6) and the corresponding accumulator cell is incremented by 1. Finally, the value of each accumulator cell is examined: if it is greater than a preset threshold $H$, the parameters are taken to be the parameters of a straight line in image space, and the line is marked in the image;

(2) interference edge removal;

For each edge pixel on a line marked in (1), the complete edge containing that pixel is found and retained, and edges sharing no pixels with the marked lines are eliminated, yielding the edge image $I_3$.
6. The lane line detection method based on imaging-model-constrained non-uniform B-spline curve fitting according to claim 1, wherein in step five, missing or mislocated control points are handled as follows:

Assuming the left and right lane lines are parallel, the abscissas $u_i$ and $u_i'$ of the control points $L_i$, $R_i$ determined by the $i$-th scan line $\mathrm{Line}\,i$ are solved. From the imaging model constraint and equation (14), the relation between $\Delta u_{i+1}$ and $\Delta u_i$ is derived as:

[Equation (15), given as an image in the original]

$$u_i'=u_i+\Delta u_i \qquad (16)$$

The angle $\gamma$ between the camera optical axis and the lane line is calculated as:

[Equation (17), given as an image in the original]

For the case where a control point is lost because the lane line edge is missing, when the adjacent control points $L_1$, $L_2$ are known the abscissa $u_2$ of the missing control point is calculated according to equations (15)-(17); for the case of control point mislocation caused by false edges, it is verified whether all adjacent control point pairs satisfy the spacing-width ratio of equation (15), so that erroneous control point coordinates are detected and repositioned according to equations (15)-(17).
7. The imaging model constraint-based non-uniform B-spline curve fitting lane line detection method according to claim 1, wherein the specific process of the sixth step is as follows:
suppose the B-spline curve S is determined by a set of n+1 control points {P_0, P_1, ..., P_n}; then each point on the curve S satisfies:
S(u) = Σ_{i=0}^{n} P_i·B_{i,m}(u)
where B_{i,m}(u) is a basis B-spline function, 2 ≤ m ≤ n+1, t_min ≤ u ≤ t_max, and t_j (j = 0, ..., i+m) is a knot; when the knots t_j are equally spaced, the B-spline curve is called a uniform B-spline curve, and otherwise a non-uniform B-spline curve; according to the NUBS interpolation method, if m control points are known, with m ≥ 3, the lane line is fitted with a polynomial function of order m-1; if 4 pairs of control points can be determined, NUBS interpolation with a third-order polynomial function is carried out to fit the lane line; if only 3 pairs of control points are determined, a second-order polynomial function is used to fit the lane line.
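As an illustration of this order-selection rule, the sketch below fits a curve through control points; scipy's splprep/splev are an assumed implementation vehicle, not the patent's own code, and the 200-sample density is arbitrary.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_lane_line(points):
    """Fit a lane line through control points by B-spline interpolation.

    points : (n, 2) array of control points (u, v) in image coordinates;
             order selection follows the rule above: third order when at
             least 4 control points are available, second order for 3.
    """
    pts = np.asarray(points, dtype=float)
    k = 3 if len(pts) >= 4 else 2            # spline order m - 1
    # s=0 forces the curve through the control points (interpolation)
    tck, _ = splprep([pts[:, 0], pts[:, 1]], k=k, s=0)
    xs, ys = splev(np.linspace(0.0, 1.0, 200), tck)
    return np.column_stack([xs, ys])         # sampled lane line curve
```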
CN201811427546.XA 2018-11-27 2018-11-27 Method for detecting lane line fitting based on imaging model constrained non-uniform B-spline curve Active CN109583365B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811427546.XA CN109583365B (en) 2018-11-27 2018-11-27 Method for detecting lane line fitting based on imaging model constrained non-uniform B-spline curve

Publications (2)

Publication Number Publication Date
CN109583365A CN109583365A (en) 2019-04-05
CN109583365B true CN109583365B (en) 2022-07-26

Family

ID=65924533

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811427546.XA Active CN109583365B (en) 2018-11-27 2018-11-27 Method for detecting lane line fitting based on imaging model constrained non-uniform B-spline curve

Country Status (1)

Country Link
CN (1) CN109583365B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135252A (en) * 2019-04-11 2019-08-16 Changan University A kind of adaptive accurate lane detection and deviation method for early warning for unmanned vehicle
WO2022082571A1 (en) * 2020-10-22 2022-04-28 Huawei Technologies Co., Ltd. Lane line detection method and apparatus
EP4224361A4 (en) * 2020-10-22 2023-08-16 Huawei Technologies Co., Ltd. Lane line detection method and apparatus
CN112818873B (en) * 2021-02-04 2023-05-26 Suzhou Motovis Intelligent Technology Co., Ltd. Lane line detection method and system and electronic equipment
CN113450380A (en) * 2021-07-17 2021-09-28 Pudaditai (Tianjin) Intelligent Equipment Technology Co., Ltd. Track calibration method based on airport runway scribed lines
CN114912159B (en) * 2022-07-18 2022-09-13 China Railway Design Corporation Method for fitting geometric line shape of rail transit line plane
CN115123218B (en) * 2022-09-02 2022-11-22 Xiaomi Automobile Technology Co., Ltd. Vehicle detection method and device and electronic equipment thereof
CN115311314B (en) * 2022-10-13 2023-02-17 Shenzhen Huahan Weiye Technology Co., Ltd. Resampling method, system and storage medium for line laser contour data

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8676494B2 (en) * 2010-09-29 2014-03-18 Navteq B.V. Multi-dimensional road representation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102103747A (en) * 2009-12-16 2011-06-22 Institute of Electronics, Chinese Academy of Sciences Method for calibrating external parameters of monitoring camera by adopting reference height
EP2602744A1 (en) * 2011-12-08 2013-06-12 Delphi Technologies, Inc. Method for detecting and tracking lane markings
CN103177246A (en) * 2013-03-26 2013-06-26 Beijing Institute of Technology Dual-model lane line identification method based on dynamic area division
CN108280450A (en) * 2017-12-29 2018-07-13 Anhui Agricultural University A kind of express highway pavement detection method based on lane line

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Lane line detection and tracking method based on an imaging model; Chen Long et al.; China Journal of Highway and Transport; 2011-11-15; Vol. 24, No. 6; pp. 96-102 *
Research on driving environment perception based on vehicle-road visual cooperation; Mu Kenan; China Excellent Doctoral and Master's Theses Full-text Database (Doctoral); 2017-05-15; No. 5; pp. 25-41 *

Also Published As

Publication number Publication date
CN109583365A (en) 2019-04-05

Similar Documents

Publication Publication Date Title
CN109583365B (en) Method for detecting lane line fitting based on imaging model constrained non-uniform B-spline curve
CN107679520B (en) Lane line visual detection method suitable for complex conditions
CN107045629B (en) Multi-lane line detection method
CN109886896B (en) Blue license plate segmentation and correction method
CN110866924B (en) Line structured light center line extraction method and storage medium
CN110516550B (en) FPGA-based lane line real-time detection method
CN110569704A (en) Multi-strategy self-adaptive lane line detection method based on stereoscopic vision
CN109784344A (en) A kind of non-targeted filtering method of image for ground level mark identification
CN110197153B (en) Automatic wall identification method in house type graph
CN109785291A (en) A kind of lane line self-adapting detecting method
CN111179232A (en) Steel bar size detection system and method based on image processing
CN110414385B (en) Lane line detection method and system based on homography transformation and characteristic window
CN103400150A (en) Method and device for road edge recognition based on mobile platform
CN105719306A (en) Rapid building extraction method from high-resolution remote sensing image
CN106875430B (en) Single moving target tracking method and device based on fixed form under dynamic background
CN110245600B (en) Unmanned aerial vehicle road detection method for self-adaptive initial quick stroke width
CN112991420A (en) Stereo matching feature extraction and post-processing method for disparity map
CN114972575A (en) Linear fitting algorithm based on contour edge
CN111354047A (en) Camera module positioning method and system based on computer vision
CN114037970A (en) Sliding window-based lane line detection method, system, terminal and readable storage medium
CN108492306A (en) A kind of X-type Angular Point Extracting Method based on image outline
CN114612412A (en) Processing method of three-dimensional point cloud data, application of processing method, electronic device and storage medium
CN114241436A (en) Lane line detection method and system for improving color space and search window
CN110223356A (en) A kind of monocular camera full automatic calibration method based on energy growth
CN111428538B (en) Lane line extraction method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant