CN109272536B - Lane line vanishing point tracking method based on Kalman filtering - Google Patents

Lane line vanishing point tracking method based on Kalman filtering

Info

Publication number
CN109272536B
CN109272536B (application CN201811110435.6A)
Authority
CN
China
Prior art keywords
line
sub
vanishing point
point
intersection point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811110435.6A
Other languages
Chinese (zh)
Other versions
CN109272536A (en)
Inventor
陈卫刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Gongshang University
Original Assignee
Zhejiang Gongshang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Gongshang University filed Critical Zhejiang Gongshang University
Priority to CN201811110435.6A priority Critical patent/CN109272536B/en
Publication of CN109272536A publication Critical patent/CN109272536A/en
Application granted granted Critical
Publication of CN109272536B publication Critical patent/CN109272536B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G06T 7/168 Segmentation; Edge detection involving transform domain methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20048 Transform domain processing
    • G06T 2207/20061 Hough transform
    • G06T 2207/20081 Training; Learning
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30256 Lane; Road marking

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a lane line vanishing point tracking method based on Kalman filtering, belonging to the field of image processing. The method takes an image sequence collected by a vehicle-mounted camera as input, detects straight line segments in each image, and uses a machine learning method to eliminate segments that contain no or few lane line blocks, so that the vanishing point is determined only by straight lines along lane line edges. Compared with the prior art, the method reduces the influence of various interfering objects appearing in the field of view on the accuracy of lane line vanishing point estimation.

Description

Lane line vanishing point tracking method based on Kalman filtering
Technical Field
The invention relates to the field of image processing, in particular to a lane line vanishing point tracking method based on Kalman filtering.
Background
Advanced driver assistance systems (ADAS) use various sensors mounted on the vehicle to sense the surrounding environment while driving, collect data, and identify, detect and track static and dynamic objects, enabling the driver to perceive possible danger in advance and thereby effectively increasing the comfort and safety of driving. In the images acquired by the camera of an ADAS system, parallel lines on the road surface converge at one point in the image: the vanishing point. For applications such as lane line detection, lane departure warning, and leading vehicle detection, the position of the vanishing point in the image is very important input information.
Chinese patent 201610492617.9 discloses a vanishing point calibration method based on horizon search, which requires defining a plurality of horizon position templates and determines the position of the horizon by verification; Chinese patent 201710651702.X discloses a method that takes a video as input, constructs a lane line stacked image, searches for maxima in the stacked image, determines the lane lines from the positions of the maxima, and determines the vanishing point from the intersection of the lane lines. Besides lane markings, objects such as vehicles, shadows cast by vehicles, and lane separation guardrails often appear in the field of view observed by the camera of an ADAS system. The intersection points formed by straight lines along the edges of these objects usually deviate considerably from the lane line vanishing point, and estimating the vanishing point from such intersections seriously degrades the accuracy of the estimate.
Disclosure of Invention
The invention provides a lane line vanishing point tracking method based on Kalman filtering. The method takes an image sequence acquired by a vehicle-mounted camera as input, detects straight line segments in the images, and eliminates, by a machine learning method, segments that contain no or few lane line blocks, so that the vanishing point is determined only by straight lines along lane line edges; the lane line vanishing point coordinates are modeled as the time-varying state of a discrete dynamic system, and the vanishing point is tracked with a Kalman filtering algorithm.
The technical scheme adopted by the invention is as follows:
a line vanishing point tracking method based on Kalman filtering comprises the following steps:
step one, detecting edge pixels of an input image, and detecting line segments from the edge pixels with a Hough transform algorithm;
step two, for each detected line segment, taking points on the segment as anchor points, extracting image blocks at multiple scales and offsets, and using a pre-trained classifier to identify whether each extracted image block is a lane line block;
step three, counting the number of anchor points on a segment whose image blocks are identified as lane line blocks; if this number exceeds a preset threshold, adding the segment to a candidate segment set, with a weight equal to the number of points whose image blocks are identified as lane line blocks divided by the length of the segment;
step four, extending each segment in the candidate segment set into a straight line, computing the intersection point of every pair of non-parallel lines, and, taking the two-dimensional coordinates of the intersections as samples, computing the weighted mean and covariance matrix of all samples;
step five, letting the current frame be the t-th frame, estimating the lane line vanishing point with a Kalman filtering algorithm from the weighted mean and covariance of the intersection coordinate samples of the current frame computed in steps one to four and from the lane line vanishing point tracked continuously from frame 0 to frame t-1, and outputting the estimated vanishing point as the tracking result for frame t.
The steps in the above technical scheme can be realized in the following specific manner.
In step two, taking points on the line segment as anchor points, extracting image blocks at multiple scales and offsets, and identifying with a pre-trained classifier whether each extracted image block is a lane line block comprises:
let (X, Y) be a point on the line segment; the image block extracted with this point as anchor is $I(X-\delta_X, Y-\delta_Y, W/s, H/s)$, denoting a rectangular image region with upper-left corner $(X-\delta_X, Y-\delta_Y)$, width $W/s$ and height $H/s$, where W and H are the preset reference window width and height, $\delta_X$ and $\delta_Y$ are the offsets in the horizontal and vertical directions, and s is a preset scale coefficient;
the classifier for identifying whether an extracted image block is a lane line block is a cascade classifier, in which each stage is a strong classifier formed by combining a plurality of weak classifiers;
each weak classifier corresponds to one feature and is computed as

$$h(x, f, p, \theta) = \begin{cases} 1, & p\,f(x) < p\,\theta \\ 0, & \text{otherwise} \end{cases}$$

where x is the image block to be examined, p = ±1 controls the direction of the inequality, θ is a threshold, and f is the feature value function;
in the training process, the weighted misclassification loss of each candidate weak classifier is computed as

$$\varepsilon_t = \min_{f,p,\theta} \sum_i w_i \,\big| h(x_i, f, p, \theta) - y_i \big|$$

where $x_i$ and $y_i$ are a sample and its label, with $y_i = 1$ if $x_i$ is a positive sample and $y_i = 0$ otherwise, and h(·) is the weak classifier, which misclassifies a sample when its output disagrees with the label; the weak classifier with the smallest misclassification loss is selected as the optimal weak classifier, and the selected weak classifiers are combined into a strong classifier.
The feature corresponding to a weak classifier may be a Haar-like feature, computed as follows: take a rectangular region of the image block x to be examined and divide it into 2, 3 or 4 sub-regions of equal size; if divided into 2 sub-regions, arranged side by side or one above the other, the feature value is the difference between the pixel sum of one sub-region and the pixel sum of the other; if divided into 3 sub-regions, arranged left-middle-right or top-middle-bottom, the feature value is the pixel sum of the two outer sub-regions minus the pixel sum of the middle sub-region; if divided into 4 sub-regions, split in half both horizontally and vertically, the feature value is the pixel sum of one diagonal pair of sub-regions minus the pixel sum of the other diagonal pair.
In step four, computing the intersection point of every pair of non-parallel straight lines and, taking the two-dimensional coordinates of the intersections as samples, computing the weighted mean and covariance matrix of all samples comprises:
for any two segments $L_i$ and $L_j$ with weights $\eta_i$ and $\eta_j$, if the straight lines obtained by extending $L_i$ and $L_j$ intersect, the intersection point is assigned the weight $\eta_i\eta_j$.

Let the intersection sample set be

$$\{(X_k, Y_k)\}_{k=1}^{N}$$

where $(X_k, Y_k)$ are the coordinates of the k-th intersection, and let the corresponding weight set be

$$\{\eta_k\}_{k=1}^{N}$$

where $\eta_k$ is the weight of the k-th intersection and N is the total number of intersection samples. The weighted mean of the intersection samples is

$$u = (\bar{X}, \bar{Y})^{T}, \qquad \bar{X} = \frac{\sum_{k=1}^{N}\eta_k X_k}{\sum_{k=1}^{N}\eta_k}, \qquad \bar{Y} = \frac{\sum_{k=1}^{N}\eta_k Y_k}{\sum_{k=1}^{N}\eta_k}$$

The covariance matrix of the samples is

$$\Sigma = \begin{pmatrix} \sigma_{XX} & \sigma_{XY} \\ \sigma_{XY} & \sigma_{YY} \end{pmatrix}, \qquad \sigma_{XX} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{YY} = \frac{\sum_{k=1}^{N}\eta_k (Y_k - \bar{Y})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{XY} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})(Y_k - \bar{Y})}{\sum_{k=1}^{N}\eta_k}$$
In step five, estimating the lane line vanishing point with a Kalman filtering algorithm from the mean and covariance of the intersection coordinate samples of the current frame computed in steps one to four and from the lane line vanishing point tracked continuously from frame 0 to frame t-1 comprises:
First, the lane line vanishing point coordinates are modeled as the time-varying state of a discrete dynamic system. Denoting the vanishing point of the t-th frame by $V_t$, its relation to the vanishing point $V_{t-1}$ at the previous moment is

$$V_t = V_{t-1} + z$$

where z denotes the process noise of the system and follows a normal distribution with mean 0 and covariance matrix Q.

Second, the vanishing point at time t is predicted from the lane line vanishing point $V_{t-1}$ at the previous moment, and the state error covariance matrix $P_t^-$ at time t is predicted from the state error covariance matrix $P_{t-1}$ at the previous moment and the matrix Q:

$$\hat{V}_t^- = V_{t-1}, \qquad P_t^- = P_{t-1} + Q$$

where $\hat{V}_t^-$ denotes the predicted lane line vanishing point at time t and Q denotes the covariance matrix of the system process noise.

The Kalman gain is then computed as

$$K_t = P_t^- \left(P_t^- + \Sigma\right)^{-1}$$

where Σ is the sample covariance matrix described in step four.

Third, the lane line vanishing point is updated as

$$V_t = \hat{V}_t^- + K_t \left(u - \hat{V}_t^-\right)$$

where u is the weighted sample mean described in step four and $V_t$ is the updated lane line vanishing point at time t.

Finally, the state error covariance matrix at time t is updated as

$$P_t = (D - K_t)\, P_t^-$$

where D is the 2 × 2 identity matrix and $P_t$ is the updated state error covariance matrix at time t.
The invention discloses a lane line vanishing point tracking method based on Kalman filtering. It takes an image sequence acquired by a vehicle-mounted camera as input, detects straight line segments in the images, and eliminates, by a machine learning method, segments that contain no or few lane line blocks, so that the vanishing point is determined only by straight lines along lane line edges. Compared with the prior art, the method reduces the influence of various interfering objects appearing in the field of view on the accuracy of lane line vanishing point estimation.
Drawings
FIG. 1 is a schematic flow chart of the lane line vanishing point tracking method based on Kalman filtering according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of Haar-like feature calculation;
FIG. 3 is a schematic diagram of lane line marking and positive sample extraction;
fig. 4 is a schematic flow chart of training a classifier by using the Adaboost algorithm.
Detailed Description
The invention provides a lane line vanishing point tracking method based on Kalman filtering. The method takes an image sequence acquired by a vehicle-mounted camera as input, detects straight line segments in the images, and eliminates, by a machine learning method, segments that contain no or few lane line blocks, so that the vanishing point is determined only by straight lines along lane line edges; the lane line vanishing point coordinates are modeled as the time-varying state of a discrete dynamic system, and the vanishing point is tracked with a Kalman filtering algorithm.
As shown in FIG. 1, the flow of the lane line vanishing point tracking method based on Kalman filtering of the present invention may include the following steps 101-105:
step 101, detecting edge pixels of an input image, and detecting line segments from the edge pixels with a Hough transform algorithm (a minimal code sketch follows below);
step 102, for each detected line segment, taking points on the segment as anchor points, extracting image blocks at multiple scales and offsets, and using a pre-trained classifier to identify whether each extracted image block is a lane line block;
step 103, counting the number of anchor points on a segment whose image blocks are identified as lane line blocks; if this number exceeds a preset threshold, adding the segment to a candidate segment set, with a weight equal to the number of points whose image blocks are identified as lane line blocks divided by the length of the segment;
step 104, extending each segment in the candidate set into a straight line, computing the intersection point of every pair of non-parallel lines, and, taking the two-dimensional coordinates of the intersections as samples, computing the weighted mean and covariance matrix of all samples;
step 105, letting the current frame be the t-th frame, estimating the lane line vanishing point with a Kalman filtering algorithm from the weighted mean and covariance of the intersection coordinate samples of frame t computed in steps 101 to 104 and from the lane line vanishing point tracked continuously from frame 0 to frame t-1, and outputting the estimated vanishing point as the result for frame t.
The specific implementation of the above steps in this embodiment will be described below with reference to the drawings.
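Step 101 uses standard edge detection followed by a Hough transform. The patent does not name specific operators or parameter values, so the following minimal sketch, built on OpenCV's Canny detector and probabilistic Hough transform, is only an illustrative assumption:

```python
import math
import cv2

def detect_line_segments(image_bgr):
    """Step 101: detect edge pixels, then line segments via Hough transform."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge pixels (thresholds are assumptions)
    segments = cv2.HoughLinesP(edges, rho=1, theta=math.pi / 180,
                               threshold=50, minLineLength=30, maxLineGap=5)
    # Each entry is (x1, y1, x2, y2); return an empty list when nothing is found.
    return [tuple(s[0]) for s in segments] if segments is not None else []
```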
In step 102, for each detected line segment, points on the segment are taken as anchor points, image blocks are extracted at multiple scales and offsets, and a pre-trained classifier identifies whether each extracted image block is a lane line block. Specifically, let (X, Y) be a point on the line segment; the image block extracted with this point as anchor is $I(X-\delta_X, Y-\delta_Y, W/s, H/s)$, denoting a rectangular image region with upper-left corner $(X-\delta_X, Y-\delta_Y)$, width $W/s$ and height $H/s$, where W and H are the preset reference window width and height, $\delta_X$ and $\delta_Y$ are the offsets in the horizontal and vertical directions, and s is a preset scale coefficient. One embodiment of the invention takes W = 24, H = 10, $\delta_X, \delta_Y \in \{-8, -4, 0, +4, +8\}$ and $s \in \{0.8, 0.9, 1.0, 1.1, 1.25\}$.
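As an illustration of this block extraction scheme, the sketch below enumerates the image blocks of one anchor point under the embodiment's parameters and rescales each to the 24 × 10 reference window; rounding and image border handling are assumptions the patent leaves open:

```python
import cv2

# Parameters from the embodiment: 24 x 10 reference window,
# offsets in {-8, -4, 0, +4, +8}, scale coefficients in {0.8, ..., 1.25}.
W, H = 24, 10
OFFSETS = (-8, -4, 0, 4, 8)
SCALES = (0.8, 0.9, 1.0, 1.1, 1.25)

def extract_blocks(image, x, y):
    """Yield the blocks I(x - dx, y - dy, W/s, H/s) around anchor (x, y),
    each rescaled to the W x H reference window expected by the classifier."""
    rows, cols = image.shape[:2]
    for s in SCALES:
        w, h = int(round(W / s)), int(round(H / s))
        for dx in OFFSETS:
            for dy in OFFSETS:
                x0, y0 = x - dx, y - dy  # upper-left corner of the block
                if x0 < 0 or y0 < 0 or x0 + w > cols or y0 + h > rows:
                    continue             # skip blocks falling outside the image
                yield cv2.resize(image[y0:y0 + h, x0:x0 + w], (W, H))
```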
In step 102, a pre-trained classifier is used to identify whether an extracted image block is a lane line block. The classifier is a cascade classifier, each stage of which is a strong classifier formed by combining a plurality of weak classifiers. Each weak classifier corresponds to one feature and is computed as

$$h(x, f, p, \theta) = \begin{cases} 1, & p\,f(x) < p\,\theta \\ 0, & \text{otherwise} \end{cases} \tag{1}$$

where x is the image block to be examined, p = ±1 controls the direction of the inequality, θ is a threshold, and f is the feature value function. The embodiment of the invention adopts Haar-like features (see FIG. 2), computed as follows: take a rectangular region of x and divide it into 2, 3 or 4 sub-regions of equal size. If divided into 2 sub-regions, arranged side by side or one above the other, the feature value is the pixel sum of the white sub-region minus the pixel sum of the black sub-region. If divided into 3 sub-regions, arranged left-middle-right or top-middle-bottom, the feature value is the pixel sum of the two outer white sub-regions minus the pixel sum of the middle black sub-region. If divided into 4 sub-regions, split in half both horizontally and vertically, the feature value is the pixel sum of one diagonal pair of sub-regions minus the pixel sum of the other diagonal pair.
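The rectangle sums behind these features are conventionally computed from an integral image, so that each sum costs four array lookups. The sketch below is an assumed implementation of the three feature types, which the patent defines only pictorially in FIG. 2:

```python
import numpy as np

def integral_image(block):
    """Summed-area table with a zero row and column prepended."""
    return np.pad(block.astype(np.int64), ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def rect_sum(ii, x, y, w, h):
    """Pixel sum over the w x h rectangle with upper-left corner (x, y)."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def haar_two(ii, x, y, w, h):
    """Two side-by-side sub-regions: left sum minus right sum."""
    return rect_sum(ii, x, y, w, h) - rect_sum(ii, x + w, y, w, h)

def haar_three(ii, x, y, w, h):
    """Three sub-regions left-middle-right: outer sums minus middle sum."""
    return (rect_sum(ii, x, y, w, h) + rect_sum(ii, x + 2 * w, y, w, h)
            - rect_sum(ii, x + w, y, w, h))

def haar_four(ii, x, y, w, h):
    """Four sub-regions: one diagonal pair minus the other."""
    return (rect_sum(ii, x, y, w, h) + rect_sum(ii, x + w, y + h, w, h)
            - rect_sum(ii, x + w, y, w, h) - rect_sum(ii, x, y + h, w, h))
```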
Referring to FIG. 3, a positive sample is a rectangular image block of a certain width and height; L in the figure is a horizontal straight line whose intersection points with the edges of a lane line are A and B, the length of segment AB is w, and the lane line marking roughly occupies the middle of the image block. Negative samples are road surface image regions without lane markings. Both positive and negative samples are scaled to the preset size.
In the embodiment of the present invention, the strong classifier is trained with the Adaboost algorithm; referring to FIG. 4, training may include the following steps:
step 401, initializing the sample weights: each positive sample is given weight $1/(2N_p)$ and each negative sample weight $1/(2N_f)$, where $N_p$ is the number of positive samples and $N_f$ is the number of negative samples; the strong classifier to be trained is initialized to contain 0 weak classifiers;
step 402, iterating t from 1 to T, where T is the maximum number of weak classifiers the strong classifier is allowed to contain; one weak classifier is selected in each iteration;
step 403, selecting the optimal weak classifier: first, the weighted misclassification loss of each candidate weak classifier is computed as

$$\varepsilon_t = \min_{f,p,\theta} \sum_i w_i \,\big| h(x_i, f, p, \theta) - y_i \big| \tag{2}$$

where $x_i$ and $y_i$ are a sample and its label, with $y_i = 1$ if $x_i$ is a positive sample and $y_i = 0$ otherwise, and h(·) is the weak classifier of formula (1), which misclassifies a sample when its output disagrees with the label; then the weak classifier with the smallest misclassification loss is selected as the optimal weak classifier and denoted $h_t$;
Step 404, if εtIf the value is more than 0.5, the iteration is ended; otherwise, go to step 405;
step 405, updating the weight of each sample according to

$$w_{t+1,i} = w_{t,i}\, \beta_t^{1 - e_i} \tag{3}$$

where $e_i = 0$ if sample $x_i$ is correctly classified and $e_i = 1$ otherwise, and

$$\beta_t = \frac{\varepsilon_t}{1 - \varepsilon_t}$$
Step 406, calculating the weight of the current weak classifier in the strong classifier according to the following formula,
Figure BDA0001808950100000073
and combining each weak classifier according to the weight of the weak classifier to form a strong classifier:
F(x)=sign(∑tαtht(x)) (5)
step 407, classifying the test samples with the current strong classifier; if the classifier reaches the expected target, ending the iteration and outputting the strong classifier of formula (5); otherwise, going to step 402.
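Steps 401 to 407 can be summarized in code. The sketch below is a simplified single-stage version under assumed data structures: each candidate weak classifier is a (feature, p, θ) triple evaluated through formula (1), with feature values precomputed per sample; a production cascade would repeat this per stage:

```python
import numpy as np

def adaboost_train(feature_values, labels, T):
    """Select up to T weak classifiers following steps 401-407.

    feature_values: (num_features, num_samples) array of f(x_i) values.
    labels: array of 0/1 labels y_i.
    Returns a list of (feature_index, p, theta, alpha) tuples.
    """
    n_pos = int(labels.sum())
    n_neg = labels.size - n_pos
    # Step 401: positives weighted 1/(2*Np), negatives 1/(2*Nf).
    w = np.where(labels == 1, 1.0 / (2 * n_pos), 1.0 / (2 * n_neg))
    strong = []
    for _ in range(T):                                # step 402
        w = w / w.sum()
        best = None                                   # (loss, j, p, theta, pred)
        for j, f in enumerate(feature_values):        # step 403: search f, p, theta
            for theta in np.unique(f):
                for p in (+1, -1):
                    pred = (p * f < p * theta).astype(int)           # formula (1)
                    loss = float(np.sum(w * np.abs(pred - labels)))  # formula (2)
                    if best is None or loss < best[0]:
                        best = (loss, j, p, theta, pred)
        eps, j, p, theta, pred = best
        if eps > 0.5:                                 # step 404
            break
        beta = eps / (1 - eps)
        e = (pred != labels).astype(int)              # e_i = 0 iff correctly classified
        w = w * beta ** (1 - e)                       # step 405, formula (3)
        strong.append((j, p, theta, np.log(1 / beta)))  # step 406, formula (4)
    return strong
```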
In step 104, each segment in the candidate segment set is extended into a straight line; for the non-parallel lines, the intersection point of every pair of lines is computed and, taking the two-dimensional coordinates of the intersections as samples, the weighted mean and covariance matrix of all samples are computed, specifically:
for any two segments $L_i$ and $L_j$ with weights $\eta_i$ and $\eta_j$, if the straight lines obtained by extending $L_i$ and $L_j$ intersect, the intersection point is assigned the weight $\eta_i\eta_j$.

Let the intersection sample set be

$$\{(X_k, Y_k)\}_{k=1}^{N}$$

where $(X_k, Y_k)$ are the coordinates of the k-th intersection, and let the corresponding weight set be

$$\{\eta_k\}_{k=1}^{N}$$

where $\eta_k$ is the weight of the k-th intersection and N is the total number of intersection samples. The weighted mean of the intersection samples is

$$u = (\bar{X}, \bar{Y})^{T}, \qquad \bar{X} = \frac{\sum_{k=1}^{N}\eta_k X_k}{\sum_{k=1}^{N}\eta_k}, \qquad \bar{Y} = \frac{\sum_{k=1}^{N}\eta_k Y_k}{\sum_{k=1}^{N}\eta_k}$$

The covariance matrix of the samples is

$$\Sigma = \begin{pmatrix} \sigma_{XX} & \sigma_{XY} \\ \sigma_{XY} & \sigma_{YY} \end{pmatrix}, \qquad \sigma_{XX} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{YY} = \frac{\sum_{k=1}^{N}\eta_k (Y_k - \bar{Y})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{XY} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})(Y_k - \bar{Y})}{\sum_{k=1}^{N}\eta_k}$$
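Under the reconstruction above, step 104 reduces to pairwise line intersection followed by weighted first and second moments. A minimal sketch, assuming segments arrive as endpoint pairs together with their step-103 weights:

```python
import numpy as np

def weighted_intersections(segments, weights):
    """Step 104: intersect all pairs of extended segments; return the weighted
    mean u and covariance matrix Sigma of the intersection coordinates.

    segments: list of ((x1, y1), (x2, y2)) endpoint pairs.
    weights:  list of segment weights eta_i from step 103.
    """
    pts, etas = [], []
    for i in range(len(segments)):
        for j in range(i + 1, len(segments)):
            (x1, y1), (x2, y2) = segments[i]
            (x3, y3), (x4, y4) = segments[j]
            den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
            if abs(den) < 1e-9:
                continue                     # parallel lines: no intersection
            a = x1 * y2 - y1 * x2
            b = x3 * y4 - y3 * x4
            pts.append(((a * (x3 - x4) - (x1 - x2) * b) / den,
                        (a * (y3 - y4) - (y1 - y2) * b) / den))
            etas.append(weights[i] * weights[j])  # intersection weight eta_i * eta_j
    pts = np.asarray(pts, dtype=float)
    etas = np.asarray(etas, dtype=float)
    u = etas @ pts / etas.sum()                   # weighted mean (X_bar, Y_bar)
    dev = pts - u
    sigma = (etas[:, None] * dev).T @ dev / etas.sum()  # weighted 2x2 covariance
    return u, sigma
```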
In step 105, estimating the lane line vanishing point with a Kalman filtering algorithm from the mean and covariance of the intersection coordinate samples of frame t computed in steps 101 to 104 and from the lane line vanishing point tracked continuously from frame 0 to frame t-1 may specifically include:
First, the lane line vanishing point coordinates are modeled as the time-varying state of a discrete dynamic system. Denoting the vanishing point of the t-th frame by $V_t$, its relation to the vanishing point $V_{t-1}$ at the previous moment is

$$V_t = V_{t-1} + z \tag{6}$$

where z denotes the process noise of the system and follows a normal distribution with mean 0 and covariance matrix Q.

Second, the vanishing point at time t is predicted from the lane line vanishing point $V_{t-1}$ at the previous moment, and the state error covariance matrix $P_t^-$ at time t is predicted from the state error covariance matrix $P_{t-1}$ at the previous moment and the matrix Q:

$$\hat{V}_t^- = V_{t-1} \tag{7}$$

$$P_t^- = P_{t-1} + Q \tag{8}$$

where $\hat{V}_t^-$ denotes the predicted lane line vanishing point at time t and Q denotes the covariance matrix of the system process noise.

The Kalman gain is computed as

$$K_t = P_t^- \left(P_t^- + \Sigma\right)^{-1} \tag{9}$$

where Σ is the sample covariance matrix of step 104.

Third, the lane line vanishing point is updated as

$$V_t = \hat{V}_t^- + K_t \left(u - \hat{V}_t^-\right) \tag{10}$$

where u is the weighted sample mean described in step 104 and $V_t$ is the updated lane line vanishing point at time t.

Finally, the state error covariance matrix at time t is updated as

$$P_t = (D - K_t)\, P_t^- \tag{11}$$

where D is the 2 × 2 identity matrix.
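The per-frame predict/update cycle of equations (6) to (11) is compact enough to state directly. The sketch below assumes u and Σ come from the step-104 routine, and that Q is a small, manually chosen process noise covariance, since the patent does not specify how Q is set:

```python
import numpy as np

class VanishingPointKalman:
    """Tracks the 2-D lane line vanishing point per equations (6)-(11)."""

    def __init__(self, v0, q=1.0):
        self.v = np.asarray(v0, dtype=float)  # state: vanishing point (x, y)
        self.p = np.eye(2)                    # state error covariance P
        self.q = q * np.eye(2)                # process noise covariance Q (assumed)

    def update(self, u, sigma):
        """u, sigma: weighted mean and covariance of this frame's intersections."""
        v_pred = self.v                                   # (7): predicted point
        p_pred = self.p + self.q                          # (8): predicted covariance
        k = p_pred @ np.linalg.inv(p_pred + sigma)        # (9): Kalman gain
        self.v = v_pred + k @ (np.asarray(u) - v_pred)    # (10): state update
        self.p = (np.eye(2) - k) @ p_pred                 # (11): covariance update
        return self.v
```

Per frame, steps 101 to 104 produce (u, Σ), and tracker.update(u, sigma) then returns the tracked vanishing point for frame t.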
Through the above processing flow, the lane line vanishing point can be accurately determined from the straight lines along the lane line edges.
The above description covers only preferred embodiments of the present invention; the scope of the present invention is not limited thereto, and any modification or replacement within the spirit and principle of the present invention shall fall within the scope of the present invention.

Claims (4)

1. A lane line vanishing point tracking method based on Kalman filtering, characterized by comprising the following steps:
step one, detecting edge pixels of an input image, and detecting line segments from the edge pixels with a Hough transform algorithm;
step two, for each detected line segment, taking points on the segment as anchor points, extracting image blocks at multiple scales and offsets, and using a pre-trained classifier to identify whether each extracted image block is a lane line block;
step three, counting the number of anchor points on a segment whose image blocks are identified as lane line blocks; if this number exceeds a preset threshold, adding the segment to a candidate segment set, with a weight equal to the number of points whose image blocks are identified as lane line blocks divided by the length of the segment;
step four, extending each segment in the candidate segment set into a straight line, computing the intersection point of every pair of non-parallel lines, and, taking the two-dimensional coordinates of the intersections as samples, computing the weighted mean and covariance matrix of all samples;
wherein, for the non-parallel straight lines, computing the intersection point of every pair of lines and, taking the two-dimensional coordinates of the intersections as samples, computing the weighted mean and covariance matrix of all samples comprises:
for any two segments $L_i$ and $L_j$ with weights $\eta_i$ and $\eta_j$, if the straight lines obtained by extending $L_i$ and $L_j$ intersect, the intersection point is assigned the weight $\eta_i\eta_j$;

letting the intersection sample set be

$$\{(X_k, Y_k)\}_{k=1}^{N}$$

where $(X_k, Y_k)$ are the coordinates of the k-th intersection, and the corresponding weight set be

$$\{\eta_k\}_{k=1}^{N}$$

where $\eta_k$ is the weight of the k-th intersection and N is the total number of intersection samples, the weighted mean of the intersection samples is

$$u = (\bar{X}, \bar{Y})^{T}, \qquad \bar{X} = \frac{\sum_{k=1}^{N}\eta_k X_k}{\sum_{k=1}^{N}\eta_k}, \qquad \bar{Y} = \frac{\sum_{k=1}^{N}\eta_k Y_k}{\sum_{k=1}^{N}\eta_k}$$

and the covariance matrix of the samples is

$$\Sigma = \begin{pmatrix} \sigma_{XX} & \sigma_{XY} \\ \sigma_{XY} & \sigma_{YY} \end{pmatrix}, \qquad \sigma_{XX} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{YY} = \frac{\sum_{k=1}^{N}\eta_k (Y_k - \bar{Y})^2}{\sum_{k=1}^{N}\eta_k}, \qquad \sigma_{XY} = \frac{\sum_{k=1}^{N}\eta_k (X_k - \bar{X})(Y_k - \bar{Y})}{\sum_{k=1}^{N}\eta_k}$$
step five, letting the current frame be the t-th frame, estimating the lane line vanishing point with a Kalman filtering algorithm from the weighted mean and covariance of the intersection coordinate samples of the current frame computed in steps one to four and from the lane line vanishing point tracked continuously from frame 0 to frame t-1, and outputting the estimated vanishing point as the tracking result for frame t.
2. The Kalman filtering-based lane line vanishing point tracking method according to claim 1, wherein in step two, taking points on the line segment as anchor points, extracting image blocks at multiple scales and offsets, and identifying with a pre-trained classifier whether each extracted image block is a lane line block comprises:
let (X, Y) be a point on the line segment; the image block extracted with this point as anchor is $I(X-\delta_X, Y-\delta_Y, W/s, H/s)$, denoting a rectangular image region with upper-left corner $(X-\delta_X, Y-\delta_Y)$, width $W/s$ and height $H/s$, where W and H are the preset reference window width and height, $\delta_X$ and $\delta_Y$ are the offsets in the horizontal and vertical directions, and s is a preset scale coefficient;
the classifier for identifying whether an extracted image block is a lane line block is a cascade classifier, in which each stage is a strong classifier formed by combining a plurality of weak classifiers;
each weak classifier corresponds to one feature and is computed as

$$h(x, f, p, \theta) = \begin{cases} 1, & p\,f(x) < p\,\theta \\ 0, & \text{otherwise} \end{cases}$$

where x is the image block to be examined, p = ±1 controls the direction of the inequality, θ is a threshold, and f is the feature value function;
in the training process, the weighted misclassification loss of each candidate weak classifier is computed as

$$\varepsilon_t = \min_{f,p,\theta} \sum_i w_i \,\big| h(x_i, f, p, \theta) - y_i \big|$$

where $x_i$ and $y_i$ are a sample and its label, with $y_i = 1$ if $x_i$ is a positive sample and $y_i = 0$ otherwise, and $w_i$ is the weight of sample i; the weak classifier with the smallest misclassification loss value is selected as the optimal weak classifier, and the selected weak classifiers are combined into a strong classifier.
3. The Kalman filtering-based lane line vanishing point tracking method according to claim 2, wherein the feature corresponding to a weak classifier is a Haar-like feature, computed as follows: take a rectangular region of the image block x to be examined and divide it into 2, 3 or 4 sub-regions of equal size; if divided into 2 sub-regions, arranged side by side or one above the other, the feature value is the difference between the pixel sum of one sub-region and the pixel sum of the other; if divided into 3 sub-regions, arranged left-middle-right or top-middle-bottom, the feature value is the pixel sum of the two outer sub-regions minus the pixel sum of the middle sub-region; if divided into 4 sub-regions, split in half both horizontally and vertically, the feature value is the pixel sum of one diagonal pair of sub-regions minus the pixel sum of the other diagonal pair.
4. The Kalman filtering-based lane line vanishing point tracking method according to claim 1, wherein in step five, estimating the lane line vanishing point with a Kalman filtering algorithm from the mean and covariance of the intersection coordinate samples of the current frame computed in steps one to four and from the lane line vanishing point tracked continuously from frame 0 to frame t-1 comprises:
firstly, the lane line vanishing point coordinates are modeled as the time-varying state of a discrete dynamic system; denoting the vanishing point of the t-th frame by $V_t$, its relation to the vanishing point $V_{t-1}$ at the previous moment is

$$V_t = V_{t-1} + z$$

where z denotes the process noise of the system and follows a normal distribution with mean 0 and covariance matrix Q;

secondly, the vanishing point at time t is predicted from the lane line vanishing point $V_{t-1}$ at the previous moment, and the state error covariance matrix $P_t^-$ at time t is predicted from the state error covariance matrix $P_{t-1}$ at the previous moment and the matrix Q:

$$\hat{V}_t^- = V_{t-1}, \qquad P_t^- = P_{t-1} + Q$$

where $\hat{V}_t^-$ denotes the predicted lane line vanishing point at time t and Q denotes the covariance matrix of the system process noise;

the Kalman gain is computed as

$$K_t = P_t^- \left(P_t^- + \Sigma\right)^{-1}$$

where Σ is the sample covariance matrix described in step four;

thirdly, the lane line vanishing point is updated as

$$V_t = \hat{V}_t^- + K_t \left(u - \hat{V}_t^-\right)$$

where u is the weighted mean of the samples described in step four and $V_t$ is the updated lane line vanishing point at time t;

finally, the state error covariance matrix at time t is updated as

$$P_t = (D - K_t)\, P_t^-$$

where D is the 2 × 2 identity matrix and $P_t$ is the updated state error covariance matrix at time t.
CN201811110435.6A 2018-09-21 2018-09-21 Lane line vanishing point tracking method based on Kalman filtering Active CN109272536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811110435.6A CN109272536B (en) 2018-09-21 2018-09-21 Lane line vanishing point tracking method based on Kalman filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811110435.6A CN109272536B (en) 2018-09-21 2018-09-21 Lane line vanishing point tracking method based on Kalman filtering

Publications (2)

Publication Number Publication Date
CN109272536A CN109272536A (en) 2019-01-25
CN109272536B true CN109272536B (en) 2021-11-09

Family

ID=65198756

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811110435.6A Active CN109272536B (en) 2018-09-21 2018-09-21 Lane line vanishing point tracking method based on Kalman filtering

Country Status (1)

Country Link
CN (1) CN109272536B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11373063B2 (en) * 2018-12-10 2022-06-28 International Business Machines Corporation System and method for staged ensemble classification
CN111968038B (en) * 2020-10-23 2021-01-12 网御安全技术(深圳)有限公司 Method and system for rapidly searching vanishing points in image

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366156A (en) * 2012-04-09 2013-10-23 通用汽车环球科技运作有限责任公司 Road structure detection and tracking
CN103839264A (en) * 2014-02-25 2014-06-04 中国科学院自动化研究所 Detection method of lane line
CN104318258A (en) * 2014-09-29 2015-01-28 南京邮电大学 Time domain fuzzy and kalman filter-based lane detection method
CN106228125A (en) * 2016-07-15 2016-12-14 浙江工商大学 Method for detecting lane lines based on integrated study cascade classifier
CN106529415A (en) * 2016-10-16 2017-03-22 北海益生源农贸有限责任公司 Characteristic and model combined road detection method
CN106529443A (en) * 2016-11-03 2017-03-22 温州大学 Method for improving detection of lane based on Hough transform
CN106682586A (en) * 2016-12-03 2017-05-17 北京联合大学 Method for real-time lane line detection based on vision under complex lighting conditions
CN107316331A (en) * 2017-08-02 2017-11-03 浙江工商大学 For the vanishing point automatic calibration method of road image
CN107796373A (en) * 2017-10-09 2018-03-13 长安大学 A kind of distance-finding method of the front vehicles monocular vision based on track plane geometry model-driven

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9996752B2 (en) * 2016-08-30 2018-06-12 Canon Kabushiki Kaisha Method, system and apparatus for processing an image

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103366156A (en) * 2012-04-09 2013-10-23 通用汽车环球科技运作有限责任公司 Road structure detection and tracking
CN103839264A (en) * 2014-02-25 2014-06-04 中国科学院自动化研究所 Detection method of lane line
CN104318258A (en) * 2014-09-29 2015-01-28 南京邮电大学 Time domain fuzzy and kalman filter-based lane detection method
CN106228125A (en) * 2016-07-15 2016-12-14 浙江工商大学 Method for detecting lane lines based on integrated study cascade classifier
CN106529415A (en) * 2016-10-16 2017-03-22 北海益生源农贸有限责任公司 Characteristic and model combined road detection method
CN106529443A (en) * 2016-11-03 2017-03-22 温州大学 Method for improving detection of lane based on Hough transform
CN106682586A (en) * 2016-12-03 2017-05-17 北京联合大学 Method for real-time lane line detection based on vision under complex lighting conditions
CN107316331A (en) * 2017-08-02 2017-11-03 浙江工商大学 For the vanishing point automatic calibration method of road image
CN107796373A (en) * 2017-10-09 2018-03-13 长安大学 A kind of distance-finding method of the front vehicles monocular vision based on track plane geometry model-driven

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
Fast and Robust Vanishing Point Detection for Unstructured Road Following; Jinjin Shi et al.; IEEE Transactions on Intelligent Transportation Systems; 2016-04-30; Vol. 17, No. 4; pp. 970-979 *
Real-time Rear of Vehicle Detection from a Moving Camera; Qing Xu et al.; Proceedings of the 2014 CCDC; 2014-07-14; pp. 4575-4578 *
单目视觉结构化道路车道线检测和跟踪技术研究 (Research on monocular-vision lane line detection and tracking for structured roads); 付永春; China Master's Theses Full-text Database, Information Science and Technology; 2012-07-15; Vol. 2012, No. 7; p. I138-2212 *
基于Android平台的车道线检测技术的实现 (Implementation of lane line detection on the Android platform); 陈茜; China Master's Theses Full-text Database, Information Science and Technology; 2013-12-15; Vol. 2013, No. S2; p. I138-1253 *
基于机器视觉的行车安全预警系统研究与实现 (Research and implementation of a machine-vision-based driving safety warning system); 黄惠迪; China Master's Theses Full-text Database, Information Science and Technology; 2015-07-15; Vol. 2015, No. 7; p. I138-1153 *
基于计算机视觉的前方车辆检测及测距研究 (Research on computer-vision-based forward vehicle detection and ranging); 李佳旺; China Master's Theses Full-text Database, Engineering Science and Technology II; 2018-06-15; Vol. 2018, No. 6; p. C035-92 *

Also Published As

Publication number Publication date
CN109272536A (en) 2019-01-25

Similar Documents

Publication Publication Date Title
Marzougui et al. A lane tracking method based on progressive probabilistic Hough transform
Yang et al. Robust superpixel tracking
Keller et al. The benefits of dense stereo for pedestrian detection
Xu et al. Detection of sudden pedestrian crossings for driving assistance systems
Alon et al. Off-road path following using region classification and geometric projection constraints
Kim et al. A Novel On-Road Vehicle Detection Method Using π HOG
US9898677B1 (en) Object-level grouping and identification for tracking objects in a video
Shi et al. Fast and robust vanishing point detection for unstructured road following
CN101344922B (en) Human face detection method and device
JP5604256B2 (en) Human motion detection device and program thereof
US20170032676A1 (en) System for detecting pedestrians by fusing color and depth information
CN102598057A (en) Method and system for automatic object detection and subsequent object tracking in accordance with the object shape
Tian et al. A two-stage character segmentation method for Chinese license plate
CN109284664B (en) Driver assistance system and guardrail detection method
CN110458158B (en) Text detection and identification method for assisting reading of blind people
Morris et al. Improved vehicle classification in long traffic video by cooperating tracker and classifier modules
CN115240130A (en) Pedestrian multi-target tracking method and device and computer readable storage medium
CN109272536B (en) Lane line vanishing point tracking method based on Kalman filtering
Spinello et al. Multimodal People Detection and Tracking in Crowded Scenes.
CN111832349A (en) Method and device for identifying error detection of carry-over object and image processing equipment
Ho et al. Intelligent speed bump system with dynamic license plate recognition
Lee et al. A cumulative distribution function of edge direction for road-lane detection
He et al. Segmentation of characters on car license plates
CN110827319B (en) Improved Staple target tracking method based on local sensitive histogram
CN108830182B (en) Lane line detection method based on cascade convolution neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant