CN107832674A - A lane line detection method
Publication number: CN107832674A (application CN201710957864.6A)
Authority: CN (China)
Legal status: Granted (the legal status is an assumption and is not a legal conclusion)
Classifications
- G: Physics
- G06: Computing; Calculating or Counting
- G06V: Image or Video Recognition or Understanding
- G06V20/00: Scenes; scene-specific elements
- G06V20/50: Context or environment of the image
- G06V20/56: Context or environment of the image exterior to a vehicle, using sensors mounted on the vehicle
- G06V20/588: Recognition of the road, e.g. of lane markings; recognition of the vehicle driving pattern in relation to the road
Abstract
The present invention relates to a lane line detection method, comprising: acquiring an image of the scene ahead of the vehicle and converting it to a grayscale image; coarsely locating the lane lines using the grayscale information of the image and recording the coarse positioning points; refining the coarse positioning points and retaining the fine points; classifying the fine points to obtain multiple straight-line classes; and obtaining the lane lines from the multiple straight-line classes. The technical solution down-samples the whole image by dividing it into longitudinal strips of blocks and, when detecting edges, exploits the fact that lane lines are arranged longitudinally and change sharply in the horizontal direction within each strip to determine candidate lane edge points, which are then processed further. This reduces the amount of computation, excludes interference points, and improves both the real-time performance of the algorithm and the reliability of the detected lanes.
Description
Technical field
The invention belongs to the technical field of image processing, and in particular relates to a lane line detection method.
Background art
With the rapid development of driver assistance and autonomous vehicle technology, whether a machine vision sensor can accurately acquire the markings, signs, and lane line information around the vehicle has become a crucial part of a driver assistance system. Real-time lane detection and warning keep the vehicle in its own lane, and play an important role in lane departure warning, lane keeping, and similar functions.
At present, lane detection research mainly models the lane as a straight line, a conic curve, or a piecewise-fitted curve. Under the straight-line model, most methods first detect the lane edges with an edge search method and then recognize the lane with the Hough transform. Edge search commonly uses the Canny edge detector, wavelet transforms, or the Sobel operator, but these algorithms also detect the edge information of many non-lane objects that are not arranged longitudinally. This interferes with the subsequent Hough transform and wastes a great deal of CPU computing resources, so the algorithm is slow and cannot be used on embedded hardware with limited general-purpose computing power.
Meanwhile, a line detection method based on the Hough transform must map every candidate straight line in image space into a description in parameter space, accumulate votes for the points that may lie on a straight boundary, and finally determine the probability that those points belong to a given line. The resulting computational load and poor real-time performance are a major bottleneck of Hough-based lane detection. Moreover, most on-board systems are embedded systems, and Hough-based lane detection methods are difficult to apply widely in them.
To address the time cost of the Hough transform, most processing methods restrict the image region in which the lane may appear, for example by discarding a certain range on the left and right of the image and a certain amount of background at its top, and treating the remaining region as the region of interest (ROI); subsequent processing then performs lane detection only within the ROI. However, this approach does not fundamentally solve the time-consumption problem, so the characteristic information of the lane must be explored further, making full use of the differences between lane and non-lane objects to select the lane edges. Furthermore, because this approach removes background blocks from a global point of view, it does not inherently reduce the interference of non-lane information, and it does not apply at all when the image captured by the sensor contains no large background region in the first place.
Therefore, devising a lane line detection method that is not limited by image content, has a small computational load, and can run on an embedded system is a hot research direction for those skilled in the art.
Summary of the invention
In view of the above problems, the present invention proposes a lane line detection method; specific embodiments are as follows.
A lane line detection method provided in an embodiment of the present invention comprises:
Step 1: acquire an image of the scene ahead of the vehicle and convert it to a grayscale image;
Step 2: coarsely locate the lane lines using the grayscale information of the image, and record the coarse positioning points;
Step 3: refine the coarse positioning points and retain the fine points;
Step 4: classify the fine points to obtain multiple straight-line classes;
Step 5: obtain the lane lines from the multiple straight-line classes.
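The five steps above can be sketched as a simple pipeline. This is a minimal illustration only: the helper names (to_grayscale, coarse_locate, refine, classify, fit_lane) are hypothetical placeholders, not identifiers from the patent.

```python
# Minimal sketch of the five-step method; each step is supplied as a
# callable so the pipeline mirrors Steps 1-5 one-to-one.
def detect_lane_lines(frame, pipeline):
    gray = pipeline["to_grayscale"](frame)    # Step 1: grayscale conversion
    coarse = pipeline["coarse_locate"](gray)  # Step 2: coarse positioning
    fine = pipeline["refine"](gray, coarse)   # Step 3: fine positioning
    classes = pipeline["classify"](fine)      # Step 4: straight-line classes
    return pipeline["fit_lane"](classes)      # Step 5: lane line output
```

Each stage consumes only the previous stage's output, which is what lets the later sections of the description treat coarse positioning, fine positioning, and classification independently.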
In one embodiment of the invention, step 2 comprises:
Step 21: using the grayscale information of the image, divide the image into multiple strips, and divide each strip into multiple pixel blocks;
Step 22: sum the grayscale pixel values of each pixel block to obtain the Sum value of each block;
Step 23: obtain the gradient data of each strip from the Sum values of its blocks;
Step 24: search the gradient data of each strip for maximum and minimum points;
Step 25: record the maximum and minimum points as the coarse positioning points.
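Steps 21 to 23 can be sketched as follows, assuming the image is given as rows of grayscale values. The function names are illustrative; the gradient uses the [-1, 0, 1] template exactly as in formula (1) of the detailed description below.

```python
# Sketch of Steps 21-23: a strip is a list of h pixel rows; each strip is
# cut into blocks w pixels wide, the grayscale values of each block are
# summed, and a horizontal gradient is taken over the block sums.

def block_sums(strip, w):
    """Sum the grayscale values of each w-pixel-wide block of a strip."""
    width = len(strip[0])
    sums = []
    for bx in range(width // w):
        s = sum(row[x] for row in strip for x in range(bx * w, (bx + 1) * w))
        sums.append(s)
    return sums

def horizontal_gradient(sums):
    """Formula (1): Diff[i] = Sum[i+1] - Sum[i-1], template [-1, 0, 1]."""
    return [sums[i + 1] - sums[i - 1] for i in range(1, len(sums) - 1)]
```

Because the gradient is taken over block sums rather than individual pixels, each strip of a 720-pixel-wide image reduces to only W/w values before any edge search runs.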
In one embodiment of the invention, step 3 comprises:
Step 31: choose a coarse positioning point Pi(x, y); centered on Pi(x, y), take M pixel rows above and below it, and xOffset pixels to its left and right;
Step 32: perform a convolution over the coarse positioning point's neighborhood in each pixel row to obtain the pixel extreme point of each row;
Step 33: average the abscissas of the pixel extreme points in the upper M pixel rows to obtain X1, and average the abscissas of the pixel extreme points in the lower M pixel rows to obtain X2;
Step 34: for each pixel extreme point in the upper M pixel rows, judge whether the absolute value of the difference between its abscissa and the average X1 exceeds a preset value; if so, discard the pixel extreme point, otherwise retain it;
Step 35: for each pixel extreme point in the lower M pixel rows, judge whether the absolute value of the difference between its abscissa and the average X2 exceeds a preset value; if so, discard the pixel extreme point, otherwise retain it;
Step 36: perform steps 31 to 35 for every coarse positioning point in turn;
Step 37: convert the coordinates of the retained pixel extreme points into fine point coordinates.
In one embodiment of the invention, step 37 comprises:
the abscissa of the fine point Pia(x, y) is:
Pia.x = Pim.x + (sumx1 + sumx2)/(N1 + N2)
where Pim.x is the abscissa of the retained pixel extreme point; sumx1 is the sum of the abscissas of the pixel extreme points retained in the upper M pixel rows; N1 is the number of pixel extreme points retained in the upper M pixel rows; sumx2 is the sum of the abscissas of the pixel extreme points retained in the lower M pixel rows; and N2 is the number of pixel extreme points retained in the lower M pixel rows.
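The formula can be written directly in code; this reproduces the expression exactly as stated in the text, with variable names matching the patent's symbols.

```python
# The fine-point abscissa formula, verbatim from the description:
# Pia.x = Pim.x + (sumx1 + sumx2) / (N1 + N2)
def fine_x(pim_x, sumx1, n1, sumx2, n2):
    return pim_x + (sumx1 + sumx2) / (n1 + n2)
```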
In one embodiment of the invention, step 4 comprises:
Step 41: take the i-th of the fine points as a reference point and determine a first straight-line class from the reference point, where i = 1, 2, 3, 4, ...;
Step 42: compare the (i+1)-th fine point with the reference point;
if the (i+1)-th fine point lies within the range determined by the reference point, record the (i+1)-th fine point in the first straight-line class, and take the (i+1)-th fine point as the new reference point of the first straight-line class;
if the (i+1)-th fine point does not lie within the range determined by the reference point, create a new straight-line class, and take the (i+1)-th fine point as the reference point of the new straight-line class;
Step 43: compare the remaining unclassified fine points with the reference point of the first straight-line class and with the reference points of the newly created straight-line classes, until all fine points have been classified.
In one embodiment of the invention, step 42 comprises:
Step 421: judge whether the Y value of the (i+1)-th fine point's coordinates equals the Y value of the reference point's coordinates;
if they are equal, the (i+1)-th fine point is not within the range determined by the reference point;
if they are unequal, perform step 422;
Step 422: judge whether the slope of the (i+1)-th fine point lies within a preset slope range;
if it lies within the preset slope range, the (i+1)-th fine point is within the range determined by the reference point;
if it does not lie within the preset slope range, the (i+1)-th fine point is not within the range determined by the reference point.
In one embodiment of the invention, the preset slope range is the slope range corresponding to the preset image region in which the reference point lies, obtained as follows:
look up the slope average Kavg of the preset region where the reference point lies in a preset slope information allocation table;
multiply the slope average by a maximum slope coefficient and by a minimum slope coefficient to obtain the slope range of the region where the reference point lies;
where the maximum slope coefficient is 1.2 and the minimum slope coefficient is 0.8.
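A minimal sketch of the slope-range test, assuming Kavg has already been looked up from the slope information allocation table (the table itself is region data the patent presets, so it is not reproduced here).

```python
# Slope-range test: accept k when MinRate*Kavg <= k <= MaxRate*Kavg,
# with the coefficients stated above (0.8 and 1.2).
MIN_RATE, MAX_RATE = 0.8, 1.2

def slope_in_range(k, kavg):
    # sorted() keeps the bounds ordered when Kavg is negative
    lo, hi = sorted((MIN_RATE * kavg, MAX_RATE * kavg))
    return lo <= k <= hi
```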
In one embodiment of the invention, step 5 comprises:
screen the multiple straight-line classes to determine the left straight-line class and the right straight-line class of the lane;
divide the points in the left straight-line class into an upper part and a lower part by count, and compute the respective average coordinates (x_upper_left, y_upper_left) and (x_lower_left, y_lower_left);
divide the points in the right straight-line class into an upper part and a lower part by count, and compute the respective average coordinates (x_upper_right, y_upper_right) and (x_lower_right, y_lower_right);
compute the middle coordinates x_upper = (x_upper_left + x_upper_right)/2, y_upper = (y_upper_left + y_upper_right)/2 and x_lower = (x_lower_left + x_lower_right)/2, y_lower = (y_lower_left + y_lower_right)/2;
connect the middle coordinates (x_upper, y_upper) and (x_lower, y_lower) to obtain the lane line.
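Under the assumption that each straight-line class is an ordered list of (x, y) points, the averaging in step 5 can be sketched as:

```python
# Step 5 averaging: split each class into upper and lower halves by count,
# average each half, then combine left/right averages into the two middle
# coordinates whose connection gives the lane line.
def part_average(points):
    xs = sum(p[0] for p in points) / len(points)
    ys = sum(p[1] for p in points) / len(points)
    return xs, ys

def lane_from_classes(left_pts, right_pts):
    half_l, half_r = len(left_pts) // 2, len(right_pts) // 2
    ul, ll = part_average(left_pts[:half_l]), part_average(left_pts[half_l:])
    ur, lr = part_average(right_pts[:half_r]), part_average(right_pts[half_r:])
    upper = ((ul[0] + ur[0]) / 2, (ul[1] + ur[1]) / 2)
    lower = ((ll[0] + lr[0]) / 2, (ll[1] + lr[1]) / 2)
    return upper, lower
```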
The beneficial effects of the present invention are as follows:
1. The technical solution down-samples the whole image by dividing it into longitudinal strips of blocks and, when detecting edges, exploits the fact that lane lines are arranged longitudinally and change sharply in the horizontal direction within each strip. A simple gradient operator is therefore applied to the block data of each strip to compute the horizontal gradient; using the properties that the gradients of the left and right lane edges change in opposite directions and that gradient extreme points occur in pairs within the lane-width neighborhood, the candidate lane edge points can be detected roughly. This reduces the interference of edge information that is not arranged longitudinally, and improves both the real-time performance of the algorithm and the reliability of the detected lanes.
2. The embodiment of the present invention obtains the fine point positions of the lane edges by refining the coarsely located points, then performs a table lookup comparing the slope information between fine points with the data of a preset lane configuration table, and assigns the points that satisfy the conditions to different straight-line classes. Among the classes whose point counts exceed a threshold, the classes with the most points are selected as the lane lines on the left and right sides; averaging the upper and lower positions of these straight-line classes yields the lane line and completes the detection. No computation is spent on noise points that are not arranged longitudinally, which eliminates a large amount of useless calculation, greatly reduces the processing load of the system, and improves processing speed.
Brief description of the drawings
Fig. 1 is a flow chart of the lane line detection method provided in an embodiment of the present invention;
Fig. 2 is a schematic diagram of dividing the image into strips of blocks in an embodiment of the present invention;
Fig. 3 is a schematic diagram of the gradient data of a single strip in an embodiment of the present invention;
Fig. 4 is a schematic diagram of the coarse positioning points in an embodiment of the present invention;
Fig. 5 is a schematic diagram after fine positioning in an embodiment of the present invention;
Fig. 6 is a schematic diagram after classification in an embodiment of the present invention;
Fig. 7 is a schematic diagram of the straight-line display in an embodiment of the present invention.
Embodiments
To make the objects, features, and advantages of the present invention easier to understand, embodiments of the present invention are described in detail below with reference to the accompanying drawings.
As shown in Figs. 1 to 7, described above, the operating principle of the lane line detection method of this embodiment is described in detail as follows.
As shown in Fig. 1, the lane line detection method provided in an embodiment of the present invention first converts the image to be processed to a grayscale image, and then performs coarse positioning, fine positioning, point classification, and straight-line display in turn. It specifically comprises:
Step 1: acquire an image of the scene ahead of the vehicle and convert it to a grayscale image;
Step 2: coarsely locate the lane lines using the grayscale information of the image, and record the coarse positioning points;
Step 3: refine the coarse positioning points and retain the fine points;
Step 4: classify the fine points to obtain multiple straight-line classes;
Step 5: obtain the lane lines from the multiple straight-line classes.
<Coarse positioning>
Further, coarse positioning of the lane lines using the grayscale information of the image, and recording of the coarse positioning points, comprises:
Step 21: using the grayscale information of the image, divide the image into multiple strips, and divide each strip into multiple pixel blocks;
Step 22: sum the grayscale pixel values of each pixel block to obtain the Sum value of each block;
Step 23: obtain the gradient data of each strip from the Sum values of its blocks;
Step 24: search the gradient data of each strip for maximum and minimum points;
Step 25: record the maximum and minimum points as the coarse positioning points.
Specifically:
During coarse positioning, an image of size W*H (720*200, for example) is divided longitudinally into multiple strips (generally 40), each strip containing h rows of image data (generally 5). Each strip is then divided into blocks w pixels wide (w is generally 5 or 10), giving W/w blocks in total. After blocking, the following steps are performed in turn:
1. The grayscale pixel values of each w*h block in a strip are summed to obtain Sum; that is, each strip yields a W/w-dimensional data array Sum.
2. A horizontal gradient is computed over the summed data Sum[i] of each block i by convolving the data with a template; the template is a simple operator such as [-1, 0, 1] or [1, 0, -1].
Taking [-1, 0, 1] as an example, the gradient data of a single strip is:
Diff[i] = Sum[i+1] - Sum[i-1], i = 1, 2, 3, ..., (W/w - 1)   (1)
Taking the 720*200 image of Fig. 2 as an example, with the origin of coordinates at the top-left corner, the strip starting at y = 50 is divided into blocks of size h = 5, w = 5 and the pixel-value sum of each block is computed; applying formula (1) to the block data of the strip then yields the horizontal gradient data shown in Fig. 3.
3. Gradient extreme-value pairs are then searched for in the gradient data. Using the properties that the gradients of the left and right lane edges change in opposite directions and that gradient extreme points occur in pairs within the lane-width neighborhood, the candidate lane edge points can be detected roughly.
In the embodiment of the present invention, when the gradient is computed with the template [-1, 0, 1], the left lane edge point lies at a positive gradient maximum and the right edge point at a negative gradient minimum; moreover, the gradient magnitude of each extremum must exceed a threshold MaxMinTh, and the number of blocks separating the maximum and the minimum must lie within the lane-width threshold range. That is, for each maximum whose row-block index is nl (left lane edge) and whose gradient magnitude exceeds MaxMinTh, a minimum whose gradient magnitude also exceeds MaxMinTh is searched for within widthTh blocks to its right. If such a minimum is found at row-block index nr (right lane edge), the pair of extreme points (nl, nr) is considered a candidate lane pair; the pair (nl, nr) is mapped back to the original image according to formula (2) to obtain the coarse positioning point coordinates Pi(x, y), Pi+1(x, y) of the left and right sides of the lane, this position information is saved, and the search of this item 3 continues for the other extreme-point pairs of the current strip. Otherwise, the maximum found is deleted from the candidate points, and the search for extreme-value pairs over the strip's gradient data continues in turn.
Conversely, when the gradient is computed with [1, 0, -1], the negative minimum is searched for first, and the positive maximum is then searched for within widthTh blocks of it.
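The paired extreme search of item 3 can be sketched as follows for the [-1, 0, 1] template. Here diff is the strip's gradient data, and the description's delete-and-rescan bookkeeping is simplified to a single linear scan; mapping block indices back to image coordinates (formula (2)) is omitted.

```python
# Search for candidate lane pairs: a positive maximum above MaxMinTh
# followed, within widthTh blocks to its right, by a negative minimum
# whose magnitude also exceeds MaxMinTh. Returns block-index pairs (nl, nr).
def find_extreme_pairs(diff, max_min_th, width_th):
    pairs = []
    for nl, g in enumerate(diff):
        if g <= max_min_th:
            continue  # not a strong positive (left-edge) extremum
        for nr in range(nl + 1, min(nl + 1 + width_th, len(diff))):
            if diff[nr] < -max_min_th:
                pairs.append((nl, nr))
                break  # candidate lane pair found; move to the next maximum
    return pairs
```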
Here MaxMinTh is an adaptive threshold, adjusted according to the background luminance in the vicinity of the extremum and its position. The threshold MaxMinTh is calculated as follows:
First, a basic gradient threshold baseTh is set. Because lane lines tend to be vertical in the middle of the image and increasingly slanted toward its sides, the magnitude of the block pixel-value sums depends on position after blocking: it is larger toward the middle and smaller toward the sides. A scale coefficient locateRate is therefore set for each image position, and the basic gradient threshold baseTh is adjusted proportionally in the different position regions. For each extremum found, the mean luminance of its neighboring blocks is computed as the background value bkgrd at that extremum, and bkgrd is then compared with a preset background maximum MaxBkgrdTh.
When bkgrd ≤ MaxBkgrdTh:
MaxMinTh = baseTh*locateRate + bkgrd*LumaRate   (3)
When bkgrd > MaxBkgrdTh, the image is likely affected by illumination, which weakens the gradient changes; the gradient magnitude threshold is therefore reduced according to the difference bkgrd - MaxBkgrdTh between the background at the extremum and the background maximum, i.e. reduced by overTimes times the difference.
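A sketch of the adaptive threshold. Formula (3) is reproduced as stated; since formula (4) itself does not appear in the text above, using (3) as the base and subtracting overTimes times the excess in the bright-background branch is an assumption drawn from the surrounding wording.

```python
# Adaptive gradient threshold MaxMinTh.
# bkgrd <= max_bkgrd_th: formula (3) as written.
# bkgrd >  max_bkgrd_th: threshold reduced by overTimes * (bkgrd - MaxBkgrdTh)
#                        (assumed form of the unreproduced formula (4)).
def max_min_th(base_th, locate_rate, bkgrd, luma_rate,
               max_bkgrd_th, over_times):
    th = base_th * locate_rate + bkgrd * luma_rate
    if bkgrd > max_bkgrd_th:
        th -= over_times * (bkgrd - max_bkgrd_th)
    return th
```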
For the gradient data shown in Fig. 3, for example, the block coordinates of the extreme-point pairs finally found are (40, 42) and (81, 84); mapping back to the original image produces the coarse positioning point coordinates (202, 52), (212, 52), (407, 52), and (422, 52). The coarse positioning points of the full image are shown in Fig. 4.
4. The above operations are performed in turn on the other strips of the image, finding the left and right coarse positioning points Pi(x, y), Pi+1(x, y) of the candidate lanes in all strips of the full image.
After dividing the image into strips of blocks, the present invention computes the horizontal gradient only over the block pixel-value sums, and then realizes edge detection by searching for the lane extreme points under the lane-width limitation, the adaptive adjustment of the gradient magnitude threshold, and the direction limitation. This fully exploits the longitudinal trend of lane lines and the directional information of the gradient changes on their two sides, and largely avoids the numerous clutter edges produced when edges are detected with the Canny or Sobel operators.
The strip division down-samples the lane candidate points, and subsequent processing is performed on the points obtained from coarse positioning. This avoids the acceleration method of globally discarding the upper and left/right image blocks: it remains applicable when the data captured by the camera contains no sky background, and it can detect all the lane lines in the image, from those distributed vertically in the middle to those slanted toward the middle from the left and right sides.
<Fine positioning>
Refining the coarse positioning points and retaining the fine points comprises:
Step 31: choose a coarse positioning point Pi(x, y); centered on Pi(x, y), take M pixel rows above and below it, and xOffset pixels to its left and right;
Step 32: perform a convolution over the coarse positioning point's neighborhood in each pixel row to obtain the pixel extreme point of each row;
Step 33: average the abscissas of the pixel extreme points in the upper M pixel rows to obtain X1, and average the abscissas of the pixel extreme points in the lower M pixel rows to obtain X2;
Step 34: for each pixel extreme point in the upper M pixel rows, judge whether the absolute value of the difference between its abscissa and the average X1 exceeds a preset value; if so, discard the pixel extreme point, otherwise retain it;
Step 35: for each pixel extreme point in the lower M pixel rows, judge whether the absolute value of the difference between its abscissa and the average X2 exceeds a preset value; if so, discard the pixel extreme point, otherwise retain it;
Step 36: perform steps 31 to 35 for every coarse positioning point in turn;
Step 37: convert the coordinates of the retained pixel extreme points into fine point coordinates.
After fine positioning, the coarsely located points yield the fine point positions of the lane edges; fine positioning eliminates the noise points that do not follow the lane's rule of longitudinally continuous, compact arrangement, while preserving the near-edge point coordinates of each strip and the slope information of the edge. Specifically:
1. For each detected point Pi(x, y), gradient extreme points are searched for within the xOffset-column neighborhood of each of the m rows above and below Pi(x, y). The initial x coordinate of every row is set to the x coordinate of the strip's coarse positioning point Pi(x, y). Within the xOffset range around Pi(x, y), the sums of the 4 or 5 pixel values before and after each position are differenced in turn, i.e. the image data is convolved with the template [-1, -1, -1, -1, 0, 1, 1, 1, 1] or [-1, -1, -1, -1, -1, 0, 1, 1, 1, 1, 1], and the point of greatest gradient change within this xOffset range is taken as the extreme coordinate Pim(x, y) of the row. The extreme point Pim(x, y) of each of the m rows above and below is obtained in turn.
In the embodiment of the present invention, the preferred value of m is 3 for 5*5 blocks and 5 for 10*10 blocks. xOffset is generally determined from the x coordinate of Pi(x, y): it is 30 when the coarse positioning point lies in the first or last third of the image, and 15 when it lies in the middle third.
2. The averages X1 and X2 of the x coordinates of the extreme points Pim(x, y) of the upper and lower m rows are computed respectively; the x coordinate of each extreme point is then compared with the corresponding average X1 or X2, and points whose deviation is within xTh pixels are retained. Preferably, xTh is 3. That is, the difference between each pixel extreme point and the corresponding average coordinate value is computed and its absolute value taken, determining the distance between the pixel extreme point and the average coordinate value; this decides whether the pixel extreme point is retained. The number of points retained in the upper M pixel rows is counted as N1, and the number of points retained in the lower M pixel rows as N2.
3. When N1 and N2 both exceed a preset value PnTh, the x-coordinate sums sumx1 and sumx2 of the remaining points of the upper and lower parts are computed respectively; that is, the x coordinates of all pixel extreme points retained in the upper M pixel rows are summed, and likewise for the lower M pixel rows. The respective averages are then avg1 = sumx1/N1 and avg2 = sumx2/N2.
It should be noted that when M is 5, the preset value PnTh is 2, and when M is 10, the preset value PnTh is 3.
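The retention filter of item 2 can be sketched for one group of M rows; here xs are the x coordinates of the row extreme points, and xTh = 3 as preferred above. The function name is illustrative.

```python
# Keep only the row extreme points whose x deviates from the group average
# by at most xTh pixels, then return the survivors with their sum and count
# (the sumx and N quantities used in item 3).
def filter_rows(xs, x_th=3):
    avg = sum(xs) / len(xs)
    kept = [x for x in xs if abs(x - avg) <= x_th]
    return kept, sum(kept), len(kept)
```

Note that a single far outlier can shift the group average and cause nearby valid points to be discarded as well; the PnTh check of item 3 guards against groups where too few points survive.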
4. Using the averages of the retained points, the x coordinate of the strip's coarse positioning point Pim(x, y) is converted into the x coordinate of the fine point Pia(x, y) according to the following formula:
Pia.x = Pim.x + (sumx1 + sumx2)/(N1 + N2)
5. The fine point positions of all the coarse positioning points are obtained, as shown in Fig. 5.
<Point classification>
Because the gradients of the left and right lane edges change in opposite directions and gradient extreme points occur in pairs within the lane-width neighborhood, the embodiment of the present invention can classify each individual fine point, or classify all the retained paired fine points: the slope of each fine point is calculated and judged against a preset range, and when the slopes satisfy the maximum/minimum slope limits, the points are assigned to the same straight-line class as points on the same straight line. The classification method is as follows:
Step 41: take the i-th of the fine points as a reference point and determine a first straight-line class from the reference point, where i = 1, 2, 3, 4, ...;
Step 42: compare the (i+1)-th fine point with the reference point;
if the (i+1)-th fine point lies within the range determined by the reference point, record the (i+1)-th fine point in the first straight-line class, and take the (i+1)-th fine point as the new reference point of the first straight-line class;
if the (i+1)-th fine point does not lie within the range determined by the reference point, create a new straight-line class, and take the (i+1)-th fine point as the reference point of the new straight-line class;
Step 43: compare the remaining unclassified fine points with the reference point of the first straight-line class and with the reference points of the newly created straight-line classes, until all fine points have been classified.
Here, step 42 comprises:
Step 421: judge whether the Y value of the (i+1)-th fine point's coordinates equals the Y value of the reference point's coordinates;
if they are equal, the (i+1)-th fine point is not within the range determined by the reference point;
if they are unequal, perform step 422;
Step 422: judge whether the slope of the (i+1)-th fine point lies within a preset slope range;
if it lies within the preset slope range, the (i+1)-th fine point is within the range determined by the reference point;
if it does not lie within the preset slope range, the (i+1)-th fine point is not within the range determined by the reference point.
Specifically, taking the classification of paired fine points as an example:
1. In the embodiment of the present invention, the first pair of fine points, with P0a(x, y), serves as the reference pair. Starting from the second pair of fine points Pia(x, y), P(i+1)a(x, y), it is first judged whether the y coordinates of the second pair and the reference pair are unequal. When the y coordinates differ and the difference is smaller than the threshold yTh, the points are not in the same strip but lie within the longitudinal comparison threshold range, so the slopes of this pair of fine points relative to the stored reference pair are calculated: for Pia(x, y) and P0a(x, y), and for P(i+1)a(x, y) and P1a(x, y), the slopes are ki = (Pia.x - P0a.x)/(Pia.y - P0a.y) and ki+1 = (P(i+1)a.x - P1a.x)/(P(i+1)a.y - P1a.y).
When the y coordinates are equal, the point is stored as the reference point of a new straight-line class, and the next pair of fine points is selected for the above operation.
It should be noted that in this embodiment the threshold yTh is 5.
2nd, for calculating the slope information k of i-th pair fine-pointi,ki+1, by the image diverse location pre-set
The slope information allocation list data in region draw the slope average Kavg of corresponding fine-point, as MinRate*Kavg≤Ki≤
MaxRate*Kavg and MinRate*Kavg≤Ki+1During≤MaxRate*Kavg, then it is assumed that the corresponding benchmark of this fine-point
They, then be referred in respective straight line class by the point that point belongs on same straight line, and this substitutes original base to fine-point
It is on schedule right, the new datum mark in straight line class where turning into it each.
Follow-up fine-point is then compared with the new datum mark in the straight line class, to judge whether it is the straight line class
On point.
If judging the slope of the fine-point not in the slope range, benchmark of this point for new straight line class is stored
Point, next fine-point is then selected to carry out aforesaid operations.
Usually the minimum slope coefficient MinRate is 0.8 and the maximum slope coefficient MaxRate is 1.2.
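With the stated coefficients, the slope-range test for a fine-point pair reduces to two interval checks. A minimal sketch (the positive-Kavg assumption is ours; a negative average would need the interval endpoints swapped):

```python
MIN_RATE, MAX_RATE = 0.8, 1.2  # MinRate / MaxRate from the embodiment

def on_same_line(ki, ki1, k_avg, lo=MIN_RATE, hi=MAX_RATE):
    """True when both pair slopes fall within [MinRate*Kavg, MaxRate*Kavg],
    i.e. the pair belongs on the same straight lines as its datum points."""
    return lo * k_avg <= ki <= hi * k_avg and lo * k_avg <= ki1 <= hi * k_avg
```
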
All fine-points undergo this classification process, and the finally retained fine-points constitute multiple straight-line classes.
The classification process for each fine-point pair is identical to the classification process for each single fine-point. In the embodiment of the present invention, after all fine-points are classified, the method further includes screening the multiple straight-line classes to determine the left-side straight-line class and the right-side straight-line class of the lane line:
1. After all fine-points are classified, each divided straight-line class is screened by the number PmNum of its points. A straight-line class whose point count is greater than a preset value is representative and is therefore retained; when a straight-line class contains especially few points, fewer than the preset value, it is not representative and is discarded. Generally, the preset value is 4.
For each retained straight-line class, the slope Km is calculated.
2. The point count PmNum of each straight line is adjusted using the slope of its straight-line class, so that the point weight of straight lines located in the middle region of the image is reduced, while the point weight of inclined straight lines on the left and right sides is increased. The adjustment formula is:
PmNum2 = PmNum/(moffset - poffset*Km) (5)
Usually moffset is 1500 and poffset is 2; step 5 is then performed again.
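Formula (5) translates directly into code. This sketch applies the formula as printed; whether Km should be taken as an absolute value for left-leaning lines is not specified in the text, so that question is left open here:

```python
M_OFFSET, P_OFFSET = 1500, 2  # moffset / poffset from the embodiment

def adjusted_count(pm_num, km, moffset=M_OFFSET, poffset=P_OFFSET):
    """Formula (5): PmNum2 = PmNum / (moffset - poffset*Km).
    With slope taken as dx/dy, near-vertical middle-region lines have Km
    close to 0, keep a large divisor, and thus get a low weight; inclined
    side lines with larger Km get a smaller divisor and a higher weight."""
    return pm_num / (moffset - poffset * km)
```
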
3. As shown in FIG. 6, among the selected straight-line classes, when the extension of the straight line calculated from the slope Km falls below 0 or beyond the image width, the straight line represented by this class is an inclined straight line located in the left or right region of the image. Among the straight-line classes whose point counts exceed the threshold BandNumLR, the two classes with the most points are selected as the left and right sides of the final lane line; otherwise the straight line lies in the middle region of the image and is discarded.
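The side-line test can be sketched by extending each class's line to the top and bottom image rows and checking whether it leaves the horizontal extent. The representative point (x0, y0) and the dx/dy slope convention are assumptions of this sketch:

```python
def is_side_line(km, x0, y0, height, width):
    """A straight-line class counts as a left/right side line when its
    extension through (x0, y0) with slope km = dx/dy exits [0, width)
    at the top (row 0) or bottom (row height-1) of the image."""
    x_top = x0 + km * (0 - y0)              # column at the top row
    x_bottom = x0 + km * (height - 1 - y0)  # column at the bottom row
    return (x_top < 0 or x_top >= width
            or x_bottom < 0 or x_bottom >= width)
```
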
In the embodiment of the present invention, the straight-line parameters are not calculated by the Hough transform during straight-line detection; instead, the slope information of the left and right sides of the lane is fully exploited and compared with preset information, and the filtered fine-points are assigned to different straight-line classes through this comparison. This inherently avoids the time-consuming parameter calculation of Hough detection and greatly increases the operation speed of the algorithm, making real-time lane line detection feasible on embedded systems.
<Lane line display>
The points in the left-side straight-line class are divided by number into an upper region and a lower region, and the average coordinates (x_upper-left, y_upper-left) and (x_lower-left, y_lower-left) are calculated respectively;
the points in the right-side straight-line class are divided by number into an upper region and a lower region, and the average coordinates (x_upper-right, y_upper-right) and (x_lower-right, y_lower-right) are calculated respectively;
the middle coordinates are calculated as x_upper = (x_upper-left + x_upper-right)/2, y_upper = (y_upper-left + y_upper-right)/2; x_lower = (x_lower-left + x_lower-right)/2, y_lower = (y_lower-left + y_lower-right)/2;
as shown in FIG. 7, the middle coordinates (x_upper, y_upper) and (x_lower, y_lower) are connected to obtain the lane line, completing the lane line detection.
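The upper/lower averaging and midpoint connection can be sketched as follows. Splitting each side into even halves by sorted y is our reading of "divided into upper and lower regions according to number", not a detail fixed by the text:

```python
def lane_line(left_pts, right_pts):
    """Split each side's (x, y) points into upper and lower halves, average
    each half, then connect the left/right midpoints as in FIG. 7."""
    def half_means(pts):
        pts = sorted(pts, key=lambda p: p[1])  # order by row (y)
        mid = len(pts) // 2
        def mean(ps):
            return (sum(p[0] for p in ps) / len(ps),
                    sum(p[1] for p in ps) / len(ps))
        return mean(pts[:mid]), mean(pts[mid:])
    (ul, ll), (ur, lr) = half_means(left_pts), half_means(right_pts)
    upper = ((ul[0] + ur[0]) / 2, (ul[1] + ur[1]) / 2)
    lower = ((ll[0] + lr[0]) / 2, (ll[1] + lr[1]) / 2)
    return upper, lower  # the two endpoints of the detected lane line
```
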
In summary, specific examples have been used herein to set forth the implementation of the lane line detection method provided by the embodiments of the present invention, and the description of the above embodiments is only intended to help understand the solution of the present invention and its core idea. Meanwhile, those of ordinary skill in the art may, according to the idea of the present invention, make changes to the specific implementations and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention, and the protection scope of the present invention shall be defined by the appended claims.
Claims (8)
1. A lane line detection method, characterized in that the method comprises:
Step 1: acquiring an image in front of a vehicle, and converting the image into a grayscale image;
Step 2: performing coarse positioning on the lane line using grayscale information of the image, and recording coarse positioning points;
Step 3: performing refined positioning on the coarse positioning points, and retaining fine-points;
Step 4: classifying the fine-points to obtain multiple straight-line classes;
Step 5: obtaining the lane line according to the multiple straight-line classes.
2. The method according to claim 1, characterized in that step 2 comprises:
Step 21: dividing the image into multiple bands using the grayscale information of the image, each band being equally divided into multiple pixel blocks;
Step 22: summing the pixel grayscale values of each pixel block to obtain a Sum value of each pixel block;
Step 23: obtaining gradient data of each band according to the Sum values of the pixel blocks;
Step 24: searching for maximum points and minimum points in the gradient data of each band;
Step 25: recording the maximum points and the minimum points as the coarse positioning points.
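Steps 21-25 can be sketched as a band/block scan. The band and block sizes and the use of a first difference as the gradient are assumptions of this sketch; claim 2 fixes neither:

```python
import numpy as np

def coarse_points(gray, band_h=16, block_w=8):
    """Sketch of claim 2: split the grayscale image into horizontal bands,
    sum each block's pixels (the Sum values), differentiate the per-band
    Sum profile, and record its extreme points as coarse positioning points."""
    h, w = gray.shape
    points = []
    for y0 in range(0, h - band_h + 1, band_h):
        band = gray[y0:y0 + band_h]
        sums = np.array([band[:, x0:x0 + block_w].sum()
                         for x0 in range(0, w - block_w + 1, block_w)],
                        dtype=np.int64)
        grad = np.diff(sums)  # gradient of the Sum profile across blocks
        if grad.size:
            points.append((y0, int(grad.argmax()), int(grad.argmin())))
    return points  # (band top row, max-gradient block, min-gradient block)
```
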
3. The method according to claim 2, characterized in that step 3 comprises:
Step 31: selecting a coarse positioning point Pi(x, y); centered on the coarse positioning point Pi(x, y), taking M pixel rows above and M pixel rows below, and xOffset pixels to the left and to the right;
Step 32: performing a convolution operation on the coarse positioning point in each pixel row to obtain a pixel extreme point of each pixel row;
Step 33: averaging the abscissas of the multiple pixel extreme points in the upper M pixel rows to obtain X1, and averaging the abscissas of the multiple pixel extreme points in the lower M pixel rows to obtain X2;
Step 34: judging whether the absolute value of the difference between each pixel extreme point in the upper M pixel rows and the average value X1 is greater than a preset value; if so, discarding the pixel extreme point; if not, retaining the pixel extreme point;
Step 35: judging respectively whether the absolute value of the difference between each pixel extreme point in the lower M pixel rows and the average value X2 is greater than a preset value; if so, discarding the pixel extreme point; if not, retaining the pixel extreme point;
Step 36: performing steps 31 to 35 for all coarse positioning points in turn;
Step 37: converting the coordinates of the multiple retained pixel extreme points into fine-point coordinates.
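Steps 31-32 can be sketched as a per-row edge search around the coarse point. The [-1, 0, 1] kernel is an assumption; claim 3 does not fix the convolution kernel:

```python
import numpy as np

def row_extreme(gray, y, x, x_offset=8):
    """Sketch of steps 31-32: around coarse point (x, y), convolve one pixel
    row with a simple edge kernel and return the column with the strongest
    absolute response (the row's pixel extreme point)."""
    lo = max(x - x_offset, 0)
    hi = min(x + x_offset + 1, gray.shape[1])
    row = gray[y, lo:hi].astype(np.int32)
    resp = np.convolve(row, [-1, 0, 1], mode="same")
    return lo + int(np.abs(resp).argmax())
```
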
4. The method according to claim 3, characterized in that step 36 comprises:
the abscissa of the fine-point Pia(x, y) is:
Pia.x = Pim.x + (sumx1 + sumx2)/(N1 + N2)
wherein Pim.x is the abscissa value of the retained pixel extreme point; sumx1 is the sum of the abscissas of the multiple retained pixel extreme points in the upper M pixel rows; N1 is the number of the multiple retained pixel extreme points in the upper M pixel rows; sumx2 is the sum of the abscissas of the multiple retained pixel extreme points in the lower M pixel rows; N2 is the number of the multiple retained pixel extreme points in the lower M pixel rows.
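The correction in claim 4 can be sketched as below. One interpretation is made here and flagged: sumx1/sumx2 are taken as sums of offsets of the retained extreme-point abscissas from Pim.x, since raw abscissa sums would roughly double the coordinate; this reading is an assumption, not something the claim states:

```python
def fine_x(pim_x, upper_xs, lower_xs):
    """Pia.x = Pim.x + (sumx1 + sumx2)/(N1 + N2), with sumx1/sumx2 read as
    offsets of the retained extreme-point abscissas from Pim.x (assumption)."""
    n = len(upper_xs) + len(lower_xs)            # N1 + N2
    if n == 0:
        return float(pim_x)                      # nothing retained: keep Pim.x
    sumx1 = sum(px - pim_x for px in upper_xs)   # upper M pixel rows
    sumx2 = sum(px - pim_x for px in lower_xs)   # lower M pixel rows
    return pim_x + (sumx1 + sumx2) / n
```
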
5. The method according to claim 3, characterized in that step 4 comprises:
Step 41: taking the i-th point among the multiple fine-points as a datum point, and determining a first straight-line class according to the datum point, wherein i is 0, 1, 2, 3, 4, ...;
Step 42: comparing the (i+1)-th fine-point with the datum point;
if the (i+1)-th fine-point is within the range determined by the datum point, recording the (i+1)-th fine-point into the first straight-line class, and taking the (i+1)-th fine-point as the new datum point of the first straight-line class;
if the (i+1)-th fine-point is not within the range determined by the datum point, adding a new straight-line class, and taking the (i+1)-th fine-point as the datum point of the newly added straight-line class;
Step 43: comparing the unclassified multiple fine-points with the datum point of the first straight-line class and the datum points of the newly added straight-line classes respectively, until the classification of the multiple fine-points is completed.
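The single-pass classification of steps 41-43 can be sketched with a generic in_range predicate; the dict-based class record is an assumption of this sketch:

```python
def classify(points, in_range):
    """Sketch of steps 41-43: each fine-point either joins the first class
    whose datum point accepts it (and becomes that class's new datum point)
    or founds a new straight-line class with itself as the datum point."""
    classes = []  # each class: {"datum": point, "members": [points]}
    for p in points:
        for c in classes:
            if in_range(c["datum"], p):
                c["members"].append(p)
                c["datum"] = p  # newcomer replaces the datum point
                break
        else:
            classes.append({"datum": p, "members": [p]})
    return classes
```
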
6. The method according to claim 5, characterized in that step 42 comprises:
Step 421: judging whether the Y value of the (i+1)-th fine-point coordinates is equal to the Y value of the datum point coordinates;
if equal, the (i+1)-th fine-point is not within the range determined by the datum point;
if unequal, performing step 422;
Step 422: judging whether the slope of the (i+1)-th fine-point is within a preset slope range;
if within the preset slope range, the (i+1)-th fine-point is within the range determined by the datum point;
if not within the preset slope range, the (i+1)-th fine-point is not within the range determined by the datum point.
7. The method according to claim 6, characterized in that the preset slope range is the slope range of the preset region corresponding to the datum point coordinates, comprising:
obtaining the slope average Kavg of the preset region where the datum point is located according to a preset slope information allocation table;
multiplying the slope average by a maximum slope coefficient and a minimum slope coefficient respectively to obtain the slope range of the region where the datum point is located;
wherein the maximum slope coefficient is 1.2 and the minimum slope coefficient is 0.8.
8. The method according to claim 5, characterized in that step 5 comprises:
screening the multiple straight-line classes to determine a left-side straight-line class and a right-side straight-line class of the lane line;
dividing the points in the left-side straight-line class by number into an upper region and a lower region, and calculating average coordinates (x_upper-left, y_upper-left) and (x_lower-left, y_lower-left) respectively;
dividing the points in the right-side straight-line class by number into an upper region and a lower region, and calculating average coordinates (x_upper-right, y_upper-right) and (x_lower-right, y_lower-right) respectively;
calculating middle coordinates x_upper = (x_upper-left + x_upper-right)/2, y_upper = (y_upper-left + y_upper-right)/2; x_lower = (x_lower-left + x_lower-right)/2, y_lower = (y_lower-left + y_lower-right)/2;
connecting the middle coordinates (x_upper, y_upper) and (x_lower, y_lower) to obtain the lane line.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710957864.6A CN107832674B (en) | 2017-10-16 | 2017-10-16 | Lane line detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107832674A true CN107832674A (en) | 2018-03-23 |
CN107832674B CN107832674B (en) | 2021-07-09 |
Family
ID=61647976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710957864.6A Active CN107832674B (en) | 2017-10-16 | 2017-10-16 | Lane line detection method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107832674B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20140132210A (en) * | 2013-05-07 | 2014-11-17 | 숭실대학교산학협력단 | Lane detection method and system |
CN105224909A (en) * | 2015-08-19 | 2016-01-06 | 奇瑞汽车股份有限公司 | Lane line confirmation method in lane detection system |
CN105426863A (en) * | 2015-11-30 | 2016-03-23 | 奇瑞汽车股份有限公司 | Method and device for detecting lane line |
CN107025432A (en) * | 2017-02-28 | 2017-08-08 | 合肥工业大学 | A kind of efficient lane detection tracking and system |
Non-Patent Citations (2)
Title |
---|
XINXIN DU ET AL.: "Vision-based approach towards lane line detection and vehicle localization", Machine Vision and Applications * |
LIU Yongtao: "Research on identification of dangerous driving states of passenger vehicles based on environment perception technology", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II * |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109117757A (en) * | 2018-07-27 | 2019-01-01 | 四川大学 | A kind of method of drag-line in extraction Aerial Images |
CN109117757B (en) * | 2018-07-27 | 2022-02-22 | 四川大学 | Method for extracting guy cable in aerial image |
CN110490033A (en) * | 2018-10-29 | 2019-11-22 | 长城汽车股份有限公司 | Image processing method and device for lane detection |
CN110490033B (en) * | 2018-10-29 | 2022-08-23 | 毫末智行科技有限公司 | Image processing method and device for lane detection |
CN110135252A (en) * | 2019-04-11 | 2019-08-16 | 长安大学 | A kind of adaptive accurate lane detection and deviation method for early warning for unmanned vehicle |
CN111178193A (en) * | 2019-12-18 | 2020-05-19 | 深圳市优必选科技股份有限公司 | Lane line detection method, lane line detection device and computer-readable storage medium |
CN111460072A (en) * | 2020-04-01 | 2020-07-28 | 北京百度网讯科技有限公司 | Lane line detection method, apparatus, device, and storage medium |
CN111460072B (en) * | 2020-04-01 | 2023-10-03 | 北京百度网讯科技有限公司 | Lane line detection method, device, equipment and storage medium |
CN112581473A (en) * | 2021-02-22 | 2021-03-30 | 常州微亿智造科技有限公司 | Method for realizing surface defect detection gray level image positioning algorithm |
CN112581473B (en) * | 2021-02-22 | 2021-05-18 | 常州微亿智造科技有限公司 | Method for realizing surface defect detection gray level image positioning algorithm |
Also Published As
Publication number | Publication date |
---|---|
CN107832674B (en) | 2021-07-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||