CN103839264A - Detection method of lane line - Google Patents
Detection method of lane line
- Publication number: CN103839264A
- Application number: CN201410065412.3A
- Authority: CN (China)
- Prior art keywords: straight line, candidate, point, image, vanishing point
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention discloses a lane line detection method. First, the edge information of a road image is extracted with an adaptive-threshold edge extraction algorithm, and a progressive probabilistic Hough transform constrained by the edge gradient direction is used to quickly detect straight lines in the edge image. The precise position of the vanishing point is then estimated iteratively across successive frames from the detected lines, the vanishing-point constraint is used to filter out lines that do not originate from lane line edges, and tracking gates are built to further filter out noise, yielding the candidate lines. Finally, using the color information in the neighborhoods of the candidate lines and the lane line information detected in the previous frame, an optimal pair of lines is selected from the candidates to represent the current lane lines. The method is robust to changes in the road environment, computationally light, and real-time.
Description
Technical field
The present invention relates to the technical field of vehicle driver assistance, and in particular to a fast lane line detection method for real-time systems.
Background technology
The invention of the automobile changed how humans travel; its convenience and efficiency effectively promote the flow of goods and people, and it plays an important role in economic and social development. By 2010 there were approximately one billion automobiles of all kinds worldwide, and this number is still growing rapidly. However, many cities around the world now face problems caused by the excessive use of private cars: serious traffic accidents, road congestion, huge energy consumption, and noise and environmental pollution have seriously reduced urban quality of life. With scientific and social progress, people demand ever more from automobiles in terms of safety, energy consumption, convenience and comfort. Meanwhile, sensor technology, automatic control, artificial intelligence, machine vision and related technologies have matured and are increasingly applied in transportation engineering. Against this background, a new development direction for the automotive industry has emerged: the intelligent vehicle.
At present, many countries have launched research on intelligent vehicles, but limited by the state of the technology, it will still take some time before machines can fully replace human drivers. Current intelligent vehicles therefore often adopt driver assistance systems to reduce the driving workload and ensure the driver's safety, for example the automatic parking systems, brake assist systems, reversing aids, driving assistance systems and lane keeping assistance systems found in high-end vehicles.
A key technology of lane keeping assistance is lane line detection, which provides the relative position of the vehicle with respect to the lane lines. The driver can then be alerted to the vehicle's driving state, effectively addressing lane departures caused by fatigue or negligence and thereby improving driving safety.
The key issues in lane line detection are how to extract the lane line features and which model to use to fit the lane lines. Because the road environment is affected by many factors such as weather, illumination and road conditions, low-complexity detection algorithms have difficulty finding features that adapt to environmental changes and fitting the lane lines accurately, while high-complexity algorithms usually cannot guarantee real-time operation and are hard to apply to vehicles traveling at high speed.
It is therefore necessary to propose a real-time, reliable and stable lane line detection method.
Summary of the invention
The technical problem to be solved by the present invention is that current lane line detection methods have difficulty satisfying both the real-time and the robustness requirements of lane line detection.
To solve the above technical problem, the present invention proposes a lane line detection method based on a vanishing-point constraint, comprising the following steps:
Step 1, acquiring a road image I_t with a vehicle-mounted image sensor, where t is the frame number;
Step 2, pre-processing the road image to obtain a pre-processed road image I_{p,t};
Step 3, detecting the edge information in the pre-processed road image I_{p,t} to obtain an edge image I_{e,t} and the gradient direction of each edge point;
Step 4, detecting straight lines in the edge image I_{e,t} to obtain the line set {l_{t,i} | l_{t,i} = (k_{t,i}, b_{t,i}), i = 1, 2, ..., N}, where k and b are the slope and intercept of a line and N is the number of detected lines;
Step 5, estimating the position of the vanishing point in the road image from the lines obtained in step 4;
Step 6, using the vanishing-point constraint to filter out part of the noise among the lines obtained in step 4;
Step 7, building tracking gates to further filter noise from the lines, obtaining the candidate line sets for the left and right sides;
Step 8, computing the posterior probability of each candidate line in the candidate line sets, then selecting from the left and right candidate sets the pair of lines with maximum posterior probability to fit the left and right lane boundaries, finally obtaining the detected lane lines.
The beneficial effects obtained by the present invention include:
By using an adaptive-threshold edge extraction algorithm, the present invention mitigates the impact of illumination variation;
By using the PPHT, straight lines are detected rapidly from the edge image, guaranteeing the real-time performance of line detection, and by introducing the edge gradient direction, the accuracy of line detection is improved;
By estimating the position of the vanishing point between successive frames from the detected lines, the accurate position of the vanishing point is obtained quickly, and the vanishing-point constraint filters out part of the noise lines, improving the robustness of lane line detection;
By using tracking gates to obtain the candidate lines, noise is further filtered out;
By using the color information in the neighborhoods of the candidate lines and the historical lane line information, the optimal lines are selected to represent the current lane, improving the accuracy of lane line detection.
Brief description of the drawings
Fig. 1 is a flowchart of the lane line detection method of the present invention;
Fig. 2 is a schematic diagram of the duality between the vanishing point in image space and a straight line in parameter space;
Fig. 3 is a flowchart of the vanishing point position estimation of the present invention;
Fig. 4 is a schematic diagram of the tracking gates of the present invention;
Fig. 5 is a schematic diagram of the gray-level check in a candidate line's neighborhood.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the present invention is described in more detail below in conjunction with specific embodiments and with reference to the accompanying drawings.
The present invention proposes a lane line detection method which, as shown in Fig. 1, comprises the following steps:
Step 1, acquiring a road image I_t with a vehicle-mounted image sensor, where t is the frame number.
Step 2, pre-processing the road image to obtain a pre-processed road image I_{p,t}.
The image pre-processing mainly prepares for the subsequent operations such as feature extraction. It mainly comprises gray-scale conversion, smoothing and denoising, scaling, and cropping of the road image. The purpose of scaling is to reduce the computational load; cropping achieves a similar purpose, in particular when the image contains regions outside the road surface. All subsequent operations act on the scaled image I_{p,t}, where t is the frame number.
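As a concrete illustration, the pre-processing chain above (gray-scale conversion, cropping, smoothing, scaling) can be sketched in NumPy. The BT.601 channel weights (BGR order assumed), the 3×3 box filter and the crop/scale factors are illustrative choices, not values specified by the patent.

```python
import numpy as np

def preprocess(frame, scale=0.5, crop_top=0.4):
    """Gray-scale, crop, smooth and downscale a BGR road frame (sketch)."""
    # Gray-scale conversion via ITU-R BT.601 weights (BGR channel order assumed)
    gray = frame[..., :3] @ np.array([0.114, 0.587, 0.299])
    # Crop away the top (sky / non-road) region to cut computation
    h = gray.shape[0]
    roi = gray[int(h * crop_top):, :]
    # 3x3 box smoothing as a simple denoising stand-in
    p = np.pad(roi, 1, mode="edge")
    sm = sum(p[i:i + roi.shape[0], j:j + roi.shape[1]]
             for i in range(3) for j in range(3)) / 9.0
    # Downscale by integer striding
    step = int(round(1.0 / scale))
    return sm[::step, ::step]
```

With a 100×200 frame and the defaults, the output is a 30×100 smoothed gray image, on which all subsequent steps would operate.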
Step 3, detecting the edge information in the pre-processed road image I_{p,t} to obtain an edge image I_{e,t} and the gradient direction of each edge point.
For edge extraction, an embodiment of the present invention uses a method similar to the Canny algorithm, with the difference that adaptive thresholds are adopted. The adaptive thresholds are determined from the gradient histogram: assuming that only P% of the pixels in the image are edge points, the high threshold T_h can be derived, where P is set empirically by the user and naturally lies between 0 and 100; likewise, setting the proportion of pixels that are certainly not edges yields the low threshold T_l. Compared with fixed thresholds, this adaptive threshold computation successfully copes with illumination variation.
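The percentile-based threshold selection can be sketched as follows. The function name and the fixed low/high ratio are assumptions; the key property, thresholds that scale with the scene's gradient statistics rather than staying fixed, matches the description above.

```python
import numpy as np

def adaptive_canny_thresholds(grad_mag, edge_pct=10.0, low_ratio=0.4):
    """Derive Canny-style thresholds from the gradient-magnitude histogram.

    Assumes the strongest edge_pct% of gradient magnitudes are edge pixels,
    so the high threshold T_h is the (100 - edge_pct)-th percentile. The low
    threshold is taken here as a fixed fraction of T_h (an assumption; the
    patent derives it analogously from a certainly-not-edge pixel ratio).
    """
    t_high = np.percentile(grad_mag, 100.0 - edge_pct)
    t_low = low_ratio * t_high
    return t_low, t_high
```

Because both thresholds are percentiles of the actual gradient distribution, uniformly brightening or darkening the scene rescales them proportionally instead of breaking a fixed cutoff.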
Step 4, detecting straight lines in the edge image I_{e,t} to obtain the line set {l_{t,i} | l_{t,i} = (k_{t,i}, b_{t,i}), i = 1, 2, ..., N}, where k and b are the slope and intercept of a line and N is the number of detected lines.
For line detection, an embodiment of the present invention uses the progressive probabilistic Hough transform (PPHT) to detect the lines present in the edge image I_{e,t}. Compared with the standard Hough transform (SHT), the PPHT uses only a subset of the points for voting, which improves the speed of line detection. Moreover, to suppress noise and improve the accuracy of line detection, during edge-point voting each edge point votes only for lines perpendicular or approximately perpendicular to its gradient direction; since the gradient information was already computed during edge extraction, it need not be recomputed here.
Step 5, estimating the position of the vanishing point in the road image from the lines obtained in step 4.
The vanishing point position is estimated in order to remove detected lines that do not originate from lane line edges. Owing to the perspective effect, parallel lines in the image converge at the vanishing point; since lane lines are parallel in the real world, they too converge at the vanishing point. Using the duality between straight lines in the image space X-Y and points in the parameter space K-B, each obtained line l_{t,i} can be converted into a point (k_{t,i}, b_{t,i}) in the parameter space K-B. Because the vanishing point is the intersection of parallel lines under perspective, the vanishing point localization problem can thus be converted into a line fitting problem in the parameter space K-B, as shown in Fig. 2. In Fig. 2, if the vanishing point lies at (x_v, y_v) in the image space X-Y, then the line l through the points (k_{t,i}, b_{t,i}) in the parameter space K-B can be expressed as b = -x_v k + y_v. Since the parameter space K-B contains some outliers, i.e. noise points, an embodiment of the present invention uses a weight w_{t,i} to measure the probability that a point (k_{t,i}, b_{t,i}) is an inlier. Because it is unknown which points are outliers, the weights are regarded as hidden variables, and an expectation-maximization (EM) algorithm is adopted to estimate the vanishing point position iteratively, the iteration count being denoted j. Fig. 3 is a flowchart of the vanishing point position estimation, which, as shown in Fig. 3, comprises the following steps:
Step 51, if t > 1, the initial vanishing point position is set to the position estimated in the previous frame, i.e. (x_{v,t}^{(0)}, y_{v,t}^{(0)}) = (x_{v,t-1}, y_{v,t-1}); if t = 1, no initial position is set. The iteration counter j is initialized to 1, and steps 52 to 54 are executed iteratively.
Step 52, computing the weight of each point in the parameter space K-B in the j-th iteration: if j = 1 and t = 1, each point is assigned an equal initial weight w_{1,i}^{(1)} = 1/N_t; if j = 1 and t > 1, the vanishing point position estimated in the previous frame t-1 is used as the initial position to compute the initial weights w_{t,i}^{(1)}; if j > 1, the vanishing point position (x_{v,t}^{(j-1)}, y_{v,t}^{(j-1)}) obtained in iteration j-1 is used to update the weight w_{t,i}^{(j)} of each point (k_{t,i}, b_{t,i}) in the parameter space K-B. The closer a point (k_{t,i}, b_{t,i}) lies to the line in the K-B space parameterized by (x_{v,t}^{(j-1)}, y_{v,t}^{(j-1)}), the larger its weight w_{t,i}^{(j)}.
Step 53, computing from the updated weights w_{t,i}^{(j)} the line parameters in the K-B space for the j-th iteration, namely the vanishing point position (x_{v,t}^{(j)}, y_{v,t}^{(j)}): the weighted residual sum of squares is minimized while keeping the change of (x_{v,t}^{(j)}, y_{v,t}^{(j)}) relative to the vanishing point position (x_{v,t-1}, y_{v,t-1}) computed in the previous frame as small as possible, i.e. an objective function combining both terms is minimized, in which λ_1, λ_2 and α are given coefficients and δ is the unit impulse sequence; solving it yields the vanishing point position (x_{v,t}^{(j)}, y_{v,t}^{(j)}) of the j-th iteration in closed form.
Step 54, if the estimate of the vanishing point position has converged, i.e. |x_{v,t}^{(j)} - x_{v,t}^{(j-1)}| and |y_{v,t}^{(j)} - y_{v,t}^{(j-1)}| are within a given error range, or the given maximum number of iterations N_max has been reached, the iteration is terminated and the vanishing point position of frame t is (x_{v,t}, y_{v,t}) = (x_{v,t}^{(j)}, y_{v,t}^{(j)}); otherwise return to step 52 and continue iterating.
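Steps 51 to 54 can be sketched as a small EM loop: the M-step is a weighted least-squares fit of b = -x_v k + y_v regularized toward the previous frame, solved in closed form, and the E-step recomputes the weights. The Gaussian weight function (scale alpha) and the quadratic previous-frame penalty are assumed forms, since the patent's exact expressions are not reproduced in this text.

```python
import numpy as np

def estimate_vanishing_point(lines, prev=None, alpha=20.0, lam=(1e-3, 1e-3),
                             n_iter=10, tol=0.5):
    """EM-style vanishing-point estimate in the K-B parameter space (sketch).

    Each image line y = k x + b maps to the point (k, b); lines meeting at the
    vanishing point (x_v, y_v) satisfy b = -x_v k + y_v, so the estimate is a
    robust weighted line fit. lines: array of (k, b); prev: previous-frame
    vanishing point or None for the first frame.
    """
    k, b = lines[:, 0], lines[:, 1]
    x0, y0 = prev if prev is not None else (0.0, 0.0)
    l1, l2 = lam if prev is not None else (0.0, 0.0)
    w = np.full(len(k), 1.0 / len(k))                 # equal initial weights
    x, y = (x0, y0) if prev is not None else (0.0, float(np.mean(b)))
    for _ in range(n_iter):
        # M-step: minimize sum w_i (b_i + x k_i - y)^2
        #         + l1 (x - x0)^2 + l2 (y - y0)^2  (closed form, 2x2 system)
        A = np.array([[np.sum(w * k * k) + l1, -np.sum(w * k)],
                      [-np.sum(w * k),          np.sum(w) + l2]])
        rhs = np.array([-np.sum(w * k * b) + l1 * x0,
                        np.sum(w * b) + l2 * y0])
        x_new, y_new = np.linalg.solve(A, rhs)
        # E-step: down-weight points far from the fitted K-B line (outliers)
        r = b + x_new * k - y_new
        w = np.exp(-r * r / (2.0 * alpha * alpha))
        converged = abs(x_new - x) < tol and abs(y_new - y) < tol
        x, y = x_new, y_new
        if converged:
            break
    return x, y
```

On a set of lines that all pass through one point plus a couple of outliers, the outliers receive near-zero weight after the first iteration and the estimate snaps to the true intersection.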
Step 6, using the vanishing-point constraint to filter out part of the noise among the lines obtained in step 4.
In the process of removing noise with the vanishing-point constraint, the distance d_{t,i} between the vanishing point (x_{v,t}, y_{v,t}) and each line l_{t,i} is first computed. To allow for a certain error, a distance threshold r_e is set: if the distance from the vanishing point (x_{v,t}, y_{v,t}) to a line l_{t,i} exceeds the given threshold, i.e. d_{t,i} > r_e, the line is considered noise.
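The vanishing-point filter reduces to a point-to-line distance test: for a line y = kx + b, the distance from (x_v, y_v) is |k·x_v − y_v + b| / √(k² + 1). A minimal sketch, with illustrative names:

```python
import numpy as np

def filter_by_vanishing_point(lines, vp, r_e=10.0):
    """Keep only lines passing within r_e pixels of the vanishing point.

    lines: array of rows (k, b) for y = k x + b; vp: (x_v, y_v).
    """
    k, b = lines[:, 0], lines[:, 1]
    x_v, y_v = vp
    # perpendicular distance from (x_v, y_v) to each line
    d = np.abs(k * x_v - y_v + b) / np.sqrt(k * k + 1.0)
    return lines[d <= r_e]
```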
Step 7, building tracking gates to further filter noise from the lines, obtaining the candidate line sets for the left and right sides.
Determining the candidate lines with the tracking gates further comprises the following steps:
Step 71, building the left and right tracking gates, as shown in Fig. 4.
Each tracking gate is a trapezoid: the midpoint P1 of its top edge lies at the obtained vanishing point, and the slope of the line formed by the midpoint P2 of the bottom edge of the left tracking gate and P1 equals the slope of the left lane boundary detected in frame t-1.
Given the gate height h, the top-edge width w_u and the bottom-edge width w_d, the coordinates of the corners of the left tracking gate can be determined; the corners of the right tracking gate are computed in the same way.
Step 72, if the intersections of a line with the top and bottom edges of the corresponding tracking gate both lie between the two endpoints of the respective edge, the line is counted as a candidate line.
With the above method, the candidate line sets of the left and right sides can be determined respectively.
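The gate construction and the candidate test of steps 71 and 72 can be sketched as follows, in image coordinates with y growing downward; function names are illustrative, and a non-horizontal line (k ≠ 0) is assumed.

```python
def build_gate(vp, k_prev, h, w_u, w_d):
    """Trapezoidal tracking gate (sketch).

    Top-edge midpoint P1 sits at the vanishing point vp; the bottom-edge
    midpoint P2 lies on the line through P1 with the previous frame's lane
    boundary slope k_prev. h, w_u, w_d: gate height, top and bottom widths.
    """
    x_v, y_v = vp
    y_bot = y_v + h
    x_bot = x_v + h / k_prev            # dx = dy / slope along the P1-P2 line
    top = (x_v - w_u / 2, x_v + w_u / 2, y_v)
    bot = (x_bot - w_d / 2, x_bot + w_d / 2, y_bot)
    return top, bot

def in_gate(line, gate):
    """A line y = k x + b is a candidate if its intersections with the gate's
    top and bottom edges both fall between the edge endpoints."""
    k, b = line
    (tx1, tx2, ty), (bx1, bx2, by) = gate
    x_top = (ty - b) / k
    x_bot = (by - b) / k
    return tx1 <= x_top <= tx2 and bx1 <= x_bot <= bx2
```

A line coincident with the previous frame's boundary passes through both edge midpoints and is always accepted; lines with a similar slope but a large lateral offset miss the narrow top edge and are rejected.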
Step 8, computing the posterior probability of each candidate line in the candidate line sets, then selecting from the left and right candidate sets the pair of lines with maximum posterior probability to fit the left and right lane boundaries, finally obtaining the detected lane lines.
Computing the posterior probability of a candidate line further comprises the following steps:
Step 81, for each candidate line, computing its likelihood probability from the gray-level information in its neighborhood;
To reduce the computational load, sample points are drawn uniformly at random on the candidate line l_{t,i}, and the average gray level of the pixel blocks in the four directions around each sample point is computed, as shown in Fig. 5. In Fig. 5, the edge pixel filled with oblique lines is the sample point currently being verified, the pixel blocks in the bold boxes are the blocks to be verified, and the block in the lower-right box has the higher gray level. If the maximum average gray level among the four pixel blocks verified around a sample point m exceeds a given threshold, the point is counted as an evidence point. The ratio of evidence points in the collected sample point set S_{t,i} is taken as the likelihood probability of the candidate line, where II denotes the indicator function and N_s is the number of sample points in the set.
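A sketch of the likelihood of step 81 on a synthetic image. The sample count, block offset, block size and gray threshold are illustrative parameters; the evidence test, maximum block mean above a threshold, follows the reading of the description above.

```python
import numpy as np

def line_likelihood(img, k, b, n_samples=20, d=4, s=1, t_gray=100.0, seed=0):
    """Fraction of samples on y = k x + b whose neighborhood looks lane-like.

    Around each sample, four (2s+1)x(2s+1) blocks offset by d pixels in the
    four axis directions are averaged; the sample is an evidence point if the
    maximum block mean exceeds t_gray. Returns evidence count / n_samples.
    """
    rng = np.random.default_rng(seed)
    h, w = img.shape
    xs = rng.uniform(10, w - 10, n_samples)   # random samples, away from borders
    evidence = 0
    for x in xs:
        xi, yi = int(round(x)), int(round(k * x + b))
        best = 0.0
        for dx, dy in ((d, 0), (-d, 0), (0, d), (0, -d)):
            cy, cx = yi + dy, xi + dx
            if 0 <= cy - s and cy + s < h and 0 <= cx - s and cx + s < w:
                best = max(best, float(img[cy - s:cy + s + 1,
                                           cx - s:cx + s + 1].mean()))
        evidence += best > t_gray
    return evidence / n_samples
```

On an image with one bright diagonal band, a candidate line on the band scores near 1 while a parallel line in the dark region scores near 0, which is exactly the separation the posterior of step 83 exploits.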
Step 82, computing the prior probability of the candidate line l_{t,i} from the lane line L_{t-1} detected in frame t-1, where γ_k and γ_b are the standard deviations of the slope and the intercept respectively.
Step 83, computing the posterior probability of the candidate line l_{t,i} from the obtained likelihood probability and prior probability.
The posterior probability of a candidate line l_{t,i} is proportional to the product of its prior probability and its likelihood probability:
P(l_{t,i} | L_{t-1}, S_{t,i}) ∝ P(S_{t,i} | l_{t,i}) · P(l_{t,i} | L_{t-1})   (7)
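Combining steps 81 to 83: assuming a Gaussian prior on the change in (k, b) relative to frame t−1 (the text states only that γ_k and γ_b are the standard deviations of slope and intercept, so the Gaussian form is an assumption), the unnormalized posterior of equation (7) and the per-side selection can be sketched as:

```python
import numpy as np

def posterior(line, prev_line, likelihood, gamma_k=0.2, gamma_b=30.0):
    """Unnormalized posterior = assumed-Gaussian prior * likelihood (eq. 7)."""
    dk = line[0] - prev_line[0]
    db = line[1] - prev_line[1]
    prior = np.exp(-dk * dk / (2 * gamma_k ** 2)
                   - db * db / (2 * gamma_b ** 2))
    return prior * likelihood

def select_lane(cands_l, cands_r, prev_l, prev_r, lik_l, lik_r):
    """Pick the maximum-posterior candidate on each side (one pair total)."""
    best_l = max(range(len(cands_l)),
                 key=lambda i: posterior(cands_l[i], prev_l, lik_l[i]))
    best_r = max(range(len(cands_r)),
                 key=lambda i: posterior(cands_r[i], prev_r, lik_r[i]))
    return cands_l[best_l], cands_r[best_r]
```

The prior suppresses candidates whose slope or intercept jumps away from the previous frame, so a slightly noisier line that stays consistent with the tracked lane beats a cleaner but geometrically implausible one.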
Running the above steps in a loop enables continuous detection of the lane lines.
The specific embodiments described above further explain the objects, technical solutions and beneficial effects of the present invention. It should be understood that the above are merely specific embodiments of the present invention and are not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.
Claims (10)
1. A lane line detection method, characterized in that the method comprises the following steps:
Step 1, acquiring a road image I_t with a vehicle-mounted image sensor, where t is the frame number;
Step 2, pre-processing the road image to obtain a pre-processed road image I_{p,t};
Step 3, detecting the edge information in the pre-processed road image I_{p,t} to obtain an edge image I_{e,t} and the gradient direction of each edge point;
Step 4, detecting straight lines in the edge image I_{e,t} to obtain the line set {l_{t,i} | l_{t,i} = (k_{t,i}, b_{t,i}), i = 1, 2, ..., N}, where k and b are the slope and intercept of a line and N is the number of detected lines;
Step 5, estimating the position of the vanishing point in the road image from the lines obtained in step 4;
Step 6, using the vanishing-point constraint to filter out part of the noise among the lines obtained in step 4;
Step 7, building tracking gates to further filter noise from the lines, obtaining the candidate line sets for the left and right sides;
Step 8, computing the posterior probability of each candidate line in the candidate line sets, then selecting from the left and right candidate sets the pair of lines with maximum posterior probability to fit the left and right lane boundaries, finally obtaining the detected lane lines.
2. The method according to claim 1, characterized in that, in step 4, the progressive probabilistic Hough transform is used to detect the lines present in the edge image I_{e,t}.
3. The method according to claim 2, characterized in that, during edge-point voting for the lines, each edge point votes only for lines perpendicular or approximately perpendicular to its gradient direction.
4. The method according to claim 1, characterized in that, in step 5, the duality between line intersections in the image space X-Y and lines in the parameter space K-B is used to convert the vanishing point estimation problem in the image space X-Y into a parameter estimation problem in the parameter space K-B.
5. The method according to claim 4, characterized in that the duality between lines in the image space X-Y and points in the parameter space K-B is used to convert each obtained line l_{t,i} into a point (k_{t,i}, b_{t,i}) in the parameter space K-B; if the vanishing point lies at (x_v, y_v) in the image space X-Y, then the line l through the points (k_{t,i}, b_{t,i}) in the parameter space K-B can be expressed as b = -x_v k + y_v.
6. The method according to claim 4, characterized in that both the detected lines and the vanishing point position of the previous frame are used as constraints.
7. The method according to claim 4, characterized in that a weight w_{t,i} is used to measure the probability that a point (k_{t,i}, b_{t,i}) in the parameter space K-B is an inlier, the weights are regarded as hidden variables, and an expectation-maximization algorithm is adopted to estimate the vanishing point position iteratively.
8. The method according to claim 1, characterized in that, in step 6, the distance d_{t,i} between the vanishing point (x_{v,t}, y_{v,t}) and each line l_{t,i} is first computed; if this distance exceeds a given distance threshold r_e, the line is considered noise.
9. The method according to claim 1, characterized in that, in step 7, determining the candidate lines with the tracking gates further comprises the following steps:
Step 71, building the left and right tracking gates;
Step 72, if the intersections of a line with the top and bottom edges of the corresponding tracking gate both lie between the two endpoints of the respective edge, the line is counted as a candidate line.
10. The method according to claim 1, characterized in that, in step 8, the computation of the posterior probability of a candidate line further comprises the following steps:
Step 81, for each candidate line, computing its likelihood probability from the gray-level information in its neighborhood;
Step 82, computing the prior probability of the candidate line l_{t,i} from the lane marking L_{t-1} detected in frame t-1;
Step 83, computing the posterior probability of the candidate line l_{t,i} from the obtained likelihood probability and prior probability.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410065412.3A CN103839264B (en) | 2014-02-25 | 2014-02-25 | A kind of detection method of lane line |
Publications (2)
Publication Number | Publication Date |
---|---|
CN103839264A true CN103839264A (en) | 2014-06-04 |
CN103839264B CN103839264B (en) | 2016-09-14 |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104318258A (en) * | 2014-09-29 | 2015-01-28 | 南京邮电大学 | Time domain fuzzy and kalman filter-based lane detection method |
CN105426868A (en) * | 2015-12-10 | 2016-03-23 | 山东大学 | Lane detection method based on adaptive region of interest |
CN105912977A (en) * | 2016-03-31 | 2016-08-31 | 电子科技大学 | Lane line detection method based on point clustering |
CN106682586A (en) * | 2016-12-03 | 2017-05-17 | 北京联合大学 | Method for real-time lane line detection based on vision under complex lighting conditions |
CN106682563A (en) * | 2015-11-05 | 2017-05-17 | 腾讯科技(深圳)有限公司 | Lane line detection self-adaptive adjusting method and device |
CN106887004A (en) * | 2017-02-24 | 2017-06-23 | 电子科技大学 | A kind of method for detecting lane lines based on Block- matching |
CN107230212A (en) * | 2017-05-08 | 2017-10-03 | 武汉科技大学 | A kind of measuring method and system of the handset size of view-based access control model |
WO2017181658A1 (en) * | 2016-04-21 | 2017-10-26 | 深圳市元征科技股份有限公司 | Method and device for correcting straight direction for instructing vehicle traveling |
WO2018027500A1 (en) * | 2016-08-08 | 2018-02-15 | 深圳市锐明技术股份有限公司 | Lane line detection method and device |
CN109241929A (en) * | 2018-09-20 | 2019-01-18 | 北京海纳川汽车部件股份有限公司 | Method for detecting lane lines, device and the automatic driving vehicle of automatic driving vehicle |
CN109272536A (en) * | 2018-09-21 | 2019-01-25 | 浙江工商大学 | A kind of diatom vanishing point tracking based on Kalman filter |
CN109635816A (en) * | 2018-10-31 | 2019-04-16 | 百度在线网络技术(北京)有限公司 | Lane line generation method, device, equipment and storage medium |
CN110298845A (en) * | 2019-06-17 | 2019-10-01 | 中国计量大学 | It transmits electricity under a kind of complex background based on image procossing line detecting method |
CN110514163A (en) * | 2019-08-29 | 2019-11-29 | 广州小鹏汽车科技有限公司 | A kind of determination method and device of barrier profile, vehicle, storage medium |
CN110705444A (en) * | 2019-09-27 | 2020-01-17 | 四川长虹电器股份有限公司 | Lane tracking system and method |
WO2022051951A1 (en) * | 2020-09-09 | 2022-03-17 | 华为技术有限公司 | Lane line detection method, related device, and computer readable storage medium |
CN116681721A (en) * | 2023-06-07 | 2023-09-01 | 东南大学 | Linear track detection and tracking method based on vision |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004246641A (en) * | 2003-02-14 | 2004-09-02 | Nissan Motor Co Ltd | Traffic white lane line recognizing device |
CN103308056A (en) * | 2013-05-23 | 2013-09-18 | 中国科学院自动化研究所 | Road marking detection method |
CN103465906A (en) * | 2013-08-27 | 2013-12-25 | 东莞中国科学院云计算产业技术创新与育成中心 | Parking lot automatic parking implementation method based on immediacy sense |
-
2014
- 2014-02-25 CN CN201410065412.3A patent/CN103839264B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004246641A (en) * | 2003-02-14 | 2004-09-02 | Nissan Motor Co Ltd | Traffic white lane line recognizing device |
CN103308056A (en) * | 2013-05-23 | 2013-09-18 | 中国科学院自动化研究所 | Road marking detection method |
CN103465906A (en) * | 2013-08-27 | 2013-12-25 | 东莞中国科学院云计算产业技术创新与育成中心 | Parking lot automatic parking implementation method based on immediacy sense |
Non-Patent Citations (1)
Title |
---|
TAO SUN, SHUMING TANG AND ET AL.: "A Robust Lane Detection Method for Autonomous Car-like Robot", 《2013 FOURTH INTERNATIONAL CONFERENCE ON INTELLIGENT CONTROL AND INFORMATION PROCESSING,ICICIP》 * |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104318258A (en) * | 2014-09-29 | 2015-01-28 | 南京邮电大学 | Time domain fuzzy and kalman filter-based lane detection method |
CN104318258B (en) * | 2014-09-29 | 2017-05-24 | 南京邮电大学 | Time domain fuzzy and kalman filter-based lane detection method |
CN106682563A (en) * | 2015-11-05 | 2017-05-17 | 腾讯科技(深圳)有限公司 | Lane line detection self-adaptive adjusting method and device |
CN106682563B (en) * | 2015-11-05 | 2018-10-23 | 腾讯科技(深圳)有限公司 | A kind of lane detection self-adapting regulation method and device |
CN105426868A (en) * | 2015-12-10 | 2016-03-23 | 山东大学 | Lane detection method based on adaptive region of interest |
CN105426868B (en) * | 2015-12-10 | 2018-09-28 | 山东大学 | A kind of lane detection method based on adaptive area-of-interest |
CN105912977A (en) * | 2016-03-31 | 2016-08-31 | 电子科技大学 | Lane line detection method based on point clustering |
CN105912977B (en) * | 2016-03-31 | 2021-03-30 | 电子科技大学 | Lane line detection method based on point clustering |
WO2017181658A1 (en) * | 2016-04-21 | 2017-10-26 | 深圳市元征科技股份有限公司 | Method and device for correcting straight direction for instructing vehicle traveling |
WO2018027500A1 (en) * | 2016-08-08 | 2018-02-15 | 深圳市锐明技术股份有限公司 | Lane line detection method and device |
CN106682586A (en) * | 2016-12-03 | 2017-05-17 | 北京联合大学 | Method for real-time lane line detection based on vision under complex lighting conditions |
CN106887004A (en) * | 2017-02-24 | 2017-06-23 | 电子科技大学 | Lane line detection method based on block matching |
CN107230212A (en) * | 2017-05-08 | 2017-10-03 | 武汉科技大学 | Vision-based mobile phone size measuring method and system |
CN107230212B (en) * | 2017-05-08 | 2020-04-17 | 武汉科技大学 | Vision-based mobile phone size measuring method and system |
CN109241929A (en) * | 2018-09-20 | 2019-01-18 | 北京海纳川汽车部件股份有限公司 | Method for detecting lane lines, device and the automatic driving vehicle of automatic driving vehicle |
CN109272536A (en) * | 2018-09-21 | 2019-01-25 | 浙江工商大学 | Lane line vanishing point tracking method based on Kalman filtering |
CN109272536B (en) * | 2018-09-21 | 2021-11-09 | 浙江工商大学 | Lane line vanishing point tracking method based on Kalman filtering |
CN109635816A (en) * | 2018-10-31 | 2019-04-16 | 百度在线网络技术(北京)有限公司 | Lane line generation method, device, equipment and storage medium |
CN110298845A (en) * | 2019-06-17 | 2019-10-01 | 中国计量大学 | Power transmission line detection method based on image processing under complex background |
CN110514163A (en) * | 2019-08-29 | 2019-11-29 | 广州小鹏汽车科技有限公司 | Method and device for determining obstacle contour, vehicle, and storage medium |
CN110514163B (en) * | 2019-08-29 | 2021-06-01 | 广州小鹏自动驾驶科技有限公司 | Method and device for determining obstacle contour, vehicle and storage medium |
CN110705444A (en) * | 2019-09-27 | 2020-01-17 | 四川长虹电器股份有限公司 | Lane tracking system and method |
CN110705444B (en) * | 2019-09-27 | 2022-02-08 | 四川长虹电器股份有限公司 | Lane tracking system and method |
WO2022051951A1 (en) * | 2020-09-09 | 2022-03-17 | 华为技术有限公司 | Lane line detection method, related device, and computer readable storage medium |
CN116681721A (en) * | 2023-06-07 | 2023-09-01 | 东南大学 | Linear track detection and tracking method based on vision |
CN116681721B (en) * | 2023-06-07 | 2023-12-29 | 东南大学 | Linear track detection and tracking method based on vision |
Also Published As
Publication number | Publication date |
---|---|
CN103839264B (en) | 2016-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103839264A (en) | Detection method of lane line | |
CN104008645B (en) | Lane line prediction and early warning method applicable to urban roads |
CN104657727B (en) | Detection method of lane line |
CN102270301B (en) | Method for detecting unstructured road boundary by combining support vector machine (SVM) and laser radar | |
CN104992145B (en) | Square-sample lane tracking detection method |
CN103177246B (en) | Dual-model lane detection method based on dynamic block division |
CN104318258B (en) | Time-domain fuzzy and Kalman filter-based lane detection method |
CN111563412B (en) | Rapid lane line detection method based on parameter space voting and Bessel fitting | |
CN102073852B (en) | Multi-vehicle segmentation method based on optimal thresholds and random labeling |
CN103714538B (en) | Road edge detection method, device, and vehicle |
CN106446914A (en) | Road detection based on superpixels and convolutional neural network |
CN105069859B (en) | Vehicle running state monitoring method and device | |
CN103577809B (en) | Real-time detection method for ground traffic markings based on intelligent driving |
CN101976504B (en) | Multi-vehicle video tracking method based on color space information | |
CN104156731A (en) | License plate recognition system based on artificial neural network and method | |
CN105069415A (en) | Lane line detection method and device | |
CN104200485A (en) | Video-monitoring-oriented human body tracking method | |
CN103902985B (en) | High-robustness real-time lane detection algorithm based on ROI | |
CN102982304B (en) | Utilize polarized light image to detect the method and system of vehicle location | |
CN105224909A (en) | Lane line confirmation method in lane detection system | |
CN110378416A (en) | Vision-based road adhesion coefficient estimation method |
CN105426868A (en) | Lane detection method based on adaptive region of interest | |
CN105069441A (en) | Moving vehicle detection method based on background updating and particle swarm optimization algorithm | |
CN104077756A (en) | Direction filtering method based on lane line confidence | |
CN202134079U (en) | Unmanned vehicle lane marker line identification and alarm device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |