CN103206957A - Detecting and tracking method for lane lines of autonomous vehicle navigation - Google Patents

Detecting and tracking method for lane lines of autonomous vehicle navigation

Info

Publication number
CN103206957A
CN103206957A
Authority
CN
China
Prior art keywords
point
road
pixel
line
connection degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012105738234A
Other languages
Chinese (zh)
Other versions
CN103206957B (en)
Inventor
王相海
方玲玲
宋传鸣
秦钜鳌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Haimen Mao Mao Art Design Co., Ltd.
Original Assignee
Liaoning Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liaoning Normal University filed Critical Liaoning Normal University
Priority to CN201210573823.4A priority Critical patent/CN103206957B/en
Publication of CN103206957A publication Critical patent/CN103206957A/en
Application granted granted Critical
Publication of CN103206957B publication Critical patent/CN103206957B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a detecting and tracking method for lane lines in autonomous vehicle navigation. The method comprises: first, converting video frames into greyscale images, describing the tightness of the local connection of each point in the image through a fuzzy connection degree and a membership function, and extracting the general contour of the road; second, extracting edges with the Canny operator, thinning the image with a thinning algorithm, and detecting the control points of the road edges; and third, fitting the road edge points with non-uniform B-spline interpolation (NUBS), i.e. calculating the fitted points from the control points. Experimental results show that, because the method takes the grey level and gradient characteristics of the road into account, it reduces the interference of shadows and other noise and has relatively high accuracy; the probability of successfully detecting the lane lines is 92% over a test set of 50 images.

Description

Detecting and tracking method for lane lines of autonomous vehicle navigation
Technical field
The present invention relates to an autonomous navigation method for intelligent vehicles, and in particular to a lane line detection and tracking method for autonomous vehicle navigation that can handle arbitrarily curved roads.
Background technology
In recent years, with the rapid development of China's economy, urban populations and transportation have also grown quickly. This development has greatly facilitated personal contact and the exchange of goods, but it has also produced increasingly prominent negative effects, such as frequent traffic congestion, road blockages and traffic accidents, which trouble every large city more and more seriously. According to recent Chinese traffic survey reports, the mortality rate caused by traffic accidents in China is far higher than that of developed countries and some developing countries. In the face of these increasingly serious transportation demands, and considering limited resources, financial constraints, environmental protection pressure and the rapid progress of urbanization and technology, intelligent transportation systems (ITS, Intelligent Transportation Systems) have emerged.
On the whole, ITS considers the combination of three factors: the driver, the vehicle, and the road and its environment. In a broad sense, ITS covers the intelligent planning, design, implementation and operation of transportation systems; in a narrow sense, it mainly means improving traditional transportation systems by combining them with modern high technology, forming a new type of modern transportation system. Specifically, ITS integrates advanced electronic technology, information technology, sensing technology and systems engineering on top of relatively complete road infrastructure, so as to build a comprehensive traffic management system that works in real time, accurately and efficiently. It makes full use of the existing traffic infrastructure, relieves traffic congestion, improves transportation efficiency and guarantees traffic safety.
At present, traffic monitoring systems within ITS make ever wider use of computer vision and image processing techniques, and in the face of increasingly frequent traffic accidents, intelligent vehicle navigation systems have gradually developed as well. An intelligent vehicle navigation system converts the information the driver receives while driving into photoelectric image information, which is then processed by computer so that reasonable behaviour planning and decisions can be made in a complex environment. In such a system, lane line detection and tracking is a basic and important function. In addition, considering the structured characteristics of present-day roads (structured meaning that the road surface carries clear markings with strong geometric properties, such as double yellow or white lines, broken markings and continuous marking lines), lane line detection and tracking makes it easy to obtain lane information, ensures the accuracy of driving within or changing lanes, and greatly improves the reliability of the intelligent vehicle navigation system.
Existing lane line detection and tracking methods can be summarized into three classes: region-based methods, feature-based methods and model-based methods. Among them, model-based methods are the most common for lane line detection and tracking. These methods characterize the lane line with a geometric model, that is, they match the target image against a known model such as a straight line, a parabola or a serpentine curve. Within model-based methods, treating the lane line as a straight line is the most widely used approach, and the preferred technique for straight-line detection is the Hough transform, which is insensitive to noise and can handle images in which objects are partially occluded or covered. However, this method still has the following shortcomings: first, the Hough transform requires a large amount of memory, and once the parameter space grows, the amount of computation rises sharply while consuming enormous storage space; second, the huge amount of computation also makes detection too slow for real-time control; third, its precision is not high enough, so the expected information may not be detected, wrong judgments may be made instead, and a large amount of redundant data is produced; fourth, it requires a one-to-many mapping between image space and parameter space together with an exhaustive search, so that both its time and space complexity are very high, which increases the time needed for lane line detection.
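For context, the Hough-based baseline discussed above can be sketched with OpenCV's probabilistic Hough transform as follows; the Canny thresholds and Hough parameters used here are illustrative assumptions and are not taken from any of the methods mentioned.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Minimal Hough-based lane candidate extraction (illustrative baseline only).
    std::vector<cv::Vec4i> houghLaneCandidates(const cv::Mat& frame) {
        cv::Mat gray, edges;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::Canny(gray, edges, 50, 150);
        std::vector<cv::Vec4i> lines;
        // Every edge pixel votes in the (rho, theta) parameter space; memory use and
        // computation grow quickly with the resolution of that space, which is the
        // main drawback noted above.
        cv::HoughLinesP(edges, lines, 1, CV_PI / 180, 50, 30, 10);
        return lines;
    }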
Summary of the invention
The present invention is intended to solve the above technical problems of the prior art by providing a lane line detection and tracking method for autonomous vehicle navigation that can handle arbitrarily curved roads.
The technical solution of the present invention is: a lane line detection and tracking method for autonomous vehicle navigation, characterized in that it is carried out as follows:
Conventions: I denotes the video frame image being processed; m and n denote the number of rows and columns of the image, respectively; the array f(o, p) stores the fuzzy connection degree of each pixel p in image I with respect to the seed point o; the queue Q stores the pixels whose fuzzy connection degree is still to be computed; T is a relatively high value set for extracting road points and used for comparison; DELETE_ANGLE is a threshold used to accept or reject points according to the Vector Lane angle value;
a. Initialization
Acquire the video frame image and initialize the variables;
b. Fuzzy connection degree processing
b.1 Read the number of rows and columns of the image; set the fuzzy connection degree of every pixel in image I with respect to the seed point o to 0, i.e. f(o, p) = 0; set the fuzzy connection degree of the seed point itself to 1;
b.2 Put the neighbourhood points of o into the queue Q;
b.3 If the queue Q is empty, go to step b.6; otherwise, remove a pixel p from Q;
b.4 If f(o, p) is less than the threshold T set for extracting road points, compute the maximum path strength for each neighbour d of pixel p according to the corresponding formula (given as an image in the original and not reproduced here);
b.5 If the condition of the original formula holds (formula image not reproduced), update the fuzzy connection degree accordingly (formula image not reproduced), and insert into the queue Q every pixel q that satisfies the corresponding condition (formula image not reproduced); go to step b.3;
b.6 For every pixel p, if its fuzzy connection degree f(o, p) is greater than zero, p is judged to be a pixel on the road and its grey value is set to 1; otherwise, p is judged not to be a pixel on the road and its grey value is set to zero;
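As an illustration of step b, the following C++ sketch (using OpenCV) propagates a fuzzy connection degree from the seed point with a queue, in the spirit of steps b.1 to b.6. The Gaussian membership function on the grey-level difference, the 4-neighbourhood, the value of sigma, the default threshold T and the 0/255 output mask are assumptions made for this sketch; the patent gives its affinity and update formulas only as images.

    #include <opencv2/opencv.hpp>
    #include <queue>
    #include <cmath>
    #include <algorithm>

    // Assumed membership function: affinity of two neighbouring pixels, derived from
    // their grey-level similarity, in [0, 1].
    static double affinity(uchar a, uchar b, double sigma = 20.0) {
        double d = double(a) - double(b);
        return std::exp(-(d * d) / (2.0 * sigma * sigma));
    }

    // Returns a mask of the pixels judged to be road, i.e. f(o,p) > 0 (cf. steps b.1-b.6).
    cv::Mat fuzzyConnectedness(const cv::Mat& gray, cv::Point seed, double T = 0.6) {
        CV_Assert(gray.type() == CV_8UC1);
        cv::Rect bounds(0, 0, gray.cols, gray.rows);
        cv::Mat f = cv::Mat::zeros(gray.size(), CV_64FC1);   // b.1: f(o,p) = 0
        f.at<double>(seed) = 1.0;                            // seed has degree 1
        const cv::Point nb[4] = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };
        std::queue<cv::Point> Q;
        for (const auto& d : nb)                             // b.2: enqueue neighbours of o
            if (bounds.contains(seed + d)) Q.push(seed + d);
        while (!Q.empty()) {                                 // b.3: stop when Q is empty
            cv::Point p = Q.front(); Q.pop();
            if (f.at<double>(p) >= T) continue;              // b.4: only while f(o,p) < T
            for (const auto& d : nb) {
                cv::Point q = p + d;
                if (!bounds.contains(q)) continue;
                // Strength of the best path reaching p through q: the weakest link of
                // (path strength to q, affinity between q and p).
                double cand = std::min(f.at<double>(q),
                                       affinity(gray.at<uchar>(q), gray.at<uchar>(p)));
                if (cand > f.at<double>(p)) {                // b.5: a stronger path was found
                    f.at<double>(p) = cand;
                    for (const auto& d2 : nb)                // re-examine the neighbours
                        if (bounds.contains(p + d2)) Q.push(p + d2);
                }
            }
        }
        return f > 0;                                        // b.6: road pixels (255), others 0
    }

Because neighbours are re-enqueued only when a pixel's connection degree strictly increases, and the degrees are bounded by 1, the propagation terminates; the threshold T simply stops further relaxation of pixels that are already clearly on the road.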
c. Edge extraction and thinning
Convert image I into I' using the OpenCV built-in function cvCanny and a parallel thinning algorithm;
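A possible realization of step c is sketched below: OpenCV's Canny detector followed by Zhang-Suen parallel thinning. The Canny thresholds (50/150) and the choice of Zhang-Suen as the parallel thinning algorithm are assumptions of the sketch; the patent only names cvCanny and "a parallel thinning algorithm".

    #include <opencv2/opencv.hpp>

    // One Zhang-Suen sub-iteration over a 0/1 image; iter selects the two parallel passes.
    static void thinningIteration(cv::Mat& img, int iter) {
        cv::Mat marker = cv::Mat::zeros(img.size(), CV_8UC1);
        for (int y = 1; y < img.rows - 1; ++y) {
            for (int x = 1; x < img.cols - 1; ++x) {
                uchar p2 = img.at<uchar>(y - 1, x),     p3 = img.at<uchar>(y - 1, x + 1);
                uchar p4 = img.at<uchar>(y,     x + 1), p5 = img.at<uchar>(y + 1, x + 1);
                uchar p6 = img.at<uchar>(y + 1, x),     p7 = img.at<uchar>(y + 1, x - 1);
                uchar p8 = img.at<uchar>(y,     x - 1), p9 = img.at<uchar>(y - 1, x - 1);
                int A = (p2 == 0 && p3 == 1) + (p3 == 0 && p4 == 1) +
                        (p4 == 0 && p5 == 1) + (p5 == 0 && p6 == 1) +
                        (p6 == 0 && p7 == 1) + (p7 == 0 && p8 == 1) +
                        (p8 == 0 && p9 == 1) + (p9 == 0 && p2 == 1);
                int B = p2 + p3 + p4 + p5 + p6 + p7 + p8 + p9;
                int m1 = iter == 0 ? (p2 * p4 * p6) : (p2 * p4 * p8);
                int m2 = iter == 0 ? (p4 * p6 * p8) : (p2 * p6 * p8);
                if (A == 1 && B >= 2 && B <= 6 && m1 == 0 && m2 == 0)
                    marker.at<uchar>(y, x) = 1;
            }
        }
        img &= ~marker;   // delete all marked pixels in parallel
    }

    // Step c: Canny edges followed by parallel thinning, producing image I'.
    cv::Mat edgesAndThinning(const cv::Mat& gray) {
        cv::Mat edges;
        cv::Canny(gray, edges, 50, 150);          // thresholds are illustrative assumptions
        edges = edges / 255;                      // work on 0/1 values
        cv::Mat prev = cv::Mat::zeros(edges.size(), CV_8UC1), diff;
        do {                                      // iterate until the skeleton is stable
            thinningIteration(edges, 0);
            thinningIteration(edges, 1);
            cv::absdiff(edges, prev, diff);
            edges.copyTo(prev);
        } while (cv::countNonZero(diff) > 0);
        return edges * 255;                       // back to 0/255 so edge pixels read as 255
    }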
d. Vector Lane algorithm
d.1 Divide image I' into 4 parts using 3 horizontal lines;
d.2 Scan line by line with a scan line that moves from the bottom upwards and from the centre towards both sides, and find the two initial control points of the left and right road boundaries respectively (the point symbols are given as images in the original and are not reproduced here);
d.3 Construct the vectors of the left and right road boundaries respectively (symbol images not reproduced);
d.4 Calculate the curvature of the road boundary line according to the corresponding formula (formula image not reproduced);
d.5 Judge whether the resulting angle (symbol image not reproduced) is greater than the preset angle DELETE_ANGLE; if it is, this point is not a starting point of the road boundary, and scanning must continue until a starting point is found;
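One plausible reading of step d is sketched below: scan the thinned edge image I' from the bottom of the lowest of the four bands upwards, from the image centre towards one side, take the first edge hits on two consecutive rows as candidate initial control points, and accept them only if the angle of their connecting vector with respect to the vertical (used here as the curvature measure of step d.4) does not exceed DELETE_ANGLE. The angle-against-vertical interpretation, the 255 edge value and the exact scanning ranges are assumptions; the patent gives its vectors and curvature formula only as images.

    #include <opencv2/opencv.hpp>
    #include <cmath>
    #include <algorithm>

    // Angle in degrees between two 2-D vectors (assumed curvature measure of step d.4).
    static double vectorAngle(const cv::Point2d& a, const cv::Point2d& b) {
        double n = std::hypot(a.x, a.y) * std::hypot(b.x, b.y);
        if (n < 1e-9) return 0.0;
        double c = (a.x * b.x + a.y * b.y) / n;
        return std::acos(std::max(-1.0, std::min(1.0, c))) * 180.0 / CV_PI;
    }

    // Step d.2: scan one row from the centre outwards (dir = -1 left, +1 right) and
    // return the first edge pixel (value 255) found.
    static bool scanRow(const cv::Mat& edges, int y, int dir, cv::Point& hit) {
        for (int x = edges.cols / 2; x > 0 && x < edges.cols - 1; x += dir)
            if (edges.at<uchar>(y, x) == 255) { hit = cv::Point(x, y); return true; }
        return false;
    }

    // Find the two initial control points of one road boundary in the lowest of the four
    // horizontal bands (step d.1), rejecting pairs whose vector deviates from the vertical
    // by more than DELETE_ANGLE (step d.5).
    bool initialControlPoints(const cv::Mat& edges, int dir, double DELETE_ANGLE,
                              cv::Point& p0, cv::Point& p1) {
        for (int y = edges.rows - 1; y > edges.rows * 3 / 4; --y) {
            if (!scanRow(edges, y, dir, p0)) continue;
            if (!scanRow(edges, y - 1, dir, p1)) continue;
            cv::Point2d v(p1.x - p0.x, p1.y - p0.y);           // boundary vector, step d.3
            if (vectorAngle(v, cv::Point2d(0.0, -1.0)) <= DELETE_ANGLE)
                return true;                                    // accepted as starting points
        }
        return false;                                           // caller keeps scanning
    }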
e. Edge point extraction
e.1 Continue scanning upwards, from the midpoint towards both sides in turn, to find points whose pixel value is 255 (point symbols not reproduced), and compute the vector value with them (formula image not reproduced);
e.2 Then compute the difference between the curvature at the current moment and the curvature at the previous moment (formula images not reproduced);
e.3 If the resulting difference (symbol image not reproduced) is greater than DELETE_ANGLE, the candidate point (symbol image not reproduced) is not taken as a control point; continue scanning to the left or to the right, and when another pixel with value 255 is found, repeat the above calculation;
e.4 If more than 5 points have been examined on this line and none of them qualifies, continue scanning upwards until a qualifying point is found;
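Continuing the sketch above, step e can be read as follows: walk up the image row by row, test at most five 255-valued candidates per row, and accept a candidate as the next control point only if the angle between the new boundary vector and the previous one (taken here as the curvature difference of step e.2) stays within DELETE_ANGLE. Only the per-row limit of five candidates and the threshold test come from the patent text; the angle interpretation, scanning ranges and data types are assumptions of the sketch.

    #include <opencv2/opencv.hpp>
    #include <vector>
    #include <cmath>
    #include <algorithm>

    // Same angle helper as in the previous sketch (degrees between two vectors).
    static double angleBetween(const cv::Point2d& a, const cv::Point2d& b) {
        double n = std::hypot(a.x, a.y) * std::hypot(b.x, b.y);
        if (n < 1e-9) return 0.0;
        double c = (a.x * b.x + a.y * b.y) / n;
        return std::acos(std::max(-1.0, std::min(1.0, c))) * 180.0 / CV_PI;
    }

    // Collect further control points of one boundary above the initial points p0 and p1,
    // scanning upwards row by row (dir = -1 for the left boundary, +1 for the right).
    std::vector<cv::Point> collectControlPoints(const cv::Mat& edges, cv::Point p0,
                                                cv::Point p1, int dir, double DELETE_ANGLE) {
        std::vector<cv::Point> ctrl = { p0, p1 };
        cv::Point2d prevVec(p1.x - p0.x, p1.y - p0.y);
        for (int y = p1.y - 1; y > 0; --y) {
            int tested = 0;
            // Steps e.1/e.3: walk along the row, testing each 255-valued candidate;
            // step e.4: give up on this row after 5 candidates and move one row up.
            for (int x = edges.cols / 2; x > 0 && x < edges.cols - 1 && tested < 5; x += dir) {
                if (edges.at<uchar>(y, x) != 255) continue;
                ++tested;
                cv::Point2d v(x - ctrl.back().x, y - ctrl.back().y);
                if (angleBetween(v, prevVec) <= DELETE_ANGLE) {  // steps e.2/e.3
                    ctrl.push_back(cv::Point(x, y));
                    prevVec = v;
                    break;
                }
            }
        }
        return ctrl;
    }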
f. Fit the control points obtained above using the NUBS method;
g. Repeat steps b to f until all video frame images have been processed.
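Step f can be illustrated with de Boor's algorithm: the sketch below builds a clamped cubic B-spline whose knot spacing follows the chord lengths between the control points (hence non-uniform) and samples the resulting curve, with the detected control points used directly as the spline's control vertices. The degree of 3, the chord-length knot rule, the number of samples and the use of the control points as control vertices rather than as interpolation points are all assumptions of the sketch; the patent only states that the fitted points are computed from the control points.

    #include <opencv2/opencv.hpp>
    #include <vector>
    #include <cmath>

    // de Boor evaluation of a degree-p B-spline at parameter x, where k is the knot span
    // index such that t[k] <= x < t[k+1].
    static cv::Point2d deBoor(int k, double x, const std::vector<double>& t,
                              const std::vector<cv::Point2d>& c, int p) {
        std::vector<cv::Point2d> d(p + 1);
        for (int j = 0; j <= p; ++j) d[j] = c[j + k - p];
        for (int r = 1; r <= p; ++r)
            for (int j = p; j >= r; --j) {
                double a = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p]);
                d[j] = d[j - 1] * (1.0 - a) + d[j] * a;
            }
        return d[p];
    }

    // Sample a clamped cubic non-uniform B-spline defined by the detected control points.
    std::vector<cv::Point2d> fitNUBS(const std::vector<cv::Point>& ctrl, int samples = 100) {
        const int p = 3;                                     // cubic
        const int n = static_cast<int>(ctrl.size());
        std::vector<cv::Point2d> c(ctrl.begin(), ctrl.end());
        if (n < p + 1) return c;                             // too few points to fit
        // Chord-length parameters of the control points (make the knots non-uniform).
        std::vector<double> chord(n, 0.0);
        for (int i = 1; i < n; ++i)
            chord[i] = chord[i - 1] + std::hypot(c[i].x - c[i - 1].x, c[i].y - c[i - 1].y);
        double len = chord[n - 1];
        // Clamped knot vector: p+1 zeros, interior knots from the chord lengths, p+1 ones.
        std::vector<double> t(n + p + 1, 0.0);
        for (int i = 1; i < n - p; ++i) t[i + p] = chord[i + 1] / len;
        for (int i = n; i <= n + p; ++i) t[i] = 1.0;
        std::vector<cv::Point2d> curve;
        for (int s = 0; s <= samples; ++s) {
            double x = (static_cast<double>(s) / samples) * (1.0 - 1e-9);
            int k = p;
            while (k + 1 < n && t[k + 1] <= x) ++k;          // locate the knot span
            curve.push_back(deBoor(k, x, t, c, p));
        }
        return curve;
    }

Clamping the knot vector makes the sampled curve start at the first control point and end at the last one, which keeps the drawn lane boundary anchored to the detected road edge.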
The present invention first converts the video frame into a greyscale image, describes the tightness of the local connection of each point in the image by means of a fuzzy connection degree and a membership function, and extracts the general contour of the road. Second, edges are extracted with the Canny operator, the image is thinned with a thinning algorithm, and the road edge control points are then detected with vector lines. Finally, the road edge points are fitted with non-uniform B-spline interpolation (NUBS), i.e. the fitted points are computed from the control points. Experimental results show that, because the method takes both the grey level and the gradient features of the road into account, it reduces the interference of shadows and other noise and has a relatively high accuracy rate; in a test on 50 images the probability of successfully detecting the lane lines was 92%.
Compared with the prior art, the present invention has the following beneficial effects:
First, the fuzzy connection degree and the membership function describe the feature correlation between two points and the connectivity of a region from a global perspective. Performing the fuzzy connection processing on the road before edge detection allows richer lane line information to be detected and improves the quality of the selected control points. Moreover, the fuzzy connection degree can partly discard non-road areas in the distant part of the scene, which prevents distant image regions from being detected as road;
Second, besides making the image comparatively easy to analyse in different situations, extracting the skeleton of the image by thinning keeps the basic topological structure of the original shape unchanged while compressing the amount of raw image data;
Third, the vector line method can step over small breakpoints caused by tree shadows or other noise, and non-uniform B-spline interpolation fits road marking lines of different shapes well, so the method of the present invention is highly robust.
Description of drawings
Fig. 1 shows the lane line detection results of the embodiment of the invention in the presence of road surface shadows.
Fig. 2 shows the lane line detection results of the embodiment of the invention when an obstacle is present ahead.
Fig. 3 shows the lane line detection results of the embodiment of the invention on a left-hand bend.
Fig. 4 shows the lane line detection results of the embodiment of the invention in the presence of a small amount of shadow.
Fig. 5 shows the lane line detection results of the embodiment of the invention in the presence of a large amount of shadow.
Fig. 6 shows the frame-by-frame scan detection results for some of the lane line coordinates, curvature and accuracy rate.
Fig. 7 compares the lane line detection accuracy of the embodiment of the invention with that of the classical Vector Lane method.
Embodiment
The method of the present invention comprises four stages in total: fuzzy connection degree processing, edge detection and thinning, the Vector Lane algorithm, and non-uniform B-spline interpolation.
Conventions: I denotes the video frame image being processed; m and n denote the number of rows and columns of the image, respectively; the array f(o, p) stores the fuzzy connection degree of each pixel p in image I with respect to the seed point o; the queue Q stores the pixels whose fuzzy connection degree is still to be computed; T is a relatively high value set for extracting road points and used for comparison; DELETE_ANGLE is a threshold used to accept or reject points according to the Vector Lane angle value.
The concrete steps are as follows:
a. Initialization
Acquire the video frame image and initialize the variables;
b. Fuzzy connection degree processing
b.1 Read the number of rows and columns of the image; set the fuzzy connection degree of every pixel in image I with respect to the seed point o to 0, i.e. f(o, p) = 0; set the fuzzy connection degree of the seed point itself to 1;
b.2 Put the neighbourhood points of o into the queue Q;
b.3 If the queue Q is empty, go to step b.6; otherwise, remove a pixel p from Q;
b.4 If f(o, p) is less than the threshold T set for extracting road points, compute the maximum path strength for each neighbour d of pixel p according to the corresponding formula (formula image not reproduced);
b.5 If the condition of the original formula holds (formula image not reproduced), update the fuzzy connection degree accordingly and insert into the queue Q every pixel q that satisfies the corresponding condition (formula images not reproduced); go to step b.3;
b.6 For every pixel p, if its fuzzy connection degree f(o, p) is greater than zero, p is judged to be a pixel on the road and its grey value is set to 1; otherwise, p is judged not to be a pixel on the road and its grey value is set to zero;
c. Edge extraction and thinning
Convert image I into I' using the OpenCV built-in function cvCanny and a parallel thinning algorithm;
d. Vector Lane algorithm
d.1 Divide image I' into 4 parts using 3 horizontal lines;
d.2 Scan line by line with a scan line that moves from the bottom upwards and from the centre towards both sides, and find the two initial control points of the left and right road boundaries respectively (symbol images not reproduced);
d.3 Construct the vectors of the left and right road boundaries respectively (symbol images not reproduced);
d.4 Calculate the curvature of the road boundary line according to the corresponding formula (formula image not reproduced);
d.5 Judge whether the resulting angle (symbol image not reproduced) is greater than the preset angle DELETE_ANGLE; if it is, this point is not a starting point of the road boundary, and scanning must continue until a starting point is found;
e. Edge point extraction
e.1 Continue scanning upwards, from the midpoint towards both sides in turn, to find points whose pixel value is 255 (point symbols not reproduced), and compute the vector value with them (formula image not reproduced);
e.2 Then compute the difference between the curvature at the current moment and the curvature at the previous moment (formula images not reproduced);
e.3 If the resulting difference (symbol image not reproduced) is greater than DELETE_ANGLE, the candidate point (symbol image not reproduced) is not taken as a control point; continue scanning to the left or to the right, and when another pixel with value 255 is found, repeat the above calculation;
e.4 If more than 5 points have been examined on this line and none of them qualifies, continue scanning upwards until a qualifying point is found;
f. Fit the control points obtained above using the NUBS method;
g. Repeat steps b to f until all video frame images have been processed.
Experimental tests and parameter settings:
The experiments were carried out in the Visual C++ environment in combination with the OpenCV toolkit. All test images are colour images with a resolution of 256 × 240 pixels and can be downloaded from the following website: http://vasc.ri.cmu.edu//idb/html/road/may30_90/index.html.
Fig. 1 shows the lane line detection results of the embodiment of the invention when shadows are present on the road surface, where (a) is the fuzzy connection degree processing; (b) the control point selection; (c) the detection result.
Fig. 2 shows the lane line detection results of the embodiment of the invention when an obstacle is present ahead, where (a) is the fuzzy connection degree processing; (b) the control point selection; (c) the detection result.
Fig. 3 shows the detection results of the embodiment of the invention for a road with a left-hand bend, where (a) is the original image; (b) the fuzzy connection degree processing; (c) the control point selection; (d) the detection result.
Fig. 4 shows the detection results of the embodiment of the invention when a small amount of shadow from the surrounding scenery is present.
Fig. 5 shows the detection results of the embodiment of the invention when a large amount of scenery shadow is present in the scene.
Fig. 6 shows the frame-by-frame scan detection results for some of the lane line coordinates, curvature and accuracy rate.
Fig. 7 compares the lane line detection accuracy of the embodiment of the invention with that of the classical Vector Lane method.

Claims (1)

1. A lane line detection and tracking method for autonomous vehicle navigation, characterized in that it is carried out as follows:
Conventions: I denotes the video frame image being processed; m and n denote the number of rows and columns of the image, respectively; the array f(o, p) stores the fuzzy connection degree of each pixel p in image I with respect to the seed point o; the queue Q stores the pixels whose fuzzy connection degree is still to be computed; T is a relatively high value set for extracting road points and used for comparison; DELETE_ANGLE is a threshold used to accept or reject points according to the Vector Lane angle value;
a. Initialization
Acquire the video frame image and initialize the variables;
b. Fuzzy connection degree processing
b.1 Read the number of rows and columns of the image; set the fuzzy connection degree of every pixel in image I with respect to the seed point o to 0, i.e. f(o, p) = 0; set the fuzzy connection degree of the seed point itself to 1;
b.2 Put the neighbourhood points of o into the queue Q;
b.3 If the queue Q is empty, go to step b.6; otherwise, remove a pixel p from Q;
b.4 If f(o, p) is less than the threshold T set for extracting road points, compute the maximum path strength for each neighbour d of pixel p according to the corresponding formula (formula image not reproduced);
b.5 If the condition of the original formula holds (formula image not reproduced), update the fuzzy connection degree accordingly and insert into the queue Q every pixel q that satisfies the corresponding condition (formula images not reproduced); go to step b.3;
b.6 For every pixel p, if its fuzzy connection degree f(o, p) is greater than zero, p is judged to be a pixel on the road and its grey value is set to 1; otherwise, p is judged not to be a pixel on the road and its grey value is set to zero;
c. Edge extraction and thinning
Convert image I into I' using the OpenCV built-in function cvCanny and a parallel thinning algorithm;
d. Vector Lane algorithm
d.1 Divide image I' into 4 parts using 3 horizontal lines;
d.2 Scan line by line with a scan line that moves from the bottom upwards and from the centre towards both sides, and find the two initial control points of the left and right road boundaries respectively (symbol images not reproduced);
d.3 Construct the vectors of the left and right road boundaries respectively (symbol images not reproduced);
d.4 Calculate the curvature of the road boundary line according to the corresponding formula (formula image not reproduced);
d.5 Judge whether the resulting angle (symbol image not reproduced) is greater than the preset angle DELETE_ANGLE; if it is, this point is not a starting point of the road boundary, and scanning must continue until a starting point is found;
e. Edge point extraction
e.1 Continue scanning upwards, from the midpoint towards both sides in turn, to find points whose pixel value is 255 (point symbols not reproduced), and compute the vector value with them (formula image not reproduced);
e.2 Then compute the difference between the curvature at the current moment and the curvature at the previous moment (formula images not reproduced);
e.3 If the resulting difference (symbol image not reproduced) is greater than DELETE_ANGLE, the candidate point (symbol image not reproduced) is not taken as a control point; continue scanning to the left or to the right, and when another pixel with value 255 is found, repeat the above calculation;
e.4 If more than 5 points have been examined on this line and none of them qualifies, continue scanning upwards until a qualifying point is found;
f. Fit the control points obtained above using the NUBS method;
g. Repeat steps b to f until all video frame images have been processed.
CN201210573823.4A 2012-12-26 2012-12-26 The lane detection and tracking method of vehicular autonomous navigation Active CN103206957B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210573823.4A CN103206957B (en) 2012-12-26 2012-12-26 The lane detection and tracking method of vehicular autonomous navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210573823.4A CN103206957B (en) 2012-12-26 2012-12-26 The lane detection and tracking method of vehicular autonomous navigation

Publications (2)

Publication Number Publication Date
CN103206957A true CN103206957A (en) 2013-07-17
CN103206957B CN103206957B (en) 2016-03-02

Family

ID=48754255

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210573823.4A Active CN103206957B (en) 2012-12-26 2012-12-26 The lane detection and tracking method of vehicular autonomous navigation

Country Status (1)

Country Link
CN (1) CN103206957B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577828A (en) * 2013-11-22 2014-02-12 中国科学院自动化研究所 Road detection method based on edge feature
CN105320518A (en) * 2014-05-27 2016-02-10 伊莱比特汽车公司 Graphical representation of roads and routes using hardware tessellation
CN105823475A (en) * 2014-12-12 2016-08-03 泰勒斯公司 Method of three-dimensional representation of a scene
CN107180432A (en) * 2017-05-16 2017-09-19 重庆邮电大学 A kind of method and apparatus of navigation
CN107672593A (en) * 2017-08-26 2018-02-09 圣码智能科技(深圳)有限公司 Prevent vehicle from deviateing the method for navigation
CN107909047A (en) * 2017-11-28 2018-04-13 上海信耀电子有限公司 A kind of automobile and its lane detection method and system of application
US10311752B2 (en) * 2017-02-03 2019-06-04 Honeywell International Inc. Compressed edge map representation for image aided navigation
CN110171263A (en) * 2019-05-19 2019-08-27 瑞立集团瑞安汽车零部件有限公司 A kind of bend identification and overall height adjusting method for ECAS system
CN110617817A (en) * 2019-09-29 2019-12-27 百度在线网络技术(北京)有限公司 Navigation route determining method, device, equipment and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101470807A (en) * 2007-12-26 2009-07-01 河海大学常州校区 Accurate detection method for highroad lane marker line
CN101582168A (en) * 2009-06-16 2009-11-18 武汉大学 Matting sample set construction method based on fuzzy connectedness
JP2010170488A (en) * 2009-01-26 2010-08-05 Nissan Motor Co Ltd Lane recognition device, and lane recognition method
US20100259617A1 (en) * 2009-04-09 2010-10-14 Nippon Soken, Inc. Boundary line recognition apparatus
JP2012014398A (en) * 2010-06-30 2012-01-19 Fuji Heavy Ind Ltd White-line recognition apparatus for vehicle
CN102509067A (en) * 2011-09-22 2012-06-20 西北工业大学 Detection method for lane boundary and main vehicle position
CN102521589A (en) * 2011-11-18 2012-06-27 深圳市宝捷信科技有限公司 Method and system for detecting lane marked lines
CN102663744A (en) * 2012-03-22 2012-09-12 杭州电子科技大学 Complex road detection method under gradient point pair constraint
CN102722705A (en) * 2012-06-12 2012-10-10 武汉大学 Method for detecting multi-lane line on basis of random sample consensus (RANSAC) algorithm

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101470807A (en) * 2007-12-26 2009-07-01 河海大学常州校区 Accurate detection method for highroad lane marker line
JP2010170488A (en) * 2009-01-26 2010-08-05 Nissan Motor Co Ltd Lane recognition device, and lane recognition method
US20100259617A1 (en) * 2009-04-09 2010-10-14 Nippon Soken, Inc. Boundary line recognition apparatus
CN101582168A (en) * 2009-06-16 2009-11-18 武汉大学 Matting sample set construction method based on fuzzy connectedness
JP2012014398A (en) * 2010-06-30 2012-01-19 Fuji Heavy Ind Ltd White-line recognition apparatus for vehicle
CN102509067A (en) * 2011-09-22 2012-06-20 西北工业大学 Detection method for lane boundary and main vehicle position
CN102521589A (en) * 2011-11-18 2012-06-27 深圳市宝捷信科技有限公司 Method and system for detecting lane marked lines
CN102663744A (en) * 2012-03-22 2012-09-12 杭州电子科技大学 Complex road detection method under gradient point pair constraint
CN102722705A (en) * 2012-06-12 2012-10-10 武汉大学 Method for detecting multi-lane line on basis of random sample consensus (RANSAC) algorithm

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
郭磊: "基于点集优化和干扰点模糊化的车道线识别" [Lane line recognition based on point-set optimization and fuzzification of interference points], 《中国机械工程》 [China Mechanical Engineering], vol. 18, no. 15, 31 August 2007 (2007-08-31), pages 1872-1876 *
高德芝: "应用模糊逻辑的车道线检测方法" [Lane line detection method applying fuzzy logic], 《北京工业大学学报》 [Journal of Beijing University of Technology], vol. 37, no. 7, 31 July 2011 (2011-07-31), pages 972-977 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577828B (en) * 2013-11-22 2018-02-13 极限元(北京)智能科技股份有限公司 A kind of Approach for road detection based on edge feature
CN103577828A (en) * 2013-11-22 2014-02-12 中国科学院自动化研究所 Road detection method based on edge feature
CN105320518A (en) * 2014-05-27 2016-02-10 伊莱比特汽车公司 Graphical representation of roads and routes using hardware tessellation
CN105320518B (en) * 2014-05-27 2018-08-31 伊莱比特汽车公司 Figured method and apparatus and navigation device are carried out to section using inlaying
CN105823475B (en) * 2014-12-12 2020-10-27 泰勒斯公司 Three-dimensional representation method of scene
CN105823475A (en) * 2014-12-12 2016-08-03 泰勒斯公司 Method of three-dimensional representation of a scene
US10311752B2 (en) * 2017-02-03 2019-06-04 Honeywell International Inc. Compressed edge map representation for image aided navigation
CN107180432A (en) * 2017-05-16 2017-09-19 重庆邮电大学 A kind of method and apparatus of navigation
CN107672593A (en) * 2017-08-26 2018-02-09 圣码智能科技(深圳)有限公司 Prevent vehicle from deviateing the method for navigation
CN107909047A (en) * 2017-11-28 2018-04-13 上海信耀电子有限公司 A kind of automobile and its lane detection method and system of application
CN107909047B (en) * 2017-11-28 2021-07-06 上海信耀电子有限公司 Automobile and lane detection method and system applied to automobile
CN110171263A (en) * 2019-05-19 2019-08-27 瑞立集团瑞安汽车零部件有限公司 A kind of bend identification and overall height adjusting method for ECAS system
CN110617817A (en) * 2019-09-29 2019-12-27 百度在线网络技术(北京)有限公司 Navigation route determining method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN103206957B (en) 2016-03-02

Similar Documents

Publication Publication Date Title
CN103206957B (en) The lane detection and tracking method of vehicular autonomous navigation
US11854272B2 (en) Hazard detection from a camera in a scene with moving shadows
CN112714913B (en) Structure annotation
CN111582083B (en) Lane line detection method based on vanishing point estimation and semantic segmentation
CN106951879B (en) Multi-feature fusion vehicle detection method based on camera and millimeter wave radar
CN101929867B (en) Clear path detection using road model
CN104217427B (en) Lane line localization method in a kind of Traffic Surveillance Video
CN110738121A (en) front vehicle detection method and detection system
Kühnl et al. Monocular road segmentation using slow feature analysis
CN102592454A (en) Intersection vehicle movement parameter measuring method based on detection of vehicle side face and road intersection line
JP6150164B2 (en) Information detection apparatus, mobile device control system, mobile object, and information detection program
CN104700414A (en) Rapid distance-measuring method for pedestrian on road ahead on the basis of on-board binocular camera
CN103914688A (en) Urban road obstacle recognition system
US20130266186A1 (en) Top-down view classification in clear path detection
CN109753949B (en) Multi-window traffic sign detection method based on deep learning
CN107909047B (en) Automobile and lane detection method and system applied to automobile
CN105930833A (en) Vehicle tracking and segmenting method based on video monitoring
CN110379168A (en) A kind of vehicular traffic information acquisition method based on Mask R-CNN
CN104021368A (en) Method and system for estimating road height shape
CN111259796A (en) Lane line detection method based on image geometric features
CN104700071A (en) Method for extracting panorama road profile
CN103794050A (en) Real-time transport vehicle detecting and tracking method
CN109241855B (en) Intelligent vehicle travelable area detection method based on stereoscopic vision
CN113111707B (en) Front car detection and ranging method based on convolutional neural network
JP4956099B2 (en) Wall detector

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20180105

Address after: 430074, 3, 22, 09, 41, No. 1, modern and international design city, Optics Valley Avenue, East Lake New Technology Development Zone, Hubei, Wuhan

Patentee after: Wuhan Mai Liao Network Technology Co., Ltd.

Address before: 116000 the Yellow River Road, Shahekou District, Liaoning, No. 850, No.

Patentee before: Liaoning Normal University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20190624

Address after: 226112 Dadao Road, Dishiqiao, Samsung Town, Haimen City, Nantong City, Jiangsu Province, 137

Patentee after: Haimen Mao Mao Art Design Co., Ltd.

Address before: 430074 Hubei Optics Valley Wuhan New Technology Development Zone, Optics Valley 41, No. 3 modern, international design city, 3 buildings, 22 stories, 09 rooms.

Patentee before: Wuhan Mai Liao Network Technology Co., Ltd.