WO2005029440A1 - Road surface traveling lane detection device - Google Patents
Road surface traveling lane detection device
- Publication number
- WO2005029440A1 (PCT/JP2004/013802)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane
- line
- curve
- group
- segment
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/255—Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/457—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by analysing connectivity, e.g. edge linking, connected component analysis or slices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30256—Lane; Road marking
Definitions
- the present invention relates to a traveling lane detecting device on a road surface, and more particularly to a traveling lane detecting device for detecting a traveling lane from an image of a road surface in front of a vehicle continuously imaged.
- marking lines are painted on the road surface for various purposes, including lane boundary lines that identify the boundaries of driving lanes (lanes), and they take different forms such as solid lines, broken lines, or blocks. Marking lines of different colors, such as white or yellow, are also mixed, and combinations of these marking lines exist as well.
- FIG. 3 is an example of an image DS including a marking line on a two-lane road surface near a tunnel entrance.
- a white or yellow solid line is used as the lane boundary line LB indicating the left boundary of the driving lane DL and as the driving guide line LG.
- a white or yellow dashed marking line is used as the lane boundary line RB indicating the right boundary of the driving lane DL, and a white block-shaped marking line painted on the inside thereof is used as the driving guide line RG.
- the width of these marking lines is set to 20 cm; for the dashed marking line, the painted parts are 8 m long and the blank spaces between them are 12 m long.
- the width of the block-shaped marking line is set to 30 cm, with painted parts 2-3 m long and blank parts of 2-3 m between them.
- the terms lane boundary line and driving guide line denote the function of a marking, while a white or yellow line painted on the road surface is referred to as a lane mark.
- Patent Document 1 discloses a device configured as follows: marking lines drawn on the road surface are detected from the image captured by the camera, and from the detected lines a pair of marking lines that should constitute the white lines delimiting the driving lane is extracted. The interval between the pair of marking lines extracted as white lines is then detected.
- Patent Document 2 proposes, for the purpose of stably detecting a lane boundary, a lane boundary detection device configured as follows: a first contour information detecting means, whose sensitivity to spatial density changes in the image data is set relatively high, extracts first contour information from the image data;
- a second contour information detecting means, whose sensitivity to spatial density changes is set relatively low, extracts second contour information from the image data; and an outermost contour extracting means extracts the outermost contour information of the white line group from the first and second contour information.
- a lane boundary position is then set based on the outermost contour information.
- Patent Document 3 proposes a lane boundary detection device configured as follows for the same purpose as described above.
- the outermost contour extraction unit (reference numeral 15 in Patent Document 3; the same applies hereinafter) extracts the outermost contour information of the white line group
- based on contour data that includes the original image data stored in the frame buffer unit (13) and the position information of the edges detected by the edge detection unit (14).
- the outermost contour extraction unit (15) further determines, based on the contour data including the position information of the edges extracted from the original image data, whether an edge corresponds to a gap between the white lines constituting the white line group, and deletes the edges corresponding to such gaps from the contour data.
- Patent Document 4 likewise proposes, for the same purpose as described above, a lane boundary detection device configured as follows. The traveling lane of the moving object, including the lane markings in a predetermined area, is imaged by the imaging means to obtain image data. A density histogram is created based on the obtained image data, and groups of histogram values are detected and grouped. A first center position, the center of each histogram group, is then detected, and based on the first center positions, a second center position is detected as the center of the grouped histograms.
- the center of the lane mark or the lane mark group having a plurality of lane marks is detected, and the lane mark boundary position is determined.
- the document states that the creation of a histogram enables stable detection of lane mark boundary positions.
- Hough transform is widely known as a straight line detection method, and is described, for example, in Non-Patent Document 1 below.
- Such a Hough transform is known as a method of detecting a line that is robust to noise.
- the Hough transform maps each image point to a curve in a parameter space; the curves corresponding to points on a single straight line are characterized by crossing at one point in that space.
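As a rough illustration of how the Hough transform detects a line from edge points, the sketch below (a hypothetical helper `hough_lines`; the resolution parameters are illustrative, not from the patent) votes each point into a (θ, ρ) accumulator and returns the strongest bin.

```python
import numpy as np

def hough_lines(points, rho_res=1.0, theta_bins=180, rho_max=200.0):
    """Vote each (x, y) point into a (theta, rho) accumulator; the strongest
    bin corresponds to the line x*cos(theta) + y*sin(theta) = rho on which
    the most points lie."""
    thetas = np.linspace(0.0, np.pi, theta_bins, endpoint=False)
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((theta_bins, n_rho), dtype=int)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in points:
        rho = x * cos_t + y * sin_t              # rho for every theta at once
        idx = np.round((rho + rho_max) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[np.flatnonzero(ok), idx[ok]] += 1    # one vote per (theta, rho) cell
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], r * rho_res - rho_max      # strongest (theta, rho) pair
```

Because every point on one line votes for the same (θ, ρ) cell while noise votes are scattered, the accumulator peak is robust to outliers, which is the property the text refers to.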
- RANSAC (Random Sample Consensus), described in Non-Patent Documents 2 and 3 below, is likewise known as a fitting method robust to noise.
- Patent Document 1: Japanese Patent Application Laid-Open No. 2003-168198
- Patent Document 2: Japanese Patent Application Laid-Open No. 2003-187227
- Patent Document 3: Japanese Patent Application Laid-Open No. 2003-187252
- Patent Document 4: Japanese Patent Application Laid-Open No. 2003-178399
- Non-Patent Document 1: Hideyuki Tamura, "Introduction to Computer Image Processing", Soken Shuppan, March 10, 1985, 1st edition, pp. 127-128
- Non-Patent Document 2: Martin A. Fischler and Robert C. Bolles, "Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography", Communications of the ACM, vol. 24 (6), pp. 381-395, 1981
- Non-Patent Document 3: Richard Hartley and Andrew Zisserman, "Multiple View Geometry in Computer Vision", Cambridge University Press, August 2000, pp. 101-107
- Patent Document 1 states that when a plurality of adjacent marking lines is detected on at least one side as the lane boundary of a road, the pair of marking lines that best matches the interval between the previously detected pair is extracted as the white lines, on the assumption that the interval between the lane boundary lines on both sides is constant. Moreover, it is not easy to specify a reference line from among a plurality of marking lines, so further improvement is desired.
- in Patent Document 2, the outermost contour position is specified by lowering the sensitivity in the gaps between a plurality of marking lines, using two contour detection methods with different sensitivities to spatial density changes. Therefore, even if the contrast between the marking line and the gap is insufficient due to lighting conditions, or if the image whites out due to saturation, the position of the outermost contour can be specified stably; however, detecting the marking line at the true position of the lane boundary line remains extremely difficult.
- in Patent Document 3, when the edge interval is narrow and the density difference at both edge positions indicates a gap between a plurality of marking lines, that data is not adopted.
- the position of the outermost contour can thus be stably specified in the same way as described above, but detecting the marking line at the original lane boundary position is again difficult.
- in these approaches the detected lane width becomes smaller than the actual lane width by about 1 m, and in some cases smooth running control becomes difficult. Therefore, it is necessary to reliably distinguish the block-shaped marking line from the boundary of the driving lane.
- it is therefore an object of the present invention to provide a road surface traveling lane detection device that detects the traveling lane from continuously captured images of the road surface in front of a vehicle and can stably identify the position of the boundary of the traveling lane.
- the present invention provides a road surface traveling lane detection device that detects a traveling lane from images of a road surface captured continuously by an imaging means.
- the device comprises edge point detecting means for detecting a plurality of edge points in the image;
- segment group creating means which, for the plurality of edge points detected by the edge point detecting means, creates line segments based on the continuity of the distance and azimuth between the edge points, and creates a segment group by grouping a plurality of line segments having a predetermined relationship;
- curve detecting means for detecting a curve that fits the segment group created by the segment group creating means; and
- lane boundary line position specifying means which compares a plurality of curves distributed near the left and right lane boundaries, among the curves detected by the curve detecting means, with the segment groups created by the segment group creating means. When the segment group constituting the curve closest to the center of the traveling lane has a predetermined length and repetition period, that curve is specified as the innermost marking line, and the position of the curve adjacent to the outside of the innermost marking line with respect to the center of the traveling lane is specified as the position of the boundary line of the traveling lane.
- here, the plurality of line segments having a predetermined relationship refers to line segments that can be selected sequentially, for example, as lying within a predetermined distance and azimuth range with respect to a given line segment.
- the segment group creating means may be configured to treat, as one group, line segments that exist within a range of a predetermined distance and azimuth with respect to a given line segment among the plurality of line segments.
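The grouping criterion just described, in which line segments lying within a predetermined distance and azimuth of one another are chained into one group, might be sketched as follows. The thresholds `max_dist` and `max_dangle` and the helper names are illustrative assumptions, not values from the patent.

```python
import math

def group_segments(segments, max_dist=3.0, max_dangle=math.radians(10)):
    """segments: list of ((x1, y1), (x2, y2)) endpoints. Chain segments whose
    closest endpoints are within max_dist and whose orientations differ by at
    most max_dangle into groups, using a breadth-first flood fill."""
    def angle(s):
        (x1, y1), (x2, y2) = s
        return math.atan2(y2 - y1, x2 - x1) % math.pi   # undirected orientation

    def near(a, b):
        da = abs(angle(a) - angle(b))
        da = min(da, math.pi - da)                      # wrap-around difference
        gap = min(math.dist(p, q) for p in a for q in b)
        return gap <= max_dist and da <= max_dangle

    groups, seen = [], set()
    for i in range(len(segments)):
        if i in seen:
            continue
        stack, grp = [i], []                            # start a new group
        seen.add(i)
        while stack:
            j = stack.pop()
            grp.append(j)
            for k in range(len(segments)):
                if k not in seen and near(segments[j], segments[k]):
                    seen.add(k)
                    stack.append(k)
        groups.append(grp)
    return groups
```

With dashed or block markings, consecutive painted pieces chain into one group while markings in other lanes, beyond the distance threshold, form separate groups.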
- the segment group creating means may create the line segments from an edge point group based on the plurality of edge points detected by the edge point detecting means, using the continuity of the distance and azimuth between the edge points.
- within a line segment group based on the plurality of line segments, when another line segment exists within the range of the predetermined distance and azimuth set for a given line segment, the segment group creating means may determine that the above relationship is established and process the two as belonging to the same group.
- in another aspect, the present invention relates to a road surface traveling lane detection device that detects a traveling lane from an image of a road surface continuously captured by an imaging means, comprising edge point detecting means for detecting a plurality of edge points in the image;
- curve detecting means for detecting a curve that fits the plurality of edge points detected by the edge point detecting means;
- segment group creating means for creating a segment group by grouping the edge points that contribute to the configuration of the curve detected by the curve detecting means; and lane boundary position specifying means which compares a plurality of curves distributed near the left and right lane boundaries, among the curves detected by the curve detecting means, with the segment groups created by the segment group creating means,
- and which specifies the position of the curve adjacent to the outside of the innermost marking line with respect to the center of the traveling lane as the position of the boundary line of the traveling lane.
- the segment group creating unit creates an edge histogram for the edge point group contributing to the configuration of the curve detected by the curve detecting unit, and groups the edge point group contributing to the histogram peak.
- the edge histogram is a horizontal edge histogram created in the horizontal direction with respect to the vertical component of the edge point group.
- the edge point detecting means may be configured to detect the plurality of edge points on the image captured by the imaging means and then back-project the coordinate values of the plurality of edge points onto three-dimensional road surface coordinates, outputting the result as the plurality of edge points.
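The back-projection of image edge points onto road surface coordinates can be illustrated, under flat-road and pinhole-camera assumptions, by intersecting each pixel's viewing ray with the road plane. This is only a sketch: the patent does not specify the camera model, and the coordinate conventions and parameter names here are hypothetical.

```python
import numpy as np

def backproject_to_road(u, v, f, cx, cy, cam_h, pitch):
    """Intersect the viewing ray of pixel (u, v) with a flat road plane.
    f: focal length in pixels; (cx, cy): principal point; cam_h: camera
    height above the road; pitch: downward pitch angle in radians.
    Returns (X, Z): lateral offset and forward distance on the road,
    or None if the ray does not reach the road (at or above the horizon)."""
    # Ray direction in camera coordinates (x right, y down, z forward)
    d = np.array([(u - cx) / f, (v - cy) / f, 1.0])
    # Rotate about the x-axis by the pitch to align with the road frame
    c, s = np.cos(pitch), np.sin(pitch)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,  -s],
                  [0.0,   s,   c]])
    dw = R @ d
    if dw[1] <= 0:
        return None
    t = cam_h / dw[1]               # scale so the ray descends cam_h to the road
    return (t * dw[0], t * dw[2])   # (lateral X, forward Z)
```

With pitch 0 this reduces to the familiar flat-road relation Z = cam_h * f / (v - cy), so pixels nearer the bottom of the image map to nearer road points.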
- because it is configured as described above, the present invention has the following effects. When the segment group forming the curve closest to the center of the traveling lane has a predetermined length and repetition period, it is specified as the innermost marking line, and the position of the curve adjacent to the outside of the innermost marking line with respect to the center of the traveling lane is specified as the position of the boundary of the driving lane. A block-shaped marking line, whose segment group has a predetermined length and repetition period, is thereby clearly distinguished from the boundary of the driving lane and excluded, so the position of the boundary of the traveling lane can be specified stably.
- the segment group creating means can appropriately create a segment group by configuring as described above. Further, the edge point detection means can appropriately detect and process a plurality of edge points by having the above configuration.
- FIG. 1 is a block diagram showing the main configuration of a road surface traveling lane detection device according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing the hardware configuration of a road surface traveling lane detection device according to one embodiment of the present invention.
- FIG. 3 is a front view showing an example of an image captured in one embodiment of the present invention.
- FIG. 4 is a plan view showing a plurality of edge points projected on road surface coordinates in one embodiment of the present invention.
- FIG. 5 is a plan view showing a line segment projected on road surface coordinates in one embodiment of the present invention.
- FIG. 6 is a plan view showing an example of grouping of line segments in one embodiment of the present invention.
- FIG. 7 is a plan view of a lane mark projected on road surface coordinates in another embodiment of the present invention, and a graph showing a corresponding horizontal histogram.
- FIG. 1 shows an embodiment of a road surface traveling lane detection device, which is configured to continuously capture images of the road surface by an imaging means VD and to detect the traveling lane from the captured images.
- the device comprises an edge point detecting means ED for detecting a plurality of edge points from the contour lines in an image;
- a segment group creating means SD which, for the plurality of edge points detected by the edge point detecting means ED, creates line segments based on the continuity of the distance and azimuth between the edge points and groups a plurality of line segments having a predetermined relationship into a segment group;
- and a curve detecting means CD for detecting a curve that fits the segment group created by the segment group creating means SD. Then, the lane boundary position identifying means LD compares the plurality of curves distributed near the left and right lane boundaries, among the curves detected by the curve detecting means CD, with the segment groups created by the segment group creating means SD. When the segment group that forms the curve closest to the center of the traveling lane has a predetermined length and repetition period, it is specified as the innermost marking line, and the position of the curve adjacent to its outside is identified as the position of the boundary of the traveling lane.
- the road surface traveling lane detecting device of FIG. 1 has the hardware configuration shown in FIG. 2. For example, a CCD camera (hereinafter simply referred to as a camera) CM is mounted as the imaging means VD at the front of a vehicle (not shown), and the field of view in front of the vehicle, including the road surface, is continuously imaged. The video signal of the camera CM is A/D-converted via the video input buffer circuit VB and the sync separation circuit SY and stored in the frame memory FM. The image data stored in the frame memory FM is processed by the image processing unit VC.
- the image processing unit VC includes an image data control unit VP, an edge point detection unit EP, a segment group creation unit SP, a curve detection unit CP, and a lane boundary position identification unit LP.
- the edge point detection unit EP, segment group creation unit SP, curve detection unit CP, and lane boundary line position identification unit LP are respectively the edge point detection unit ED, segment group creation unit SD, curve detection unit CD, Corresponds to lane boundary position identification means LD.
- the data addressed by the image data control unit VP is read from the image data in the frame memory FM and sent to the edge point detection unit EP, where a plurality of edge points is detected.
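A minimal sketch of such edge point detection, assuming a simple horizontal intensity-gradient operator (the patent does not specify the operator, and the threshold is illustrative). Because lane marks are near-vertical in the image, their edges show up strongly in the x-derivative, and the gradient sign distinguishes the left (plus) edge of a white line from the right (minus) edge.

```python
import numpy as np

def detect_edge_points(img, thresh=50):
    """img: 2-D grayscale array. Returns (row, col, sign) triples where the
    horizontal central difference exceeds thresh: sign +1 for a dark-to-bright
    (left) edge of a white line, -1 for bright-to-dark (right) edge."""
    img = img.astype(np.int32)
    gx = img[:, 2:] - img[:, :-2]          # central difference in x
    ys, xs = np.nonzero(np.abs(gx) >= thresh)
    signs = np.sign(gx[ys, xs])
    # gx column j corresponds to image column j + 1
    return [(int(y), int(x) + 1, int(s)) for y, x, s in zip(ys, xs, signs)]
```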
- for the edge point data detected in this manner, line segments are created based on the continuity of the distance and azimuth between the edge points, and a plurality of line segments having a predetermined relationship is grouped to form a segment group. Further, a curve that fits the segment group created by the segment group creating means SD is detected by the curve detecting means CD.
- the lane boundary line position identification unit LP selects, from the curve data detected by the curve detection unit CP as described above, a plurality of curves distributed near the left and right lane boundaries.
- these curves are compared with the segment groups created by the segment group creation unit SP, and when the segment group that forms the curve closest to the center of the running lane has a predetermined length and repetition period, it is specified as the innermost marking line.
- the position of the curve adjacent to the outside of the innermost marking line with respect to the center of the traveling lane is specified as the position of the boundary of the traveling lane.
- the position of the boundary line of the driving lane specified in this way is supplied, together with detection results such as the driving lane width, the curvature of the road, the position relative to the own vehicle, and the attitude angle, to the system control computer SC as necessary, and is output to external system equipment (not shown) via the output interface circuit OU.
- CL, PW, and IN in FIG. 2 are a clock circuit, a power supply circuit, and an input interface circuit, respectively.
- in the edge point detection unit EP, as shown in FIG. 3, a plurality of edge points is detected from the image DS captured by the camera CM, and the points are back-projected from the image plane (not shown) onto three-dimensional road surface coordinates. That is, based on the plurality of edge points detected on the image plane and the parameters of the camera CM, the coordinate values of the plurality of edge points are back-projected as a point group in three-dimensional road surface coordinates, as shown in FIG. 4 (the line segments in FIG. 4 represent the edge points).
- when the white lines serving as lane marks (LB, LG, RB, RG in FIG. 3) are thinned or stained, or when adjacent parts of the white lines appear connected on the image due to the resolution of the camera CM,
- the edge point group may differ from the ideal one, as shown in the upper part of FIG. 4 compared with the lower part, but it is determined appropriately, without error, by the processing described later.
- curve fitting of a curve including a plurality of straight lines is applied to the plurality of edge points (represented by EGP in FIG. 4) back-projected onto the road surface, for example by the above-described RANSAC.
- the above-described Hough transform may be used, or for example, the least square method may be applied.
- the edge point group EGP may also be grouped based on predetermined attributes before curve fitting is applied.
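The RANSAC fitting mentioned above can be sketched for the simplest case of a single straight line. The iteration count and inlier tolerance are illustrative assumptions; in practice the consensus set would be refit (e.g. by least squares), as Non-Patent Document 2 describes.

```python
import random

def ransac_line(points, iters=200, tol=0.2, seed=0):
    """Fit a line to noisy edge points: repeatedly sample two points, count
    the points within tol of the implied line, and keep the largest such
    consensus (inlier) set."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        # Normalized implicit line a*x + b*y + c = 0 through the sample pair
        a, b = y2 - y1, x1 - x2
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = [p for p in points if abs(a * p[0] + b * p[1] + c) <= tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers   # refit on these (e.g. least squares) in practice
```

Because a minimal sample of two points suffices to define a line, gross outliers (edge points from stains, shadows, or other vehicles) simply fail to join the consensus set rather than biasing the fit, which is what makes RANSAC robust.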
- a line segment LS is created in the segment group creation unit SP based on the continuity of the distance and azimuth between the edge points of the edge point group EGP.
- as for grouping the line segments, if other line segments LS exist within the range of the distance and azimuth set for a certain line segment LS, those line segments are treated as belonging to the same group.
- groups are created as shown in FIG. 6 by repeating this process (the group on the inside toward the lane center is SGI and the group on the outside is SGO).
- for the edges, the plus edge (the left side of the white line, shown as LS(+) in FIG. 5) and the minus edge (the right side of the white line, LS(-) in FIG. 5) are selected.
- for the curve detected by the curve detection unit CP, it is verified from what attributes of line segments the curve is composed. For example, if the curve of group SGI in FIG. 6 is composed of multiple periodic short segments, it can be determined that the curve was fitted to a relatively short marking line such as a block-shaped line.
- when the line segments have a predetermined length and period in the vertical or horizontal direction, the lane boundary line position specifying unit LP determines the line to be a block-shaped marking line.
- the corresponding curve (e.g., RG in FIG. 3) is removed from the lane boundary candidates, and the curve outside the block-shaped marking line with respect to the center of the lane (RB in FIG. 3) is determined to be the boundary line.
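The length-and-period test used to separate a block-shaped marking line from a lane boundary candidate can be sketched as follows, using the nominal dimensions given earlier (painted parts of 2-3 m with 2-3 m gaps). The function name and tolerance ranges are hypothetical, not taken from the patent.

```python
def is_block_marking(group, len_range=(1.5, 3.5), period_range=(3.5, 6.5)):
    """group: list of (y_start, y_end) longitudinal extents of the painted
    segments along the lane, sorted by y_start (in meters). Classify as
    block-shaped when every painted piece is short (~2-3 m) and the
    start-to-start repetition period is ~4-6 m (paint + gap)."""
    if len(group) < 3:                      # need repetitions to call it periodic
        return False
    lengths = [y1 - y0 for y0, y1 in group]
    periods = [b[0] - a[0] for a, b in zip(group, group[1:])]
    return (all(len_range[0] <= l <= len_range[1] for l in lengths)
            and all(period_range[0] <= p <= period_range[1] for p in periods))
```

Under these ranges an ordinary dashed boundary line (8 m paint, 12 m gap) fails the length test, so only the short periodic block markings are excluded from the boundary candidates.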
- in the embodiment described above, the line segments LS are determined first, then grouped, and curve fitting is applied to the groups.
- alternatively, as indicated by the dashed arrow in FIG. 1,
- the embodiment may be such that a curve fitting the plurality of edge points is detected first, and a segment group is then created by grouping the edge points that contribute to the configuration of the curve. That is, in the image processing unit VC of FIG. 2, a curve that fits the plurality of edge points is detected by the curve detection unit CP, and the segment group creation unit SP creates a horizontal edge histogram for the vertical components of the edge point group contributing to the configuration of this curve, creating a segment group by grouping the edge points that contribute to each histogram peak.
- a horizontal edge histogram is created for a plurality of edge points back-projected on the road surface of the three-dimensional road surface coordinates, as indicated by HG in FIG.
- a plurality of peaks PK appears at positions containing many vertical line components, so the edge point group contributing to each histogram peak can be treated as one group.
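Grouping edge points by histogram peaks, as described above, might be sketched like this: histogram the points over the lateral coordinate and collect the points under each run of occupied bins as one group. The bin width and minimum count are illustrative assumptions.

```python
import numpy as np

def group_by_histogram(points, bin_w=0.3, min_count=5):
    """points: list of (x, y) road coordinates. Histogram the lateral (x)
    coordinate; each occupied run of bins marks a lane-mark column, and
    the indices of the points falling in that run form one group."""
    xs = np.array([p[0] for p in points])
    lo = xs.min()
    bins = ((xs - lo) / bin_w).astype(int)
    hist = np.bincount(bins)
    groups, in_peak = [], False
    for b, cnt in enumerate(hist):
        if cnt > 0:
            if not in_peak:                 # start of a new peak
                groups.append([])
                in_peak = True
            groups[-1].extend(int(i) for i in np.flatnonzero(bins == b))
        else:
            in_peak = False
    return [g for g in groups if len(g) >= min_count]
```

Since a lane mark contributes many vertically stacked edge points at nearly the same lateral position, each peak gathers the points of one mark even when the mark is broken into dashes or blocks.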
- when the repetition pattern of the histogram peaks indicates a block-shaped marking line, the lane boundary line position identification unit LP
- excludes that marking line (RG) from the lane boundary candidates and regards the marking line outside the block-shaped marking line (RG) with respect to the center of the lane, namely RB, as the lane boundary line.
- the marking lines indicating lane boundaries on the traveling road surface are not limited to simple solid or broken lines; multiple-line patterns formed by combining a simple marking line with a block-shaped marking line also occur.
- even so, the position of the lane boundary line can be stably specified, making it possible to recognize the boundary line with the high reliability expected by alarm and control systems.
- since the road surface traveling lane detecting device can stably specify the position of the boundary line of the traveling lane as described above, it can be applied, for example, to various warning and control systems in vehicles.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/572,956 US20070084655A1 (en) | 2003-09-24 | 2004-09-22 | Device for detecting a road traveling lane |
EP04787985A EP1667085A4 (en) | 2003-09-24 | 2004-09-22 | DEVICE FOR DETECTING A ROAD TRAFFIC PATH |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-331356 | 2003-09-24 | ||
JP2003331356A JP3956926B2 (ja) | 2003-09-24 | 2003-09-24 | 路面走行レーン検出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005029440A1 true WO2005029440A1 (ja) | 2005-03-31 |
Family
ID=34373039
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/013802 WO2005029440A1 (ja) | 2003-09-24 | 2004-09-22 | 路面走行レーン検出装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20070084655A1 (ja) |
EP (1) | EP1667085A4 (ja) |
JP (1) | JP3956926B2 (ja) |
KR (1) | KR100784307B1 (ja) |
CN (1) | CN100452093C (ja) |
WO (1) | WO2005029440A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103150905A (zh) * | 2013-02-06 | 2013-06-12 | 广州畅通智能交通科技有限公司 | 路侧安装波频检测器检测交通流的方法 |
CN103630122A (zh) * | 2013-10-15 | 2014-03-12 | 北京航天科工世纪卫星科技有限公司 | 一种单目视觉车道线检测方法及其测距方法 |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4365352B2 (ja) | 2005-07-06 | 2009-11-18 | 本田技研工業株式会社 | Vehicle and lane mark recognition device |
JP4905648B2 (ja) * | 2006-02-21 | 2012-03-28 | 学校法人東京理科大学 | Stimulus presentation device and stimulus presentation method for vehicles |
KR101035761B1 (ko) * | 2006-07-06 | 2011-05-20 | 포항공과대학교 산학협력단 | Image processing method and system for lane recognition |
JP2009237901A (ja) * | 2008-03-27 | 2009-10-15 | Zenrin Co Ltd | Road marking map generation method |
US8384776B2 (en) * | 2009-04-22 | 2013-02-26 | Toyota Motor Engineering And Manufacturing North America, Inc. | Detection of topological structure from sensor data with application to autonomous driving in semi-structured environments |
CN101567086B (zh) * | 2009-06-03 | 2014-01-08 | 北京中星微电子有限公司 | Lane line detection method and device |
JP5363921B2 (ja) * | 2009-08-31 | 2013-12-11 | 富士重工業株式会社 | White line recognition device for vehicles |
JP4968412B2 (ja) * | 2010-01-29 | 2012-07-04 | トヨタ自動車株式会社 | Road information detection device and vehicle travel control device |
JP5469509B2 (ja) * | 2010-03-31 | 2014-04-16 | パナソニック株式会社 | Lane position detection device and lane position detection method |
WO2012089261A1 (en) * | 2010-12-29 | 2012-07-05 | Tomtom Belgium Nv | Method of automatically extracting lane markings from road imagery |
JP5452518B2 (ja) * | 2011-02-09 | 2014-03-26 | 富士重工業株式会社 | White line recognition device for vehicles |
US9098751B2 (en) * | 2011-07-27 | 2015-08-04 | Gentex Corporation | System and method for periodic lane marker identification and tracking |
KR101295077B1 (ko) * | 2011-12-28 | 2013-08-08 | 전자부품연구원 | Lane detection and tracking device for various road conditions |
CN102968770A (zh) * | 2012-11-30 | 2013-03-13 | 华为技术有限公司 | Noise elimination method and device |
JP6169366B2 (ja) * | 2013-02-08 | 2017-07-26 | 株式会社メガチップス | Object detection device, program, and integrated circuit |
DE102014210411A1 (de) * | 2013-09-06 | 2015-03-12 | Robert Bosch Gmbh | Method and control and detection device for plausibility checking of wrong-way driving of a motor vehicle |
CN104648397B (zh) * | 2013-11-19 | 2017-05-17 | 沙漠科技股份有限公司 | Lane departure warning system and method |
JP6274936B2 (ja) * | 2014-03-25 | 2018-02-07 | ダイハツ工業株式会社 | Driving support device |
FR3019508B1 (fr) * | 2014-04-08 | 2017-12-08 | Alstom Transp Tech | Method for detecting the rails on which a railway vehicle is travelling |
US9321461B1 (en) | 2014-08-29 | 2016-04-26 | Google Inc. | Change detection using curve alignment |
JP6451858B2 (ja) * | 2015-08-04 | 2019-01-16 | 日産自動車株式会社 | Step detection device and step detection method |
KR101694347B1 (ko) * | 2015-08-31 | 2017-01-09 | 현대자동차주식회사 | Vehicle and lane recognition method |
JP6530685B2 (ja) * | 2015-09-15 | 2019-06-12 | 株式会社デンソーアイティーラボラトリ | Object detection device, object detection system, object detection method, and object detection program |
US10614321B2 (en) | 2016-03-24 | 2020-04-07 | Nissan Motor Co., Ltd. | Travel lane detection method and travel lane detection device |
EP3435353B1 (en) * | 2016-03-24 | 2022-03-02 | Nissan Motor Co., Ltd. | Travel path detection method and travel path detection device |
JP7024176B2 (ja) * | 2016-09-27 | 2022-02-24 | 日産自動車株式会社 | Travel path detection method and travel path detection device |
KR101910256B1 (ko) * | 2016-12-20 | 2018-10-22 | 전자부품연구원 | Lane detection method and system for camera-based road curvature estimation |
CN108335404B (zh) * | 2018-02-07 | 2020-09-15 | 深圳怡化电脑股份有限公司 | Edge fitting method and banknote validation device |
US10860868B2 (en) | 2018-04-18 | 2020-12-08 | Baidu Usa Llc | Lane post-processing in an autonomous driving vehicle |
US11869251B2 (en) * | 2018-07-02 | 2024-01-09 | Nissan Motor Co., Ltd. | Driving support method and driving support device |
JP2020085788A (ja) * | 2018-11-29 | 2020-06-04 | 太陽誘電株式会社 | Iron loss calculation method and calculation device |
CN112950740A (zh) * | 2019-12-10 | 2021-06-11 | 中交宇科(北京)空间信息技术有限公司 | Method, apparatus, device, and storage medium for generating road centerlines for high-precision maps |
CN113212352B (zh) * | 2020-02-06 | 2023-05-05 | 佛吉亚歌乐电子有限公司 | Image processing device and image processing method |
CN111611862B (zh) * | 2020-04-22 | 2022-09-09 | 浙江众合科技股份有限公司 | Semi-automatic metro track labeling method based on curve fitting |
KR102499334B1 (ko) * | 2021-06-28 | 2023-02-14 | (주)뷰런테크놀로지 | Method for detecting lanes using a LiDAR sensor and lane detection device performing the method |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08249597A (ja) * | 1995-03-15 | 1996-09-27 | Nissan Motor Co Ltd | Road shape detection device |
JPH1185999A (ja) * | 1997-09-13 | 1999-03-30 | Honda Motor Co Ltd | White line detection device for vehicles |
JP2003030626A (ja) * | 2001-07-18 | 2003-01-31 | Toshiba Corp | Image processing device and method |
JP2003228711A (ja) * | 2001-11-30 | 2003-08-15 | Hitachi Ltd | Lane mark recognition method |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5359666A (en) * | 1988-09-28 | 1994-10-25 | Honda Giken Kogyo Kabushiki Kaisha | Driving way judging device and method |
JP3169483B2 (ja) * | 1993-06-25 | 2001-05-28 | 富士通株式会社 | Road environment recognition device |
DE69635569T2 (de) * | 1995-04-25 | 2006-08-10 | Matsushita Electric Industrial Co., Ltd., Kadoma | Device for determining the local position of a car on a road |
JP3993259B2 (ja) * | 1996-07-31 | 2007-10-17 | アイシン精機株式会社 | Image processing device |
US5991427A (en) * | 1996-07-31 | 1999-11-23 | Aisin Seiki Kabushiki Kaisha | Method and apparatus for detecting a lane on a road |
JPH1067252A (ja) * | 1996-08-29 | 1998-03-10 | Aisin Seiki Co Ltd | Vehicle running state detection device |
JP3373773B2 (ja) * | 1998-01-27 | 2003-02-04 | 株式会社デンソー | Lane mark recognition device, vehicle travel control device, and recording medium |
JP3463858B2 (ja) * | 1998-08-27 | 2003-11-05 | 矢崎総業株式会社 | Surroundings monitoring device and method |
JP2001101415A (ja) * | 1999-09-29 | 2001-04-13 | Fujitsu Ten Ltd | Image recognition device and image processing device |
JP2001134769A (ja) * | 1999-11-04 | 2001-05-18 | Honda Motor Co Ltd | Object recognition device |
CN1351317A (zh) * | 2000-10-27 | 2002-05-29 | 新鼎系统股份有限公司 | Image detection system and method |
JP3635244B2 (ja) * | 2001-05-16 | 2005-04-06 | 富士通テン株式会社 | Curve R correction method and device |
JP3662218B2 (ja) * | 2001-12-18 | 2005-06-22 | アイシン精機株式会社 | Lane boundary detection device |
2003
- 2003-09-24 JP JP2003331356A patent/JP3956926B2/ja not_active Expired - Fee Related

2004
- 2004-09-22 WO PCT/JP2004/013802 patent/WO2005029440A1/ja active Application Filing
- 2004-09-22 KR KR1020067004529A patent/KR100784307B1/ko not_active IP Right Cessation
- 2004-09-22 US US10/572,956 patent/US20070084655A1/en not_active Abandoned
- 2004-09-22 EP EP04787985A patent/EP1667085A4/en not_active Withdrawn
- 2004-09-22 CN CNB200480023582XA patent/CN100452093C/zh not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
See also references of EP1667085A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103150905A (zh) * | 2013-02-06 | 2013-06-12 | 广州畅通智能交通科技有限公司 | Method for detecting traffic flow with roadside-mounted wave-frequency detectors |
CN103630122A (zh) * | 2013-10-15 | 2014-03-12 | 北京航天科工世纪卫星科技有限公司 | Monocular vision lane line detection method and distance measurement method |
CN103630122B (zh) * | 2013-10-15 | 2015-07-15 | 北京航天科工世纪卫星科技有限公司 | Monocular vision lane line detection method and distance measurement method |
Also Published As
Publication number | Publication date |
---|---|
EP1667085A4 (en) | 2007-02-07 |
CN100452093C (zh) | 2009-01-14 |
JP2005100000A (ja) | 2005-04-14 |
CN1836266A (zh) | 2006-09-20 |
KR100784307B1 (ko) | 2007-12-13 |
EP1667085A1 (en) | 2006-06-07 |
JP3956926B2 (ja) | 2007-08-08 |
KR20060057004A (ko) | 2006-05-25 |
US20070084655A1 (en) | 2007-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005029440A1 (ja) | Road surface travel lane detection device | |
KR100741739B1 (ko) | Road surface travel lane detection device |
JP4650079B2 (ja) | Object detection device and method |
JP2000357233A (ja) | Object recognition device |
JP2008168811A (ja) | Lane recognition device, vehicle, lane recognition method, and lane recognition program |
JP6021689B2 (ja) | Vehicle specification measurement processing device, vehicle specification measurement method, and program |
JP2005215985A (ja) | Travel lane determination program and recording medium therefor, travel lane determination device, and travel lane determination method |
KR102491527B1 (ko) | Detection of objects in camera images |
KR20200087354A (ko) | Data labeling device and method for autonomous driving |
JP2004038624A (ja) | Vehicle recognition method, vehicle recognition device, and vehicle recognition program |
JP3629935B2 (ja) | Speed measurement method for moving objects and speed measurement device using the method |
JP3879874B2 (ja) | Logistics measurement device |
JP3586938B2 (ja) | Vehicle-mounted distance measurement device |
JP2002008019A (ja) | Track recognition device and railway vehicle using the track recognition device |
JP2022074331A (ja) | State determination device, state determination system, and state determination method |
JP2005148784A (ja) | Road surface travel lane detection device |
CN118103881A (zh) | Method and device for determining a vehicle's own position |
JP2009187181A (ja) | Image processing device and image processing method for vehicles |
JP2006113738A (ja) | Object detection device and object detection method |
JP2003317105A (ja) | Travel path recognition device |
JPH01242916A (ja) | On-vehicle distance detection device |
JP2000182184A (ja) | On-vehicle antenna detection method and device |
JPH11167624A (ja) | White line recognition method for roads |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200480023582.X Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BW BY BZ CA CH CN CO CR CU CZ DK DM DZ EC EE EG ES FI GB GD GE GM HR HU ID IL IN IS KE KG KP KR LC LK LR LS LT LU LV MA MD MG MN MW MX MZ NA NI NO NZ OM PG PL PT RO RU SC SD SE SG SK SL SY TM TN TR TT TZ UA UG US UZ VC YU ZA ZM |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SZ TZ UG ZM ZW AM AZ BY KG MD RU TJ TM AT BE BG CH CY DE DK EE ES FI FR GB GR HU IE IT MC NL PL PT RO SE SI SK TR BF CF CG CI CM GA GN GQ GW ML MR SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2004787985 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020067004529 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2007084655 Country of ref document: US Ref document number: 10572956 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 1020067004529 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2004787985 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 10572956 Country of ref document: US |