WO2014167393A1 - Travel path detection apparatus and travel path detection method - Google Patents


Info

Publication number
WO2014167393A1
Authority
WO
WIPO (PCT)
Prior art keywords
travel path
reference point
point
edge point
vehicle
Prior art date
Application number
PCT/IB2014/000432
Other languages
French (fr)
Inventor
Akihiro Tsukada
Original Assignee
Toyota Jidosha Kabushiki Kaisha
Priority date
Filing date
Publication date
Application filed by Toyota Jidosha Kabushiki Kaisha filed Critical Toyota Jidosha Kabushiki Kaisha
Publication of WO2014167393A1 publication Critical patent/WO2014167393A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/12 Edge-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/48 Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30172 Centreline of tubular or elongated structure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T2207/30256 Lane; Road marking

Definitions

  • the invention relates to a travel path detection apparatus and a travel path detection method for detecting a travel path of a vehicle.
  • JP 2006-268199 A describes an image processing system that detects a lane mark on a travel path of a vehicle.
  • the image processing system sets a reference region corresponding to a road surface part other than the lane mark on a road surface image, creates a luminance histogram by measuring luminance frequencies in the reference region, and extracts a cluster part having a width, a height, or a surface area that equals or exceeds a threshold from the created histogram as a road surface cluster. Further, the image processing system detects all white line candidate edge points as primary white line candidate edge points, and detects those primary white line candidate edge points that overlap the reference region as secondary white line candidate edge points.
  • the image processing system detects only those secondary white line candidate edge points in which a value of a luminance parameter is not included in a luminance range of the road surface cluster as true white line edge points. By excluding the secondary white line candidate edge points included in the luminance range of the road surface cluster from the true white line edge points in this manner, a true lane mark is detected precisely, and erroneous detection of a lane mark is suppressed.
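The related-art filtering step described above can be illustrated with a small sketch (an illustration only, not the actual implementation of JP 2006-268199 A; the `min_count` threshold and the luminance values are assumptions):

```python
from collections import Counter

def road_surface_luminance_range(ref_region, min_count=50):
    """Histogram the luminance values of the reference region (road surface)
    and return the luminance range of the values whose frequency reaches
    min_count, standing in for the 'road surface cluster' of the related art."""
    counts = Counter(ref_region)
    peaked = [value for value, count in counts.items() if count >= min_count]
    return (min(peaked), max(peaked)) if peaked else None

# Secondary white line candidate edge points whose luminance falls inside
# this range would be excluded from the true white line edge points.
region = [60] * 400 + [70] * 300   # mostly-uniform road-surface luminance
print(road_surface_luminance_range(region))  # -> (60, 70)
```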
  • white line candidates are detected as a collection of white line candidate edge points on a line.
  • the lane mark is detected as a line on an image, and therefore, when an actual lane mark is a compound line or includes a diverging line, it may be impossible to determine which of the lines on the image is a desired white line.
  • a lane estimation result may be unstable.
  • white line candidates can no longer be detected, and therefore the lane estimation result may be unstable likewise in this case.
  • the invention therefore provides a travel path detection apparatus and a travel path detection method with which a travel path estimation result can be stabilized.
  • a first aspect of the invention is a travel path detection apparatus including: an edge detection unit configured to detect an edge point, which is a point at which a luminance of a road varies; a reference point calculation unit configured to calculate a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point detected on the road by the edge detection unit; a reference point position voting unit configured to vote for a position of the reference point calculated by the reference point calculation unit; and a travel path detection unit configured to detect information indicating the travel path of the vehicle on the basis of the position of the reference point voted for by the reference point position voting unit.
  • the reference point calculation unit calculates the reference point on the basis of the left side edge point and the right side edge point
  • the travel path detection unit detects the information indicating the travel path on the basis of the reference point position that has been voted for.
  • the travel path information is detected using the position of the reference point obtained from the left and right side edge points instead of detecting a white line candidate in the form of a line, as in the related art.
  • the travel path information is detected on the basis of voting results for reference point positions obtained from respective edge points detected from the compound lines on either side, and as a result, a travel path information detection result can be stabilized.
  • the reference point may be a midpoint between the left side edge point and the right side edge point.
  • the reference point calculation unit calculates the midpoint between the left side edge point and the right side edge point, and therefore information indicating a travel path close to the center of a lane can be detected.
  • the edge detection unit may be configured to detect a rising edge point at which the luminance increases when a luminance detection position is moved from one side to another side of left and right sides of a captured image of the road, and a falling edge point at which the luminance decreases when the luminance detection position is moved from the one side to the other side of the left and right sides of the captured image
  • the reference point calculation unit may be configured to calculate the reference point on the basis of a left side rising edge point and a right side falling edge point, and calculate the reference point on the basis of a left side falling edge point and a right side rising edge point.
  • reference points between rising edge points and falling edge points are calculated successively, and therefore the number of votes for the reference point positions can be increased.
  • the estimated reference point position can be brought closer to the position on the travel path, and as a result, the travel path information detection result can be stabilized even further.
  • the travel path detection apparatus may further include: a distance calculation unit configured to calculate a distance between the left side edge point and the right side edge point, and a pitching angle variation calculation unit configured to, when a difference between a lane width of the road and one of an average value and a median of the distance calculated by the distance calculation unit equals or exceeds a threshold, calculate a variation amount in a pitching angle of the vehicle using the lane width of the road and the one of the average value and the median.
  • the distance calculation unit may be configured to correct the pitching angle of the vehicle by the variation amount in the pitching angle, calculated by the pitching angle variation calculation unit, and then recalculate the distance between the left side edge point and the right side edge point before the reference point calculation unit calculates the reference point.
  • the edge point may be an edge point of a white line on the road.
  • the reference point position voting unit may be configured to vote for the position of the reference point by recording, as the candidate of the travel path of the vehicle, the position of the reference point calculated by the reference point calculation unit.
  • the travel path detection unit may be configured to detect the position of the reference point having the largest number of votes, among a plurality of the positions of the reference points voted for by the reference point position voting unit, as the information indicating the travel path of the vehicle.
  • a second aspect of the invention is a travel path detection method including: detecting an edge point, which is a point at which a luminance of a road varies; calculating a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point on the road; voting for a position of the calculated reference point; and detecting information indicating the travel path of the vehicle on the basis of the reference point position that has been voted for.
  • the reference point is calculated on the basis of the left side edge point and the right side edge point, the calculated reference point is voted for, and the travel path is detected on the basis of the reference point position that has been voted for.
  • the travel path is detected using the position of the reference point obtained from the left and right side edge points instead of detecting a white line candidate in the form of a line, as in the related art.
  • the travel path is detected on the basis of voting results for reference point positions obtained from respective edge points detected from the compound lines on either side, and as a result, the travel path detection result can be stabilized.
  • FIG. 1 is a block diagram showing a travel path detection apparatus according to a first embodiment of the invention
  • FIG. 2 is a perspective view showing a rising edge and a falling edge of a white line on an actual travel path
  • FIG. 3 is a view illustrating voting for a midpoint position
  • FIG. 4 is a perspective view showing a rising edge and a falling edge of a compound line
  • FIG. 5 is a view illustrating voting for a midpoint position on the compound line
  • FIG. 6 is a flowchart showing processing executed by the travel path detection apparatus of FIG. 1 to estimate the travel path;
  • FIG. 7 is a view illustrating a distance between edge points on a captured image
  • FIGS. 8A and 8B are views illustrating voting for a midpoint position in the case of a diverging line or a hidden white line;
  • FIG. 9 is a block diagram showing a travel path detection apparatus according to a second embodiment of the invention.
  • FIGS. 10A to 10C are histograms showing frequencies of a value of half a vehicle width on a captured image.
  • FIG. 11 is a flowchart showing processing executed by the travel path detection apparatus according to the second embodiment of the invention to estimate the travel path.
  • a travel path detection apparatus 1 detects a travel path as a vehicle travels.
  • the travel path detection apparatus 1 is used in lane keeping assist (LKA) or the like, for example, to forestall deviation of the vehicle from the travel path (a lane) by prompting a driver to pay attention or supporting a steering operation when the vehicle is about to deviate from the travel path.
  • the travel path detection apparatus 1 includes an edge detection unit 21 that detects an edge point, which is a point at which a luminance of a road varies, a midpoint calculation unit (a reference point calculation unit) 22 that calculates a midpoint between a left side edge point and a right side edge point as a reference point for vehicle travel, a midpoint position voting unit (a reference point position voting unit) 23 that votes for a calculated midpoint position, and a travel path detection unit 24 that detects the travel path of the vehicle on the basis of a midpoint position having the most votes, from among midpoint positions that have been voted for.
  • the travel path detection apparatus 1 further includes a camera 10 that captures images of the road during travel, an electronic control unit (ECU) 20 that performs image processing on an image captured by the camera 10 and detects the travel path, and an output unit 30 that issues a warning to the driver of the vehicle and assists a steering operation of the vehicle upon reception of a control signal from the ECU 20.
  • the camera 10 has a function for capturing a frontward image of the road along which the vehicle is traveling, and the captured image is output to the ECU 20.
  • the ECU 20 performs image processing on the captured image captured by the camera 10. More specifically, the ECU 20 converts the captured image captured by the camera 10 into a planar view image to detect the travel path.
  • the edge detection unit 21, midpoint calculation unit 22, midpoint position voting unit 23, and travel path detection unit 24 described above are provided in the ECU 20, for example.
  • the edge detection unit 21 detects an edge point from the captured image of the road captured by the camera 10. As shown in FIG. 2, when a white line H1 and a white line H2 exist in front of the vehicle on either side, the edge detection unit 21 detects rising edge points L1 and R1 and falling edge points L2 and R2 from the captured image.
  • the rising edge points L1 and R1 are edge points at which a luminance increases when a luminance detection position is moved from a left side to a right side on the captured image (in other words, edge points at which the luminance on the right side of the edge point is higher than the luminance on the left side), and the falling edge points L2 and R2 are edge points at which the luminance decreases when the luminance detection position is moved from the left side to the right side on the captured image (in other words, edge points at which the luminance on the right side of the edge point is lower than the luminance on the left side).
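The left-to-right luminance scan can be sketched as follows (a minimal illustration, not the patent's implementation; the step threshold and the toy luminance row are assumed values):

```python
def detect_edges(row, threshold=30):
    """Scan one horizontal line of luminance values from left to right and
    classify edge points. `row` is a list of pixel luminances; `threshold`
    is an assumed minimum luminance step for an edge.
    Returns (rising, falling) lists of column indices."""
    rising, falling = [], []
    for x in range(1, len(row)):
        diff = row[x] - row[x - 1]
        if diff >= threshold:
            rising.append(x)      # dark road -> bright white line
        elif diff <= -threshold:
            falling.append(x)     # bright white line -> dark road
    return rising, falling

# A toy row: road (40), white line (200), road, white line, road
row = [40] * 5 + [200] * 3 + [40] * 10 + [200] * 3 + [40] * 5
print(detect_edges(row))  # -> ([5, 18], [8, 21])
```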
  • the midpoint calculation unit 22 calculates, as a midpoint position, a position of a midpoint between a left side edge point (edge point on a left side of the captured image) and a right side edge point (edge point on a right side of the captured image) detected by the edge detection unit 21 in an identical horizontal position of the captured image. More specifically, as shown in FIG. 3, the midpoint calculation unit 22 calculates, on the captured image converted into a planar view image, a midpoint between the left side rising edge point L1 and the right side falling edge point R2 and a midpoint between the left side falling edge point L2 and the right side rising edge point R1.
  • the midpoint position voting unit 23 generates travel path candidate portions C1, C2 such as those shown in FIG. 3, for example, from the midpoint positions calculated by the midpoint calculation unit 22.
  • the travel path candidate portions C1, C2 are displayed to be darkest at the midpoint position calculated by the midpoint calculation unit 22 and become gradually lighter toward a periphery.
  • the midpoint position voting unit 23 generates a histogram showing frequencies of midpoint positions such as that shown in FIG. 3.
  • voting for a midpoint position means recording, as a travel path candidate, a midpoint position calculated by the midpoint calculation unit 22.
  • the travel path detection unit 24 detects the travel path of the vehicle on the basis of the midpoint position having the largest number of votes among the midpoint positions voted for by the midpoint position voting unit 23. More specifically, the travel path detection unit 24 detects, as travel path information, the midpoint position having the highest frequency on the histogram generated by the midpoint position voting unit 23, and outputs the detected travel path information to the output unit 30.
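The pairing and voting described above might look like this in outline (a sketch only: the pairing of left rising with right falling edges and left falling with right rising edges follows the text, while the pixel columns and toy coordinates are assumptions):

```python
from collections import Counter

def vote_midpoints(left_rising, left_falling, right_rising, right_falling):
    """Vote for midpoint columns between left/right edge points on one line,
    pairing left rising with right falling and left falling with right rising."""
    votes = Counter()
    for l in left_rising:
        for r in right_falling:
            votes[(l + r) // 2] += 1
    for l in left_falling:
        for r in right_rising:
            votes[(l + r) // 2] += 1
    return votes

# Accumulate votes over several lines, then take the histogram peak
# (the midpoint position with the largest number of votes) as the path.
total = Counter()
for lr, lf, rr, rf in [([5], [8], [18], [21]),
                       ([5], [8], [18], [21]),
                       ([4], [7], [19], [22])]:
    total += vote_midpoints(lr, lf, rr, rf)
path = total.most_common(1)[0][0]
print(path)  # -> 13
```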
  • the output unit 30 assists the driver in driving the vehicle upon reception of the travel path information output by the travel path detection unit 24. More specifically, for example, the output unit 30 may include a warning device that issues the driver with a warning using voice, image display, or the like when the vehicle deviates from the travel path, or may perform steering control to guide the vehicle to a more correct travel path by assisting steering of the vehicle when the vehicle deviates from the travel path.
  • the midpoint calculation unit 22 calculates the midpoint positions between the left side edge points L1, L2 and the right side edge points R1, R2, the midpoint position voting unit 23 votes for the calculated midpoint positions, and the travel path detection unit 24 detects the midpoint position having the largest number of votes among the midpoint positions that have been voted for as the travel path information.
  • a midpoint position between edge points on the left and right sides is detected as the travel path information instead of detecting a white line candidate in the form of a line, as in the related art.
  • in the case of a compound line such as that shown in FIG. 4, the edge detection unit 21 detects rising edge points L11, L13 and falling edge points L12, L14 on the left side, and rising edge points R11, R13 and falling edge points R12, R14 on the right side.
  • the midpoint calculation unit 22 then respectively calculates a midpoint position between the left side falling edge point L14 and the right side rising edge point R11, a midpoint position between the left side falling edge point L12 and the right side rising edge point R13, a midpoint position between the left side rising edge point L11 and the right side falling edge point R12, and a midpoint position between the left side rising edge point L11 and the right side falling edge point R14, and so on.
  • the midpoint position voting unit 23 generates travel path candidate portions C11 to C18 from the respective calculated midpoint positions.
  • the midpoint position voting unit 23 then generates a histogram such as that shown in FIG. 5 by voting for the midpoint positions repeatedly, whereupon the travel path detection unit 24 performs travel path detection in a similar manner to that described above and outputs the travel path information to the output unit 30.
  • processing shown in FIG. 6 is executed repeatedly by the ECU 20 at fixed time intervals, for example.
  • in step S11, the edge detection unit 21 extracts and labels edge points L1 to L4, R1 to R4, as shown in FIG. 7, on each line of an image captured by the camera 10 (edge detection step).
  • edge point extraction and labeling is performed on a line extending in a lateral direction of the road or the vehicle on the captured image, whereupon edge point extraction and labeling is performed on the next line in a different position in a longitudinal direction (an advancement direction) of the road or the vehicle.
  • labeling is processing for identifying an extracted edge point group by luminance variation (increasing or decreasing), height, distance, or the like.
  • the line direction may be set as a pixel arrangement direction in the lateral direction of the road or the vehicle on the detection subject image, for example.
  • a distance d (a distance d12, for example) between subject edge points (the left side edge point L1 and the right side edge point R2, for example) on a subject line (line 1 in FIG. 7, for example) is calculated.
  • values of the lower limit threshold Dmin and the upper limit threshold Dmax correspond to lower and upper limit values of an imaginable lane width. However, the values may be modified as appropriate.
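A hedged sketch of this threshold check on the edge-point spacing (the Dmin and Dmax pixel values here are assumptions):

```python
# Assumed pixel bounds for an imaginable lane width on the planar-view image.
D_MIN, D_MAX = 250, 450

def is_plausible_pair(left_x, right_x, d_min=D_MIN, d_max=D_MAX):
    """Gate an edge-point pair: keep it only if the spacing between the
    left and right edge points could plausibly be a lane width."""
    return d_min <= (right_x - left_x) <= d_max

print(is_plausible_pair(100, 450))  # spacing 350, within bounds -> True
print(is_plausible_pair(100, 140))  # e.g. two edges of the same line -> False
```

Pairs failing this gate would simply be skipped before midpoint calculation and voting.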
  • the processing advances to S14, where the midpoint calculation unit 22 calculates the midpoint positions between the edge points and the midpoint position voting unit 23 votes for the midpoint positions (reference point calculation step, reference point position voting step).
  • in S15, a determination is made as to whether or not the processing of S12 to S14 has been executed in relation to all of the edge points on the subject line.
  • the processing advances to S16, and when it is determined in S15 that the processing has not yet been executed on all of the edge points on the subject line, the subject edge points are changed, whereupon the processing of S12 to S14 is executed again.
  • in S16, a determination is made as to whether or not the processing of S12 to S14 has been executed in relation to all of the subject lines.
  • the processing advances to S18
  • the processing advances to S17.
  • processing is performed to change the subject line to the next line.
  • when line 1 in FIG. 7 is the subject line, for example, the subject line is changed to line 2 positioned below line 1.
  • in S18, processing is executed to detect the midpoint position having the largest number of votes as the travel path information. More specifically, the travel path detection unit 24 executes processing to detect the travel path on the basis of the voting results relating to the midpoint positions voted for in S14 by the midpoint position voting unit 23 (travel path detection step). After the travel path detection unit 24 has detected the travel path in this manner, the travel path information is output to the output unit 30, whereupon the series of processes is terminated.
  • the travel path detection processing of the travel path detection apparatus 1 is performed as described above.
  • the midpoint positions between the rising edge points and the falling edge points are calculated successively, and therefore the number of midpoint position votes given by the midpoint position voting unit 23 can be increased. Accordingly, the detected midpoint position can be brought closer to the actual lane center, and as a result, the travel path detection result can be made more stable. Further, in the travel path detection apparatus 1, the travel path is detected by voting for midpoint positions, and therefore travel path detection can be performed with stability even when the lane mark is a diverging line or a part of the lane mark is hidden.
  • the travel path detection method includes: the edge detection step for detecting edge points, which are points at which the luminance of the road varies; the reference point calculation step for calculating a reference point as a candidate of the travel path of the vehicle on the basis of a left side edge point and a right side edge point detected on the road in the edge detection step; the reference point position voting step for voting for the reference point positions calculated in the reference point calculation step; and the travel path detection step for detecting the travel path of the vehicle on the basis of the reference point positions voted for in the reference point position voting step.
  • the travel path is detected using reference point positions obtained from left and right side edge points rather than detecting white line candidates in the form of lines, as in the related art.
  • the travel path detection result can be stabilized.
  • edge points on the diverging white line H23 can be excluded from the processing subjects, and therefore the appropriate travel path candidate portions C21 to C27 can be detected without being affected by the white line H23.
  • in the travel path detection apparatus 1, the travel path is detected by voting for midpoint positions, and as a result, the travel path detection result can be stabilized even when an image of the diverging white line H23 is captured.
  • a situation in which a midpoint position detection result deviates greatly from the lane center when a part of a white line is hidden can be avoided, and as a result, the travel path detection result can be stabilized.
  • the travel path is detected using voting by the midpoint position voting unit 23, greater noise resistance than that of the related art is obtained, and therefore the travel path can be detected with a high degree of precision even in a noisy environment such as in rain or at night.
  • the travel path detection apparatus 101 according to the second embodiment calculates an amount of variation in a pitching angle of the vehicle, and calculates the distance between the edge points after correcting the pitching angle by the variation amount.
  • the travel path detection apparatus 101 according to the second embodiment differs from the first embodiment in using an ECU 120, to which a distance calculation unit 125 and a pitching angle variation calculation unit 126 have been added, instead of the ECU 20 according to the first embodiment, but is otherwise configured similarly to the first embodiment. Accordingly, the following description focuses on the ECU 120 according to the second embodiment, and duplicate description of identical parts to the first embodiment has been omitted.
  • the distance calculation unit 125 calculates a distance between an edge point positioned on the left side of the captured image and an edge point positioned on the right side of the captured image. More specifically, the distance calculation unit 125 calculates a distance d12 between a left side rising edge point L1 and a right side falling edge point R2 shown in FIG. 7, for example, and calculates distances between edge points in this manner in relation to each edge point and each line.
  • the pitching angle variation calculation unit 126 generates histograms such as those shown in FIGS. 10A to 10C from the distances between the edge points calculated by the distance calculation unit 125 in relation to the respective edge points and lines.
  • FIG. 10A shows an example of a value of half a vehicle width in a case where pitching does not occur in the vehicle.
  • FIG. 10B shows an example of the value of half the vehicle width in a case where pitching occurs in the vehicle.
  • FIG. 10C shows an example in which the pitching angle is corrected by shifting the histogram of FIG. 10B rightward.
  • an average value X1 of a value obtained by halving the vehicle width when pitching does not occur differs from an average value X2 obtained by halving the vehicle width when pitching occurs.
  • the value of the vehicle width varies on the captured image.
  • the precision of travel path detection during pitching is improved by calculating the amount of variation in the pitching angle, correcting the pitching angle by the variation amount, and then recalculating the distance between the edge points.
  • an average value X3 of the frequency approaches the average value X1 of FIG. 10A, and therefore the travel path can be detected with a similar degree of precision to that of a case in which pitching does not occur.
  • the pitching angle variation calculation unit 126 calculates a variation amount ΔP in the pitching angle on the basis of Equation (1), for example.
  • the variation amount ΔP in the pitching angle is calculated using Equation (1), for example, whereupon the distance d between the edge points is recalculated using a sum of the predetermined pitching angle P0 and the variation amount ΔP as a value of the pitching angle P.
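Equation (1) itself is not reproduced in this excerpt, so the following sketch substitutes an assumed first-order model in which the pitching angle variation is proportional to the gap between the known lane width and the measured average edge-point distance (the constant k, the threshold, and all numeric values are assumptions, not the patent's formula):

```python
def pitch_correction(d_av, d_lane, k=1e-4, threshold=20):
    """Hedged stand-in for Equation (1): return an assumed first-order
    pitching angle variation. k is an assumed camera-calibration constant
    mapping a pixel-width gap to radians; threshold is the minimum gap
    (in pixels) at which a correction is applied."""
    gap = d_lane - d_av
    if abs(gap) < threshold:
        return 0.0            # spacing already consistent; no correction
    return k * gap            # delta_P, to be added to the preset pitch P0

P0 = 0.02                      # preset pitch angle [rad], assumed
delta_P = pitch_correction(d_av=310.0, d_lane=350.0)
P = P0 + delta_P               # corrected pitch used to recompute distances
print(round(P, 4))  # -> 0.024
```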
  • in S27 to S32, similar processing to that of S13 to S18 in the first embodiment, shown in FIG. 6, is performed, and once the travel path information has been output to the output unit 30, the series of processes is terminated.
  • the pitching angle variation calculation unit 126 calculates the variation amount ΔP in the pitching angle using the lane width dlane and the one of the average value dav and the median.
  • the distance calculation unit 125 corrects the value of the pitching angle P by the variation amount ΔP and then recalculates the distance d between the edge points before the midpoint calculation unit 22 calculates the midpoint.
  • the travel path detection apparatus 101 according to the second embodiment is configured identically to the travel path detection apparatus 1 according to the first embodiment, and therefore identical effects to those of the travel path detection apparatus 1 according to the first embodiment are also obtained.
  • in the above embodiments, the midpoint calculation unit 22 calculates a midpoint between a left side edge point and a right side edge point and the midpoint position voting unit 23 votes for a midpoint position, but a point other than the midpoint may be used as the reference point.
  • the reference point position may be offset by a predetermined amount to the left side or the right side, for example.
  • the vehicle can be caused to travel on the left side or the right side within the lane.
  • the reference point position can be set such that when the vehicle travels around a curve, the vehicle is caused to travel on an inner side of the curve within the lane.
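The offset variation described above can be sketched as follows (a hypothetical helper; the offset parameter and pixel values are assumptions):

```python
def reference_point(left_x, right_x, offset=0):
    """Reference point as the midpoint of the left and right edge points,
    with an optional lateral offset in pixels (e.g. negative to bias the
    vehicle toward the left side of the lane, or toward the inner side
    of a curve, per the variations described in the text)."""
    return (left_x + right_x) // 2 + offset

print(reference_point(100, 300))              # plain midpoint -> 200
print(reference_point(100, 300, offset=-20))  # biased to the left -> 180
```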
  • the edge detection unit 21 detects the edge points from the captured image captured by the camera 10, whereupon the midpoint position between the detected edge points is calculated.
  • the reference point position need not be calculated from a captured image. More specifically, an infrared sensor or a radar apparatus using electromagnetic waves may be used instead of a captured image. In this case, for example, such an apparatus may detect a three-dimensional point as a point at which the luminance of the road varies, and the reference point may be calculated from the detected three-dimensional point.
  • edge points of a white line were detected, but edge points of a curbstone, for example, may be detected instead.
  • the edge point detection subject is not limited to that described in the above embodiments.
  • edge points are detected from both the left side and the right side of the captured image captured by the camera 10, but in a case where the information indicating the travel path along which the vehicle is traveling is obtained, edge points may be detected from either one of the left side and the right side of the captured image, and the reference point position and the travel path can likewise be detected in this case.
  • edge points are detected from both the left side and the right side of a road having a single lane, but the subject road is not limited to a single lane. More specifically, edge points may be detected from a three-lane road by modifying the threshold of the subject lane width, for example. In this case, when one white line is hidden, the reference point position can be calculated from a white line on an outer side or an inner side thereof, and therefore the travel path can be detected with a high degree of precision even when a white line is hidden.
  • an edge point at which the luminance increases when the luminance detection position is moved from the left side to the right side on the captured image is set as the rising edge point
  • an edge point at which the luminance decreases when the luminance detection position is moved from the left side to the right side on the captured image is set as the falling edge point
  • an edge point at which the luminance increases when the luminance detection position is moved from the right side to the left side on the captured image may be set as the rising edge point
  • an edge point at which the luminance decreases when the luminance detection position is moved from the right side to the left side on the captured image may be set as the falling edge point.

Abstract

A travel path detection apparatus includes: an edge detection unit (21) configured to detect an edge point, which is a point at which a luminance of a road varies; a reference point calculation unit (22) configured to calculate a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point detected on the road by the edge detection unit (21); a reference point position voting unit (23) configured to vote for a position of the reference point calculated by the reference point calculation unit (22); and a travel path detection unit (24) configured to detect information indicating the travel path of the vehicle on the basis of the reference point position voted for by the reference point position voting unit (23).

Description

TRAVEL PATH DETECTION APPARATUS AND TRAVEL PATH DETECTION
METHOD
BACKGROUND OF THE INVENTION
1. Field of the Invention
[0001] The invention relates to a travel path detection apparatus and a travel path detection method for detecting a travel path of a vehicle.
2. Description of Related Art
[0002] Japanese Patent Application Publication No. 2006-268199 (JP 2006-268199 A) describes an image processing system that detects a lane mark on a travel path of a vehicle. The image processing system sets a reference region corresponding to a road surface part other than the lane mark on a road surface image, creates a luminance histogram by measuring luminance frequencies in the reference region, and extracts a cluster part having a width, a height, or a surface area that equals or exceeds a threshold from the created histogram as a road surface cluster. Further, the image processing system detects all white line candidate edge points as primary white line candidate edge points, and detects those primary white line candidate edge points that overlap the reference region as secondary white line candidate edge points.
[0003] The image processing system then detects only those secondary white line candidate edge points in which a value of a luminance parameter is not included in a luminance range of the road surface cluster as true white line edge points. By excluding the secondary white line candidate edge points included in the luminance range of the road surface cluster from the true white line edge points in this manner, a true lane mark is detected precisely, and erroneous detection of a lane mark is suppressed.
[0004] In the image processing system described above, however, white line candidates are detected as a collection of white line candidate edge points on a line. In other words, the lane mark is detected as a line on an image, and therefore, when an actual lane mark is a compound line or includes a diverging line, it may be impossible to determine which of the lines on the image is a desired white line. As a result, a lane estimation result may be unstable. Furthermore, when the actual lane mark is hidden by another vehicle or the like, white line candidates can no longer be detected, and therefore the lane estimation result may be unstable likewise in this case.
SUMMARY OF THE INVENTION
[0005] The invention therefore provides a travel path detection apparatus and a travel path detection method with which a travel path estimation result can be stabilized.
[0006] A first aspect of the invention is a travel path detection apparatus including: an edge detection unit configured to detect an edge point, which is a point at which a luminance of a road varies; a reference point calculation unit configured to calculate a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point detected on the road by the edge detection unit; a reference point position voting unit configured to vote for a position of the reference point calculated by the reference point calculation unit; and a travel path detection unit configured to detect information indicating the travel path of the vehicle on the basis of the position of the reference point voted for by the reference point position voting unit.
[0007] According to the configuration described above, the reference point calculation unit calculates the reference point on the basis of the left side edge point and the right side edge point, the reference point position voting unit votes for the reference point, and the travel path detection unit detects the information indicating the travel path on the basis of the reference point position that has been voted for. In the above configuration, therefore, the travel path information is detected using the position of the reference point obtained from the left and right side edge points instead of detecting a white line candidate in the form of a line, as in the related art. Hence, even when an actual lane mark is a compound line, for example, the travel path information is detected on the basis of voting results for reference point positions obtained from respective edge points detected from the compound lines on either side, and as a result, a travel path information detection result can be stabilized.
[0008] Further, the reference point may be a midpoint between the left side edge point and the right side edge point. In this case, the reference point calculation unit calculates the midpoint between the left side edge point and the right side edge point, and therefore information indicating a travel path close to the center of a lane can be detected.
[0009] Further, the edge detection unit may be configured to detect a rising edge point at which the luminance increases when a luminance detection position is moved from one side to another side of left and right sides of a captured image of the road, and a falling edge point at which the luminance decreases when the luminance detection position is moved from the one side to the other side of the left and right sides of the captured image, and the reference point calculation unit may be configured to calculate the reference point on the basis of a left side rising edge point and a right side falling edge point, and calculate the reference point on the basis of a left side falling edge point and a right side rising edge point. In this case, reference points between rising edge points and falling edge points are calculated successively, and therefore the number of votes for the reference point positions can be increased. Hence, the estimated reference point position can be brought closer to the position on the travel path, and as a result, the travel path information detection result can be stabilized even further.
[0010] Moreover, the travel path detection apparatus may further include: a distance calculation unit configured to calculate a distance between the left side edge point and the right side edge point, and a pitching angle variation calculation unit configured to, when a difference between a lane width of the road and one of an average value and a median of the distance calculated by the distance calculation unit equals or exceeds a threshold, calculate a variation amount in a pitching angle of the vehicle using the lane width of the road and the one of the average value and the median. In this case, when the difference between the lane width of the road and the one of the average value and the median equals or exceeds the threshold, the distance calculation unit may be configured to correct the pitching angle of the vehicle by the variation amount in the pitching angle, calculated by the pitching angle variation calculation unit, and then recalculate the distance between the left side edge point and the right side edge point before the reference point calculation unit calculates the reference point.
[0011] When pitching occurs in the vehicle during travel due to sudden depression of a brake or the like, the distance between the edge points on a captured image deviates from an actual lane width. In other words, when pitching occurs, the lane on the captured image is distorted such that the reference point position cannot be calculated correctly, and as a result, the travel path detection result may become unstable. According to the configuration described above, however, the distance between the edge points is recalculated after correcting the pitching angle when pitching occurs, and therefore a distortion effect on the captured image caused by pitching can be eliminated. As a result, a travel path estimation result can be stabilized even when pitching occurs.
[0012] The edge point may be an edge point of a white line on the road. The reference point position voting unit may be configured to vote for the position of the reference point by recording, as the candidate of the travel path of the vehicle, the position of the reference point calculated by the reference point calculation unit. The travel path detection unit may be configured to detect the position of the reference point having the largest number of votes, among a plurality of the positions of the reference points voted for by the reference point position voting unit, as the information indicating the travel path of the vehicle.
[0013] A second aspect of the invention is a travel path detection method including: detecting an edge point, which is a point at which a luminance of a road varies; calculating a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point on the road; voting for a position of the calculated reference point; and detecting information indicating the travel path of the vehicle on the basis of the reference point position that has been voted for.
[0014] According to the configuration described above, the reference point is calculated on the basis of the left side edge point and the right side edge point, the calculated reference point is voted for, and the travel path is detected on the basis of the reference point position that has been voted for. In the above configuration, therefore, the travel path is detected using the position of the reference point obtained from the left and right side edge points instead of detecting a white line candidate in the form of a line, as in the related art. Hence, even when the actual lane mark is a compound line, for example, the travel path is detected on the basis of voting results for reference point positions obtained from respective edge points detected from the compound lines on either side, and as a result, the travel path detection result can be stabilized.
BRIEF DESCRIPTION OF THE DRAWINGS
[0015] Features, advantages, and technical and industrial significance of exemplary embodiments of the invention will be described below with reference to the accompanying drawings, in which like numerals denote like elements, and wherein:
FIG. 1 is a block diagram showing a travel path detection apparatus according to a first embodiment of the invention;
FIG. 2 is a perspective view showing a rising edge and a falling edge of a white line on an actual travel path;
FIG. 3 is a view illustrating voting for a midpoint position;
FIG. 4 is a perspective view showing a rising edge and a falling edge of a compound line;
FIG. 5 is a view illustrating voting for a midpoint position on the compound line;
FIG. 6 is a flowchart showing processing executed by the travel path detection apparatus of FIG. 1 to estimate the travel path;
FIG. 7 is a view illustrating a distance between edge points on a captured image;
FIGS. 8A and 8B are views illustrating voting for a midpoint position in the case of a diverging line or a hidden white line;
FIG. 9 is a block diagram showing a travel path detection apparatus according to a second embodiment of the invention;
FIGS. 10A to 10C are histograms showing frequencies of a value of half a vehicle width on a captured image; and
FIG. 11 is a flowchart showing processing executed by the travel path detection apparatus according to the second embodiment of the invention to estimate the travel path.
DETAILED DESCRIPTION OF EMBODIMENTS
[0016] Embodiments of the invention will be described below with reference to the drawings. Note that in the following description, identical or corresponding elements have been allocated identical reference symbols, and duplicate description thereof has been omitted.
[0017] (First Embodiment)
As shown in FIG. 1, a travel path detection apparatus 1 according to a first embodiment detects a travel path as a vehicle travels. The travel path detection apparatus 1 is used in lane keeping assist (LKA) or the like, for example, to forestall deviation of the vehicle from the travel path (a lane) by prompting a driver to pay attention or supporting a steering operation when the vehicle is about to deviate from the travel path.
[0018] The travel path detection apparatus 1 includes an edge detection unit 21 that detects an edge point, which is a point at which a luminance of a road varies, a midpoint calculation unit (a reference point calculation unit) 22 that calculates a midpoint between a left side edge point and a right side edge point as a reference point for vehicle travel, a midpoint position voting unit (a reference point position voting unit) 23 that votes for a calculated midpoint position, and a travel path detection unit 24 that detects the travel path of the vehicle on the basis of a midpoint position having the most votes, from among midpoint positions that have been voted for.
[0019] The travel path detection apparatus 1 further includes a camera 10 that captures images of the road during travel, an electronic control unit (ECU) 20 that performs image processing on an image captured by the camera 10 and detects the travel path, and an output unit 30 that issues a warning to the driver of the vehicle and assists a steering operation of the vehicle upon reception of a control signal from the ECU 20.
[0020] The camera 10 has a function for capturing a frontward image of the road along which the vehicle is traveling, and the captured image is output to the ECU 20. The ECU 20 performs image processing on the captured image captured by the camera 10. More specifically, the ECU 20 converts the captured image captured by the camera 10 into a planar view image to detect the travel path. The edge detection unit 21, midpoint calculation unit 22, midpoint position voting unit 23, and travel path detection unit 24 described above are provided in the ECU 20, for example.
[0021] The edge detection unit 21 detects an edge point from the captured image of the road captured by the camera 10. As shown in FIG. 2, when a white line H1 and a white line H2 exist in front of the vehicle on either side, the edge detection unit 21 detects rising edge points L1 and R1 and falling edge points L2 and R2 from the captured image. Here, the rising edge points L1 and R1 are edge points at which a luminance increases when a luminance detection position is moved from a left side to a right side on the captured image (in other words, edge points at which the luminance on the right side of the edge point is higher than the luminance on the left side), and the falling edge points L2 and R2 are edge points at which the luminance decreases when the luminance detection position is moved from the left side to the right side on the captured image (in other words, edge points at which the luminance on the right side of the edge point is lower than the luminance on the left side).
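The rising/falling classification in [0021] can be illustrated with a short sketch (the function name, the threshold value, and the list return format are assumptions for illustration, not part of the embodiment):

```python
def detect_edges(row, threshold=50):
    """Scan one row of luminance values from left to right and classify
    edge points: a rising edge is a position where the luminance increases
    by more than `threshold`; a falling edge is a position where it
    decreases by more than `threshold`."""
    rising, falling = [], []
    for x in range(1, len(row)):
        diff = row[x] - row[x - 1]
        if diff > threshold:
            rising.append(x)
        elif diff < -threshold:
            falling.append(x)
    return rising, falling

# A dark road surface (luminance 20) crossed by one white line (luminance 200):
row = [20, 20, 200, 200, 200, 20, 20]
print(detect_edges(row))  # → ([2], [5])
```

On the sample row, the left boundary of the white line is reported as a rising edge and its right boundary as a falling edge, matching the convention above.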
[0022] The midpoint calculation unit 22 calculates, as a midpoint position, a position of a midpoint between a left side edge point (an edge point on the left side of the captured image) and a right side edge point (an edge point on the right side of the captured image) detected by the edge detection unit 21 in an identical horizontal position of the captured image. More specifically, as shown in FIG. 3, the midpoint calculation unit 22 calculates, on the captured image converted into a planar view image, a midpoint between the left side rising edge point L1 and the right side falling edge point R2 and a midpoint between the left side falling edge point L2 and the right side rising edge point R1. By calculating the midpoint between the left side rising edge point L1 and the right side falling edge point R2 and the midpoint between the left side falling edge point L2 and the right side rising edge point R1 in this manner, a position close to a center of the lane can be detected more accurately as travel path information.
[0023] The midpoint position voting unit 23 generates travel path candidate portions C1, C2 such as those shown in FIG. 3, for example, from the midpoint positions calculated by the midpoint calculation unit 22. The travel path candidate portions C1, C2 are displayed to be darkest at the midpoint position calculated by the midpoint calculation unit 22 and become gradually lighter toward a periphery. By displaying the travel path candidate portions C1, C2 in this shaded manner, a situation in which the midpoint position deviates greatly from the lane center is suppressed. The midpoint position voting unit 23 generates a histogram showing frequencies of midpoint positions such as those shown in FIG. 3, for example, by repeatedly generating the travel path candidate portions C1, C2 and repeatedly voting for the midpoint positions calculated by the midpoint calculation unit 22. Here, voting for a midpoint position means recording, as a travel path candidate, a midpoint position calculated by the midpoint calculation unit 22.
[0024] The travel path detection unit 24 detects the travel path of the vehicle on the basis of the midpoint position having the largest number of votes among the midpoint positions voted for by the midpoint position voting unit 23. More specifically, the travel path detection unit 24 detects, as travel path information, the midpoint position having the highest frequency on the histogram generated by the midpoint position voting unit 23, and outputs the detected travel path information to the output unit 30.
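Taken together, the pairing rule of [0022], the shaded voting of [0023], and the largest-number-of-votes selection of [0024] can be sketched as follows (a hypothetical illustration; the bin weights, pixel coordinates, and function names are assumptions, not values from the embodiment):

```python
from collections import defaultdict

def vote_midpoints(left_rising, right_falling, left_falling, right_rising,
                   histogram=None):
    """Vote for midpoints of left/right edge pairs: each left rising edge
    is paired with each right falling edge, and each left falling edge
    with each right rising edge.  Votes are shaded, with weight 3 at the
    midpoint bin and weight 1 at its neighbours, mimicking candidate
    portions that are darkest at the midpoint and lighter toward the
    periphery."""
    if histogram is None:
        histogram = defaultdict(float)
    pairs = [(lx, rx) for lx in left_rising for rx in right_falling]
    pairs += [(lx, rx) for lx in left_falling for rx in right_rising]
    for lx, rx in pairs:
        mid = (lx + rx) // 2
        histogram[mid] += 3.0      # darkest at the midpoint
        histogram[mid - 1] += 1.0  # lighter toward the periphery
        histogram[mid + 1] += 1.0
    return histogram

def detect_travel_path(histogram):
    """Return the midpoint position with the largest number of votes."""
    return max(histogram, key=histogram.get)

# Compound line: two white lines on each side still vote near the lane centre.
h = vote_midpoints(left_rising=[100, 120], right_falling=[280, 300],
                   left_falling=[110, 130], right_rising=[270, 290])
print(detect_travel_path(h))  # → 200
```

Even though eight edge points produce several distinct pair midpoints, the votes pile up at the lane centre, which is the stabilizing effect the embodiment relies on for compound lines.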
[0025] The output unit 30 assists the driver in driving the vehicle upon reception of the travel path information output by the travel path detection unit 24. More specifically, for example, the output unit 30 may include a warning device that issues a warning to the driver using voice, image display, or the like when the vehicle deviates from the travel path, or may perform steering control to guide the vehicle to a more correct travel path by assisting steering of the vehicle when the vehicle deviates from the travel path.
[0026] Hence, in the travel path detection apparatus 1, as shown in FIG. 3, the midpoint calculation unit 22 calculates the midpoint positions between the left side edge points L1, L2 and the right side edge points R1, R2, the midpoint position voting unit 23 votes for the calculated midpoint positions, and the travel path detection unit 24 detects the midpoint position having the largest number of votes among the midpoint positions that have been voted for as the travel path information. In this embodiment, a midpoint position between edge points on the left and right sides is detected as the travel path information instead of detecting a white line candidate in the form of a line, as in the related art. Therefore, even when an actual lane mark is a compound line, for example, the travel path is detected on the basis of voting results for the midpoint positions between the respective edge points detected from the compound lines on either side, and as a result, a travel path detection result can be stabilized.
[0027] More specifically, as shown in FIG. 4, in the case of a compound line in which two white lines H11, H12 extend in a front-rear direction on the left side of the road and two white lines H13, H14 extend in the front-rear direction on the right side of the road, the edge detection unit 21 detects rising edge points L11, L13 and falling edge points L12, L14 from the left side of the captured image, and detects rising edge points R11, R13 and falling edge points R12, R14 from the right side of the captured image.
[0028] As shown in FIG. 5, the midpoint calculation unit 22 then respectively calculates a midpoint position between the left side falling edge point L14 and the right side rising edge point R11, a midpoint position between the left side falling edge point L14 and the right side rising edge point R13, a midpoint position between the left side rising edge point L13 and the right side falling edge point R12, a midpoint position between the left side rising edge point L13 and the right side falling edge point R14, a midpoint position between the left side falling edge point L12 and the right side rising edge point R11, a midpoint position between the left side falling edge point L12 and the right side rising edge point R13, a midpoint position between the left side rising edge point L11 and the right side falling edge point R12, and a midpoint position between the left side rising edge point L11 and the right side falling edge point R14.
[0029] Next, the midpoint position voting unit 23 generates travel path candidate portions C11 to C18 from the respective calculated midpoint positions. The midpoint position voting unit 23 then generates a histogram such as that shown in FIG. 5 by voting for the midpoint positions repeatedly, whereupon the travel path detection unit 24 performs travel path detection in a similar manner to that described above and outputs the travel path information to the output unit 30.
[0030] Here, when white line candidates are detected in the form of lines from the edge points, as in the related art, and the lane mark is constituted by a compound line, it is impossible to determine which line is a desired white line, and therefore a travel path estimation result may be unstable. With the travel path detection apparatus 1 according to this embodiment, as shown in FIG. 5, however, the travel path is detected on the basis of the midpoint positions between the respective edge points using a left-right symmetrical characteristic of the lane mark, and therefore the travel path detection result is stabilized such that the travel path can be detected more accurately even when the lane mark is a compound line.
[0031] Next, an example of a travel path detection method using the travel path detection apparatus 1 according to this embodiment will be described with reference to FIGS. 6 and 7. Processing shown in FIG. 6 is executed repeatedly by the ECU 20 at fixed time intervals, for example.
[0032] First, in step S11 ("S11" hereafter; likewise with respect to other steps), the edge detection unit 21 extracts and labels edge points L1 to L4, R1 to R4, as shown in FIG. 7, on each line of an image captured by the camera 10 (edge detection step). For example, edge point extraction and labeling is performed on a line extending in a lateral direction of the road or the vehicle on the captured image, whereupon edge point extraction and labeling is performed on the next line in a different position in a longitudinal direction (an advancement direction) of the road or the vehicle. Here, labeling is processing for identifying an extracted edge point group by luminance variation (increasing or decreasing), height, distance, or the like. Further, the line direction may be set as a pixel arrangement direction in the lateral direction of the road or the vehicle on the detection subject image, for example. Next, in S12, a distance dij (the distance d12, for example) between subject edge points (the left side edge point L1 and the right side edge point R2, for example) on a subject line (line 1 in FIG. 7, for example) is calculated.
[0033] In S13, a determination is made as to whether or not the distance dij between the edge points calculated in S12 is larger than a lower limit threshold Dmin and smaller than an upper limit threshold Dmax. Here, values of the lower limit threshold Dmin and the upper limit threshold Dmax correspond to lower and upper limit values of an imaginable lane width. However, the values may be modified as appropriate. When it is not determined in S13 that the distance dij between the edge points is larger than the lower limit threshold Dmin and smaller than the upper limit threshold Dmax, the processing advances to S15 without performing midpoint position voting. When, on the other hand, it is determined in S13 that the distance dij between the edge points is larger than the lower limit threshold Dmin and smaller than the upper limit threshold Dmax, the processing advances to S14, where the midpoint calculation unit 22 calculates the midpoint positions between the edge points and the midpoint position voting unit 23 votes for the midpoint positions (reference point calculation step, reference point position voting step).
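The distance gating of S12–S13 can be expressed as a small predicate (a sketch; the pixel values of D_MIN and D_MAX below are placeholder assumptions, not thresholds given in the embodiments):

```python
D_MIN, D_MAX = 150, 350  # plausible lane-width range in pixels (assumed values)

def valid_pair(left_x, right_x, d_min=D_MIN, d_max=D_MAX):
    """S12-S13: compute the distance between a left and a right edge point
    and check it against the lane-width thresholds before allowing the
    pair to proceed to midpoint voting in S14."""
    d = right_x - left_x
    return d_min < d < d_max

# An edge pair spanning the lane passes; a pair onto a nearby diverging
# line (too narrow) is excluded from the voting subjects.
print(valid_pair(100, 300))  # True
print(valid_pair(100, 140))  # False
```

This is the mechanism that later lets the apparatus ignore the diverging white line in FIG. 8A: pairs whose separation cannot be a lane width never contribute votes.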
[0034] In S15, a determination is made as to whether or not the processing of S12 to S14 has been executed in relation to all of the edge points on the subject line. When it is determined in S15 that the processing has been executed on all of the edge points on the subject line, the processing advances to S16, and when it is determined in S15 that the processing has not yet been executed on all of the edge points on the subject line, the subject edge points are changed, whereupon the processing of S12 to S14 is executed again.
[0035] In S16, a determination is made as to whether or not the processing of S12 to S14 has been executed in relation to all of the subject lines. When it is determined in S16 that the processing has been executed on all of the subject lines, the processing advances to S18, and when it is determined in S16 that the processing has not yet been executed on all of the subject lines, the processing advances to S17.
[0036] In S17, processing is performed to change the subject line to the next line. When line 1 in FIG. 7 is the subject line, for example, the subject line is changed to line 2 positioned below line 1. In S18, processing is executed to detect the midpoint position having the largest number of votes as the travel path information. More specifically, the travel path detection unit 24 executes processing to detect the travel path on the basis of the voting results relating to the midpoint positions voted for in S14 by the midpoint position voting unit 23 (travel path detection step). After the travel path detection unit 24 has detected the travel path in this manner, the travel path information is output to the output unit 30, whereupon the series of processes is terminated.
[0037] The travel path detection processing of the travel path detection apparatus 1 is performed as described above. In the travel path detection apparatus 1, the midpoint positions between the rising edge points and the falling edge points are calculated successively, and therefore the number of midpoint position votes given by the midpoint position voting unit 23 can be increased. Accordingly, the detected midpoint position can be brought closer to the actual lane center, and as a result, the travel path detection result can be made more stable. Further, in the travel path detection apparatus 1, the travel path is detected by voting for midpoint positions, and therefore travel path detection can be performed with stability even when the lane mark is a diverging line or a part of the lane mark is hidden.
[0038] Furthermore, the travel path detection method according to the first embodiment includes: the edge detection step for detecting edge points, which are points at which the luminance of the road varies; the reference point calculation step for calculating a reference point as a candidate of the travel path of the vehicle on the basis of a left side edge point and a right side edge point detected on the road in the edge detection step; the reference point position voting step for voting for the reference point positions calculated in the reference point calculation step; and the travel path detection step for detecting the travel path of the vehicle on the basis of the reference point positions voted for in the reference point position voting step. Hence, in this embodiment, the travel path is detected using reference point positions obtained from left and right side edge points rather than detecting white line candidates in the form of lines, as in the related art.
Therefore, even when the actual lane mark is a compound line, for example, the travel path is detected using the reference point voting results obtained from the respective edge points detected from the compound lines on both sides, and as a result, the travel path detection result can be stabilized.
[0039] Even in the case of a diverging line in which two white lines H21, H22 extend in the front-rear direction on the left and right sides and a white line H23 diverges further leftward from the left side white line H21, as shown in FIG. 8A, midpoint position calculation by the midpoint calculation unit 22 and midpoint position voting by the midpoint position voting unit 23 are performed in a similar manner to that described above, and therefore appropriate travel path candidate portions C21 to C27 can be detected as travel path candidates extending in the front-rear direction between the white line H21 and the white line H22.
[0040] By determining whether or not the distance between the edge points is within a predetermined range in S13 of FIG. 6, edge points on the diverging white line H23 can be excluded from the processing subjects, and therefore the appropriate travel path candidate portions C21 to C27 can be detected without being affected by the white line H23. Hence, with the travel path detection apparatus 1, the travel path is detected by voting for midpoint positions, and as a result, the travel path detection result can be stabilized even when an image of the diverging white line H23 is captured.
[0041] Furthermore, in a case where a white line H31 extends in the front-rear direction on the left side and a white line H32 extends in the front-rear direction on the right side but a part of the left side white line H31 is hidden by another vehicle B or the like, as shown in FIG. 8B, the midpoint of the part hidden by the other vehicle B cannot be detected. The midpoints in all other parts can be detected, however, and therefore appropriate travel path candidate portions C31 to C35 can be detected as travel path candidates extending in the front-rear direction between the white line H31 and the white line H32 to the front and rear of the other vehicle B. Hence, a situation in which a midpoint position detection result deviates greatly from the lane center when a part of a white line is hidden can be avoided, and as a result, the travel path detection result can be stabilized. Moreover, when the travel path is detected using voting by the midpoint position voting unit 23, greater noise resistance than that of the related art is obtained, and therefore the travel path can be detected with a high degree of precision even in a noisy environment such as in rain or at night.
[0042] (Second Embodiment)
Next, a travel path detection apparatus 101 according to a second embodiment will be described with reference to FIGS. 9 to 11. The travel path detection apparatus 101 according to the second embodiment calculates an amount of variation in a pitching angle of the vehicle, and calculates the distance between the edge points after correcting the pitching angle by the variation amount. As shown in FIG. 9, the travel path detection apparatus 101 according to the second embodiment differs from the first embodiment in using an ECU 120, to which a distance calculation unit 125 and a pitching angle variation calculation unit 126 have been added, instead of the ECU 20 according to the first embodiment, but is otherwise configured similarly to the first embodiment. Accordingly, the following description focuses on the ECU 120 according to the second embodiment, and duplicate description of identical parts to the first embodiment has been omitted.
[0043] The distance calculation unit 125 calculates a distance between an edge point positioned on the left side of the captured image and an edge point positioned on the right side of the captured image. More specifically, the distance calculation unit 125 calculates a distance d12 between a left side rising edge point L1 and a right side falling edge point R2 shown in FIG. 7, for example, and calculates distances between edge points in this manner in relation to each edge point and each line.
[0044] The pitching angle variation calculation unit 126 generates histograms such as those shown in FIGS. 10A to 10C from the distances between the edge points calculated by the distance calculation unit 125 in relation to the respective edge points and lines. FIG. 10A shows an example of a value of half a vehicle width in a case where pitching does not occur in the vehicle. FIG. 10B shows an example of the value of half the vehicle width in a case where pitching occurs in the vehicle. FIG. 10C shows an example in which the pitching angle is corrected by shifting the histogram of FIG. 10B rightward.
[0045] Here, as shown in FIGS. 10A and 10B, an average value X1 of a value obtained by halving the vehicle width when pitching does not occur differs from an average value X2 obtained by halving the vehicle width when pitching occurs. When pitching occurs, therefore, the value of the vehicle width varies on the captured image. Hence, in this embodiment, the precision of travel path detection during pitching is improved by calculating the amount of variation in the pitching angle, correcting the pitching angle by the variation amount, and then recalculating the distance between the edge points. In other words, after the pitching angle has been corrected, as shown in FIG. 10C, an average value X3 of the frequency approaches the average value X1 of FIG. 10A, and therefore the travel path can be detected with a similar degree of precision to that of a case in which pitching does not occur.
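The shift between FIGS. 10A and 10B can be reproduced numerically: halving each measured edge-to-edge distance and averaging gives X1 without pitching and a shifted X2 with pitching. A toy sketch follows; the distance values are invented purely for illustration:

```python
def half_width_mean(distances):
    """Mean of the half-width values that populate the histograms of
    FIGS. 10A-10C (each entry is half an edge-to-edge distance)."""
    halves = [d / 2.0 for d in distances]
    return sum(halves) / len(halves)

x1 = half_width_mean([400, 402, 398])  # no pitching: mean 200.0 (X1)
x2 = half_width_mean([440, 442, 438])  # pitching:    mean 220.0 (X2)
# correcting the pitch angle shifts the second histogram back so that
# its average (X3) approaches X1
```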
[0046] Further, using a focal length f of the camera 10, an average value dav of the vehicle width value on the obtained histogram, a desired lane width (a lane width when the pitching angle is 0°) dLane, a predetermined pitching angle P0, and an average value y of a distance from the vehicle to an initially set estimation region (when a distance of 0 to 100 m in front of the vehicle is the estimation region, the average value is 50 m), the pitching angle variation calculation unit 126 calculates a variation amount ΔP in the pitching angle on the basis of Equation (1) shown below, for example.
[Equation (1): reproduced as an image in the original publication and not extracted in this text]
[0047] Next, a method of detecting the travel path using the travel path detection apparatus 101 according to this embodiment will be described with reference to FIG. 11. Processing shown in FIG. 11 is executed by the ECU 120 repeatedly at fixed time intervals, for example.
[0048] First, in S21, similar edge point extraction and labeling processing to that of S11 in FIG. 6 is performed. Next, in S22, the distance dij between the subject edge points on the subject line is calculated in a similar manner to S12. Following S22, a histogram such as that shown in FIG. 10A is created in S23 using the distance dij, whereupon one of the average value dav and a median is calculated from the created histogram (S24).
[0049] Following S24, a determination is made as to whether or not an absolute value of a difference between the aforementioned desired lane width dLane and the one of the average value dav and the median calculated in S24 is smaller than a threshold dthresh (S25). When it is determined in S25 that the absolute value of the difference between the lane width dLane and the one of the average value dav and the median is smaller than the threshold dthresh, it is assumed that the pitching angle is smaller than a predetermined value, and the processing advances to S27. When, on the other hand, it is not determined in S25 that the absolute value of the difference between the lane width dLane and the one of the average value dav and the median is smaller than the threshold dthresh, it is assumed that the pitching angle equals or exceeds the predetermined value, and the processing advances to S26.
[0050] In S26, the variation amount ΔP in the pitching angle is calculated using Equation (1), for example, whereupon the distance dij between the edge points is recalculated using a sum of the predetermined pitching angle P0 and the variation amount ΔP as a value of the pitching angle P. In S27 to S32, similar processing to that of S13 to S18 in the first embodiment, shown in FIG. 6, is performed, and once the travel path information has been output to the output unit 30, the series of processes is terminated.
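The S23-S26 decision can be sketched as follows. Equation (1) itself appears only as an image in the original publication, so the correction is passed in as a placeholder function `correct_fn`; the rescaling used in the example below is an invented stand-in, not the patent's formula, and the function name is likewise an assumption.

```python
import statistics

def corrected_distances(distances, d_lane, d_thresh, correct_fn):
    """S23-S25: compare a histogram statistic of the edge-to-edge
    distances against the desired lane width; if the difference is
    below the threshold, keep the distances (pitch negligible),
    otherwise (S26) recompute each one via the supplied correction."""
    d_av = statistics.mean(distances)  # the median may be used instead
    if abs(d_av - d_lane) < d_thresh:
        return distances
    return [correct_fn(d) for d in distances]

# With an assumed correction that rescales toward the desired width:
ds = corrected_distances([440, 442, 438], d_lane=400, d_thresh=10,
                         correct_fn=lambda d: d * 400.0 / 440.0)
```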
[0051] Hence, in the travel path detection apparatus 101, when the difference between the lane width dLane denoting the width of the road and the one of the average value dav and the median of the distance between the edge points, calculated by the distance calculation unit 125, equals or exceeds the threshold dthresh, the pitching angle variation calculation unit 126 calculates the variation amount ΔP in the pitching angle using the lane width dLane and the one of the average value dav and the median. Further, when the difference between the lane width dLane denoting the width of the road and the one of the average value dav and the median of the distance equals or exceeds the threshold dthresh, the distance calculation unit 125 corrects the value of the pitching angle P by the variation amount ΔP and then recalculates the distance dij between the edge points before the midpoint calculation unit 22 calculates the midpoint.
[0052] Hence, in the second embodiment, when pitching occurs, the distance dij between the edge points is recalculated after correcting the pitching angle P, and therefore a distortion effect on the captured image caused by pitching can be eliminated. As a result, the travel path detection result can be stabilized even when pitching occurs. Furthermore, the travel path detection apparatus 101 according to the second embodiment is configured identically to the travel path detection apparatus 1 according to the first embodiment, and therefore identical effects to those of the travel path detection apparatus 1 according to the first embodiment are also obtained.
[0053] The above embodiments illustrate embodiments of the travel path detection apparatus according to the invention, but the travel path detection apparatus according to the invention is not limited to the configurations described in the above embodiments, and may be altered or applied in other ways without departing from the scope of the claims.
[0054] For example, in the above embodiments, an example in which the midpoint calculation unit 22 calculates a midpoint between a left side edge point and a right side edge point and the midpoint position voting unit 23 votes for a midpoint position was described, but a point other than the midpoint may be used as the reference point. More specifically, instead of using the midpoint position as the reference point position, the reference point position may be offset by a predetermined amount to the left side or the right side, for example. In this case, the vehicle can be caused to travel on the left side or the right side within the lane. Moreover, the reference point position can be set such that when the vehicle travels around a curve, the vehicle is caused to travel on an inner side of the curve within the lane.
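The offset variant described above amounts to adding a signed bias to the midpoint. A minimal sketch, in which the function name and sign convention are assumptions:

```python
def reference_point(left_x, right_x, offset=0.0):
    """Reference point as the midpoint of the two edge x-positions
    plus an optional lateral offset (negative = toward the left),
    letting the vehicle track left or right of the lane centre,
    e.g. the inner side of a curve."""
    return (left_x + right_x) / 2.0 + offset

centre = reference_point(100, 300)         # 200.0: plain midpoint
inner = reference_point(100, 300, -20.0)   # 180.0: biased leftward
```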
[0055] Further, in the above embodiments, the edge detection unit 21 detects the edge points from the captured image captured by the camera 10, whereupon the midpoint position between the detected edge points is calculated. However, the reference point position need not be calculated from a captured image. More specifically, an infrared sensor or a radar apparatus using electromagnetic waves may be used instead of a captured image. In this case, for example, such an apparatus may detect a three-dimensional point as a point at which the luminance of the road varies, and the reference point may be calculated from the detected three-dimensional point.
[0056] Furthermore, in the above embodiments, an example in which the edge points of a white line are detected was described, but edge points of a curbstone, for example, may be detected instead. Hence, the edge point detection subject is not limited to that described in the above embodiments.
[0057] Moreover, in the above embodiments, edge points are detected from both the left side and the right side of the captured image captured by the camera 10, but in a case where the information indicating the travel path along which the vehicle is traveling is obtained, edge points may be detected from either one of the left side and the right side of the captured image, and the reference point position and the travel path can be detected likewise in that case.
[0058] Furthermore, in the above embodiments, edge points are detected from both the left side and the right side of a road having a single lane, but the lane of the subject road is not limited to a single lane. More specifically, edge points may be detected from a three-lane road by modifying the threshold of the subject lane width, for example. In this case, when one white line is hidden, the reference point position can be calculated from a white line on an outer side or an inner side thereof, and therefore the travel path can be detected with a high degree of precision even when a white line is hidden.
[0059] Moreover, in the above embodiments, an edge point at which the luminance increases when the luminance detection position is moved from the left side to the right side on the captured image is set as the rising edge point, and an edge point at which the luminance decreases when the luminance detection position is moved from the left side to the right side on the captured image is set as the falling edge point, but instead, an edge point at which the luminance increases when the luminance detection position is moved from the right side to the left side on the captured image may be set as the rising edge point, and an edge point at which the luminance decreases when the luminance detection position is moved from the right side to the left side on the captured image may be set as the falling edge point.
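The rising/falling classification reduces to the sign of the luminance change while scanning a row. A minimal sketch; the threshold value and function name are assumptions, and the scan direction matches the left-to-right convention of the embodiments:

```python
def detect_edges(luminance, threshold=50):
    """Scan one image row left to right and record rising edge points
    (dark -> bright, e.g. road surface to white line) and falling edge
    points (bright -> dark) wherever the jump exceeds the threshold."""
    rising, falling = [], []
    for x in range(1, len(luminance)):
        diff = luminance[x] - luminance[x - 1]
        if diff > threshold:
            rising.append(x)
        elif diff < -threshold:
            falling.append(x)
    return rising, falling

# A dark road with one bright white-line stripe at columns 3-5:
rising, falling = detect_edges([20, 20, 20, 220, 220, 220, 20, 20])
# rising == [3], falling == [6]
```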


CLAIMS:
1. A travel path detection apparatus comprising:
an edge detection unit configured to detect an edge point, which is a point at which a luminance of a road varies;
a reference point calculation unit configured to calculate a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point detected on the road by the edge detection unit;
a reference point position voting unit configured to vote for a position of the reference point calculated by the reference point calculation unit; and
a travel path detection unit configured to detect information indicating the travel path of the vehicle on the basis of the position of the reference point voted for by the reference point position voting unit.
2. The travel path detection apparatus according to claim 1, wherein the reference point is a midpoint between the left side edge point and the right side edge point.
3. The travel path detection apparatus according to claim 1 or 2, wherein:
the edge detection unit is configured to detect a rising edge point at which the luminance increases when a luminance detection position is moved from one side to another side of left and right sides of a captured image of the road, and a falling edge point at which the luminance decreases when the luminance detection position is moved from the one side to the other side of the left and right sides of the captured image; and
the reference point calculation unit is configured to calculate the reference point on the basis of a left side rising edge point and a right side falling edge point, and calculate the reference point on the basis of a left side falling edge point and a right side rising edge point.
4. The travel path detection apparatus according to any one of claims 1 to 3, further comprising:
a distance calculation unit configured to calculate a distance between the left side edge point and the right side edge point; and
a pitching angle variation calculation unit configured to, when a difference between a lane width of the road and one of an average value and a median of the distance calculated by the distance calculation unit equals or exceeds a threshold, calculate a variation amount in a pitching angle of the vehicle using the lane width of the road and the one of the average value and the median,
wherein, when the difference between the lane width of the road and the one of the average value and the median equals or exceeds the threshold, the distance calculation unit is configured to correct the pitching angle of the vehicle by the variation amount in the pitching angle, calculated by the pitching angle variation calculation unit, and then recalculate the distance between the left side edge point and the right side edge point before the reference point calculation unit calculates the reference point.
5. The travel path detection apparatus according to any one of claims 1 to 4, wherein the edge point is an edge point of a white line on the road.
6. The travel path detection apparatus according to any one of claims 1 to 5, wherein the reference point position voting unit is configured to vote for the position of the reference point by recording, as the candidate of the travel path of the vehicle, the position of the reference point calculated by the reference point calculation unit.
7. The travel path detection apparatus according to any one of claims 1 to 6, wherein the travel path detection unit is configured to detect the position of the reference point having the largest number of votes, among a plurality of the positions of the reference points voted for by the reference point position voting unit, as the information indicating the travel path of the vehicle.
8. A travel path detection method comprising:
detecting an edge point, which is a point at which a luminance of a road varies;
calculating a reference point as a candidate of a travel path of a vehicle on the basis of a left side edge point and a right side edge point on the road;
voting for a position of the calculated reference point; and
detecting information indicating the travel path of the vehicle on the basis of the reference point position that has been voted for.
PCT/IB2014/000432 2013-04-08 2014-03-28 Travel path detection apparatus and travel path detection method WO2014167393A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2013-080589 2013-04-08
JP2013080589 2013-04-08
JP2013-267026 2013-12-25
JP2013267026A JP2014219960A (en) 2013-04-08 2013-12-25 Track detection device and track detection method

Publications (1)

Publication Number Publication Date
WO2014167393A1 true WO2014167393A1 (en) 2014-10-16

Family

ID=50687519

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/000432 WO2014167393A1 (en) 2013-04-08 2014-03-28 Travel path detection apparatus and travel path detection method

Country Status (2)

Country Link
JP (1) JP2014219960A (en)
WO (1) WO2014167393A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3036180A1 (en) * 2015-05-11 2016-11-18 Valeo Schalter & Sensoren Gmbh METHOD FOR DETERMINING THE PLATE OF A MOTOR VEHICLE
DE102018100292A1 (en) * 2018-01-09 2019-07-11 Connaught Electronics Ltd. Detecting a lane marking by a lane keeping warning system of a vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006268199A (en) 2005-03-22 2006-10-05 Honda Motor Co Ltd Vehicular image processing system, method, and program, and vehicle
US20100086174A1 (en) * 2007-04-19 2010-04-08 Marcin Michal Kmiecik Method of and apparatus for producing road information


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
BEHRINGER R ET AL: "Simultaneous estimation of pitch angle and lane width from the video image of a marked road", INTELLIGENT ROBOTS AND SYSTEMS '94. 'ADVANCED ROBOTIC SYSTEMS AND THE REAL WORLD', IROS '94. PROCEEDINGS OF THE IEEE/RSJ/GI INTERNATIONAL CO NFERENCE ON MUNICH, GERMANY 12-16 SEPT. 1994, NEW YORK, NY, USA,IEEE, vol. 2, 12 September 1994 (1994-09-12), pages 966 - 973, XP010141957, ISBN: 978-0-7803-1933-2, DOI: 10.1109/IROS.1994.407536 *
BERTOZZI M ET AL: "Real-time lane and obstacle detection on the GOLD system", INTELLIGENT VEHICLES SYMPOSIUM, 1996., PROCEEDINGS OF THE 1996 IEEE TOKYO, JAPAN 19-20 SEPT. 1996, NEW YORK, NY, USA,IEEE, US, 19 September 1996 (1996-09-19), pages 213 - 218, XP010209737, ISBN: 978-0-7803-3652-0, DOI: 10.1109/IVS.1996.566380 *
CHAN-YU HUANG ET AL: "Driver Assistance System Using Integrated Information from Lane Geometry and Vehicle Direction", INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE, 2007. ITSC 2007. IEEE, IEEE, PI, 1 September 2007 (2007-09-01), pages 986 - 991, XP031151492, ISBN: 978-1-4244-1395-9 *
FENG GAO ET AL: "Navigation line detection based on robotic vision in natural vegetation-embraced environment", IMAGE AND SIGNAL PROCESSING (CISP), 2010 3RD INTERNATIONAL CONGRESS ON, IEEE, PISCATAWAY, NJ, USA, 16 October 2010 (2010-10-16), pages 2596 - 2600, XP031810489, ISBN: 978-1-4244-6513-2 *


Also Published As

Publication number Publication date
JP2014219960A (en) 2014-11-20


Legal Events

Code Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14723105; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 14723105; Country of ref document: EP; Kind code of ref document: A1)