WO2017163368A1 - Travel path detection method and travel path detection device - Google Patents
- Publication number
- WO2017163368A1 (PCT/JP2016/059399)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lane
- lane change
- road
- vehicle
- travel path
- Prior art date
Classifications
- B60W30/12—Lane keeping
- B60W30/18163—Lane change; Overtaking manoeuvres
- G06T7/13—Edge detection
- G06T7/60—Analysis of geometric attributes
- G06T7/70—Determining position or orientation of objects or cameras
- G06V10/40—Extraction of image or video features
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
- G06T2207/30256—Lane; Road marking
Definitions
- The present invention relates to a travel path detection method and a travel path detection device.
- Conventionally, an apparatus that detects a traveling lane from a road surface image is known (Patent Document 1).
- In Patent Document 1, a horizontal edge histogram is first created from a plurality of edge points back-projected onto road surface coordinates. The peak positions of the edge histogram are then obtained, and the edge points contributing to each peak are extracted as one group, thereby detecting the lane markers.
- The present invention has been made in view of the above problem, and its object is to provide a travel path detection method and a travel path detection device that can detect a travel path without being affected by the change in the distance between the vehicle and the travel path feature points caused by a lane change.
- One aspect of the present invention is a travel path detection method for detecting a travel path boundary based on a plurality of travel path feature points detected by a target detection sensor mounted on a vehicle. When a lane change of the vehicle is detected, the continuity of the travel path feature points detected before the lane change is completed with the travel path feature points detected after the lane change is completed is determined in consideration of the lane change amount, and the travel path boundary is detected based on the continuity between the travel path feature points.
- According to this aspect, the continuity between the travel path feature points before and after the lane change is evaluated in accordance with the amount of movement in the vehicle width direction accompanying the lane change. Therefore, the travel path can be detected without being affected by the change in the distance between the vehicle and the travel path feature points caused by the lane change.
- FIG. 1 is a block diagram illustrating the configuration of a travel path detection device 1 according to the first embodiment.
- FIG. 2 is a flowchart showing an example of a travel path detection method using the travel path detection device 1 shown in FIG. 1.
- FIG. 3A is an overhead view showing the vehicle 51 traveling in the left lane of a two-lane road with a gentle right curve.
- FIG. 3B (a) is an overhead view showing an example of the second surrounding map generated from the first surrounding map of FIG. 3A, and FIG. 3B (b) is a graph showing an example of the histogram generated from the second surrounding map of FIG. 3B (a).
- FIG. 4A is an overhead view showing a so-called lane change in which the lane in which the vehicle 51 travels is changed from the right lane to the left lane on a two-lane road with a gentle right curve.
- FIG. 4B is an overhead view showing an example of the second surrounding map generated from the first surrounding map of FIG. 4A.
- FIG. 4C (a) shows a third map in which the travel path feature points FP detected before the lane change completion time (Tc) are deleted from the second surrounding map of FIG. 4B, and FIG. 4C (b) is a graph showing an example of the histogram generated from the third map.
- FIG. 5 is a flowchart illustrating an example of a travel path detection method using the travel path detection device 1 according to the second embodiment.
- FIG. 6A is an overhead view showing an example of the second surrounding map generated from the first surrounding map of FIG. 4A.
- FIG. 6B (a) shows a map obtained by offsetting, from the second surrounding map of FIG. 6A, the lane change amount (OF) included in the y coordinates of the travel path feature points FP detected before the lane change started, and FIG. 6B (b) is a graph showing an example of the histogram generated from that map.
- The travel path detection device 1 detects the boundary of the travel path on which the vehicle travels from the travel path feature points on the road surface detected by sensors mounted on the vehicle.
- The travel path detection device 1 includes a target detection sensor 11 mounted on the vehicle, a movement amount detection sensor 10 that detects the movement amount of the vehicle based on the movement speed and the yaw rate of the vehicle, and a travel path detection circuit 12 that detects a travel path boundary based on the plurality of travel path feature points detected by the target detection sensor 11 and the movement amount detected by the movement amount detection sensor 10.
- The target detection sensor 11 detects white lines (including lane markers) marked on the road surface around the vehicle.
- The target detection sensor 11 includes a camera 34 attached to the vehicle and an image processing circuit 35 that detects road markings, including white lines, from the digital images captured by the camera 34.
- A detected road marking is expressed as a feature point group consisting of a plurality of travel path feature points indicating its position.
- The image processing circuit 35 may detect, for example, locations where the brightness of the image changes sharply or discontinuously (luminance edges) as travel path feature points.
- The camera 34 has a wide-angle lens, is fixed to the vehicle with its imaging direction facing the front of the vehicle, and can capture a wide angle of view. Therefore, the camera 34 can also detect a white line (lane marker) that the vehicle straddles while changing lanes.
- the movement amount detection sensor 10 includes a wheel speed sensor 31, a yaw rate sensor 32, and a movement amount detection circuit 33.
- the wheel speed sensor 31 detects the rotational speed of the wheel provided in the vehicle.
- the yaw rate sensor 32 detects the yaw rate of the vehicle.
- The movement amount detection circuit 33 detects the movement amount of the vehicle over a predetermined time from the rotational speed of the wheels and the yaw rate of the vehicle.
- The movement amount of the vehicle includes, for example, the moving direction and the moving distance of the vehicle.
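As a rough sketch of how the moving direction and moving distance could be integrated from wheel speed and yaw rate, assuming a simple constant-velocity, constant-yaw-rate model over each interval (the function name and model are illustrative, not taken from the patent):

```python
import math

def integrate_motion(v, yaw_rate, dt):
    """Integrate vehicle speed v [m/s] and yaw rate [rad/s] over dt [s].

    Returns (dx, dy, dtheta): the displacement in the vehicle frame at
    the start of the interval and the heading change. Illustrative
    constant-velocity, constant-yaw-rate model.
    """
    dtheta = yaw_rate * dt
    if abs(yaw_rate) < 1e-9:          # straight-line motion
        return v * dt, 0.0, dtheta
    r = v / yaw_rate                  # turning radius
    dx = r * math.sin(dtheta)
    dy = r * (1.0 - math.cos(dtheta))
    return dx, dy, dtheta
```

The moving distance and direction follow directly from (dx, dy), which is all the surrounding map update below needs.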
- The travel path detection circuit 12 can be realized using a microcomputer including a CPU (Central Processing Unit), a memory, and an input/output unit.
- A computer program (travel path detection program) for causing the microcomputer to function as the travel path detection circuit 12 is installed in the microcomputer and executed.
- The microcomputer thereby functions as the travel path detection circuit 12.
- Although an example in which the travel path detection circuit 12 is realized by software is described here, it is of course also possible to configure the travel path detection circuit 12 with dedicated hardware for executing the information processing described below.
- The plurality of circuits (21, 24, 25) included in the travel path detection circuit 12 may each be configured by individual hardware.
- The travel path detection circuit 12 may also serve as an electronic control unit (ECU) used for other control related to the vehicle.
- The travel path detection circuit 12 includes a surrounding map generation circuit 21, a lane change detection circuit 25, and a travel path boundary estimation circuit 24.
- The surrounding map generation circuit 21 accumulates the plurality of travel path feature points detected by the target detection sensor 11 based on the movement amount of the vehicle. Specifically, the surrounding map generation circuit 21 connects the history of the feature point groups detected by the target detection sensor 11, using the movement amount of the vehicle between the times at which the feature point groups were detected, to generate a map around the vehicle (first surrounding map). In other words, the surrounding map generation circuit 21 connects the travel path feature points observed at different times while taking the movement amount of the vehicle into account. The detection history of the travel path feature points is thereby accumulated, and the first surrounding map is generated.
- The camera 34 images the road surface around the vehicle at predetermined time intervals.
- The movement amount detection sensor 10 detects the moving direction and the moving distance of the vehicle during this predetermined time.
- The surrounding map generation circuit 21 moves the positions of the travel path feature points by the moving distance of the vehicle in the direction opposite to the moving direction of the vehicle.
- By repeating the above and connecting the plurality of travel path feature points observed at different times in consideration of the movement amount of the vehicle, the surrounding map generation circuit 21 accumulates the detection history of the travel path feature points and generates the first surrounding map.
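The accumulation cycle above can be sketched as follows: each cycle, the stored points are shifted by the inverse of the vehicle's motion, then the new detections are appended. The helper name and the plain rigid 2-D transform are assumptions standing in for the map update.

```python
import numpy as np

def update_first_map(stored_pts, new_pts, dx, dy, dtheta):
    """Shift previously accumulated feature points into the current
    vehicle frame, then append this cycle's detections.

    stored_pts, new_pts: (N, 2) arrays of (x, y) points in the vehicle
    frame; dx, dy, dtheta: vehicle motion since the previous cycle.
    """
    c, s = np.cos(dtheta), np.sin(dtheta)
    # Inverse rigid transform: subtract the translation, then rotate
    # by -dtheta (row-vector convention).
    shifted = (stored_pts - np.array([dx, dy])) @ np.array([[c, -s],
                                                           [s,  c]])
    return np.vstack([shifted, new_pts])
```

Calling this once per camera frame with the odometry for that interval reproduces the "connect histories by movement amount" behaviour described above.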
- The lane change detection circuit 25 detects a lane change from the road surface image ahead of the vehicle captured by the camera 34. Specifically, since the imaging direction of the camera 34 is fixed with respect to the vehicle, whether the vehicle straddles a lane marker can be determined from the position of the lane marker in the image. The lane change detection circuit 25 detects a lane change when it determines that the vehicle crosses a lane marker. A lane change may be detected while the vehicle is actually straddling the marker, or at the point when straddling is predicted. Of course, the lane change detection circuit 25 may determine a lane change based not only on the image from the camera 34 but also on other information.
- For example, a lane change may be determined from the combination of the position of the vehicle on a map and the operating state of the direction indicator, or from the combination of the steering angle or turning angle and the operating state of the direction indicator. Furthermore, as described later, it may be determined from the continuity of the travel path feature points FP in the second surrounding map.
- FIG. 3A shows the three travel path boundaries (SKa, SKb, SKc) that define the two-lane road.
- The first surrounding map generated by the surrounding map generation circuit 21 includes the feature point groups (not shown) detected along the three travel path boundaries (SKa, SKb, SKc).
- Plane coordinates are used in which the position of the vehicle 51 is the origin, the traveling direction of the vehicle 51 is the x axis, and the vehicle width direction of the vehicle 51 is the y axis.
- The travel path boundary estimation circuit 24 detects travel path boundaries based on the plurality of accumulated travel path feature points, that is, the first surrounding map. Specifically, the continuity of the plurality of travel path feature points included in the first surrounding map is first determined, and the travel path boundaries are detected based on the continuity between the travel path feature points.
- The processing operation of the travel path boundary estimation circuit 24 will now be described in detail.
- The travel path boundary estimation circuit 24 determines the continuity of the plurality of travel path feature points based on the frequency of their coordinates in the vehicle width direction (y-axis direction). For example, the travel path boundary estimation circuit 24 creates a second surrounding map in which the position of the vehicle 51 is the origin, the vehicle width direction of the vehicle 51 is the y axis, and the axis orthogonal to the y axis is the time axis (t axis), without taking the movement amount of the vehicle 51 into account. As shown in FIG. 3B (a), the travel path boundary estimation circuit 24 plots the plurality of travel path feature points FP included in the first surrounding map of FIG. 3A on the second surrounding map based on their detection times (t) and their positions in the vehicle width direction (y coordinates).
- The travel path feature points FP are plotted along straight lines parallel to the t axis.
- The travel path boundary estimation circuit 24 votes the travel path feature points FP on the second surrounding map into a one-dimensional histogram along the y axis, as shown in FIG. 3B (b).
- From the histogram, the travel path boundary estimation circuit 24 can determine the continuity of the plurality of travel path feature points.
- The travel path boundary estimation circuit 24 detects the peaks (y coordinates) of the histogram and extracts travel path boundary point groups by grouping the travel path feature points FP on the second surrounding map for each peak. For example, grouping can be performed by assigning each travel path feature point FP voted into the histogram to its closest peak. By grouping the travel path feature points FP on the second surrounding map, grouping is easier than when the travel path feature points FP on the first surrounding map are grouped directly. Each group of travel path feature points FP constitutes one travel path boundary point group. In this way, the travel path boundary estimation circuit 24 can determine the continuity between the travel path feature points FP based on the frequency of their positions (y coordinates) in the vehicle width direction.
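The voting and nearest-peak grouping just described can be sketched like this; the bin width and vote threshold are illustrative values, not figures from the patent:

```python
import numpy as np

def group_by_histogram(y_coords, bin_width=0.5, min_votes=3):
    """Vote lateral (y) coordinates into a 1-D histogram, take local
    peaks, and assign every feature point to its nearest peak.

    Returns (peaks, labels): peak y positions, and for each input point
    the index of the peak it belongs to.
    """
    y = np.asarray(y_coords, dtype=float)
    edges = np.arange(y.min() - bin_width, y.max() + 2 * bin_width, bin_width)
    hist, edges = np.histogram(y, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # A peak: a bin with enough votes and no larger immediate neighbor.
    peaks = []
    for i in range(len(hist)):
        left = hist[i - 1] if i > 0 else 0
        right = hist[i + 1] if i < len(hist) - 1 else 0
        if hist[i] >= min_votes and hist[i] >= left and hist[i] >= right:
            peaks.append(centers[i])
    peaks = np.array(peaks)
    # Each point joins its closest peak, forming one boundary point group.
    labels = np.abs(y[:, None] - peaks[None, :]).argmin(axis=1)
    return peaks, labels
```

Each resulting label set corresponds to one travel path boundary point group in the text above.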
- The travel path boundary estimation circuit 24 can simultaneously extract a plurality of parallel travel path boundary point groups.
- Instead of using a histogram, the travel path boundary estimation circuit 24 can fit a plurality of curves to the boundary point groups by approximating the travel path feature points FP with curves using a known method, and can then determine whether the plurality of fitted curves are parallel.
- The travel path boundary estimation circuit 24 estimates the shape of each travel path boundary (SKa, SKb, SKc) from each of the extracted travel path boundary point groups. Specifically, the travel path boundary estimation circuit 24 estimates the shape of each travel path boundary (SKa, SKb, SKc) by fitting a curve expressed by a road model function to each travel path boundary point group in the first surrounding map.
- For example, when a cubic function is used as the road model function, the travel path boundary estimation circuit 24 calculates its coefficients a, b, c, and d.
- Function fitting by the least squares method may be used, but robust estimation such as RANSAC (Random sample consensus) may be used when more stability is desired.
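A least-squares fit of a cubic road model of the form y = a·x³ + b·x² + c·x + d (a plausible reading of the coefficients a–d above; the exact model function is not spelled out here) might look like:

```python
import numpy as np

def fit_road_model(xs, ys):
    """Fit y = a*x**3 + b*x**2 + c*x + d to one boundary point group by
    least squares. RANSAC could be substituted for robustness, as the
    text notes; plain least squares is shown for brevity.
    """
    a, b, c, d = np.polyfit(xs, ys, deg=3)
    return a, b, c, d
```

One such fit per extracted boundary point group yields the estimated shape of each travel path boundary.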
- In this way, the travel path boundary point groups can easily be extracted using the peaks (y coordinates) of the histogram, regardless of the shape of the travel path.
- FIG. 4A is an overhead view showing a so-called lane change in which the lane in which the vehicle 51 travels is changed from the right lane to the left lane on a two-lane road with a gentle right curve.
- The first surrounding map generated by the surrounding map generation circuit 21 includes the feature point groups (not shown) detected along the three travel path boundaries (SKa, SKb, SKc), as in FIG. 3A.
- The lateral position (y coordinate) of each travel path boundary (SKa, SKb, SKc) with respect to the vehicle 51 changes during the lane change period.
- In the second surrounding map, the movement amount of the vehicle 51 is not taken into consideration, so the y coordinates of the travel path feature points FP detected in the period from the start to the completion of the lane change (the period in which the lane change is performed) change, making the map different from FIG. 3B (a).
- As a result, a deviation occurs between the y coordinates of the travel path feature points FP detected before the lane change started and those detected after the lane change was completed. Therefore, even if the travel path feature points FP on the second surrounding map shown in FIG. 4B are voted into a one-dimensional histogram along the y axis, peaks like those in FIG. 3B (b) are not obtained, and it becomes difficult to accurately extract the travel path boundary point groups with the peaks as a reference.
- When a lane change is detected, the travel path boundary estimation circuit 24 estimates the travel path boundaries based on the continuity between the travel path feature points detected after the lane change is completed. Specifically, as shown in FIG. 4C (a), it generates a third surrounding map in which the travel path feature points FP detected before the time (Tc) at which the lane change was completed are deleted from the second surrounding map of FIG. 4B. Then, as shown in FIG. 4C (b), the travel path boundary estimation circuit 24 extracts the travel path boundary point groups using the peaks (y coordinates) of the histogram generated from the third surrounding map. Since the y coordinates of the travel path feature points FP detected after the lane change are almost constant, the travel path boundary point groups can easily be extracted using the peaks (y coordinates) of the histogram, in the same manner as in FIG. 3B.
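The deletion step that produces the third surrounding map amounts to a simple time filter; the (t, y) tuple representation of the map is an assumption for illustration:

```python
def make_third_map(second_map, tc):
    """Keep only travel path feature points detected at or after the
    lane change completion time tc.

    second_map: list of (t, y) pairs from the second surrounding map.
    """
    return [(t, y) for (t, y) in second_map if t >= tc]
```

The surviving points then feed the same histogram voting as in the no-lane-change case.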
- The travel path boundary estimation circuit 24 estimates the shape of each travel path boundary (SKa, SKb, SKc) by fitting a curve expressed by a road model function to each travel path boundary point group in the first surrounding map.
- The travel path boundary estimation circuit 24 sets the time (Tc) at which the lane change is completed based on the time at which the lane change detection circuit 25 detects the lane change. For example, the length of the lane change period is set to 5 seconds in advance, and the time at which a predetermined time (2.5 seconds) has elapsed from the time at which the lane change detection circuit 25 detected the lane change may be set as the time (Tc) at which the lane change is completed. The lane change period and the predetermined time can be adjusted according to the lane width included in the map information and the vehicle speed.
- In step S01, the lane change detection circuit 25 detects a lane change from the image of the road surface ahead of the vehicle captured by the camera 34, and sets a lane change flag. Specifically, a lane change is detected based on the y coordinates of the travel path feature points FP; a lane change may be detected when the sign (+/-) of the y coordinate of a travel path feature point FP is reversed. Proceeding to step S02, the travel path boundary estimation circuit 24 sets the point at which the lane change is completed from the speed of the vehicle at the time the lane change was detected. Specifically, the time (Tc) at which the lane change is completed is set based on the time at which the lane change detection circuit 25 detected the lane change.
- For example, the time 2.5 seconds after the time at which the lane change detection circuit 25 detected the lane change is set as the time (Tc) at which the lane change is completed. Note that it is not necessary to estimate the exact time (Tc) at which the lane change is completed; the length of the period treated as the lane change may simply be set sufficiently long. It is only necessary to suppress the influence on the shape estimation of the travel path boundaries in the first surrounding map, and the lane change completion time (Tc) can be estimated with sufficient accuracy by this simple method.
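The S01 sign-reversal check and the S02 completion-time setting can be sketched as follows; the function names and the fixed 2.5-second settle time follow the example in the text and are illustrative:

```python
def lane_change_detected(prev_y, curr_y):
    """True when the tracked lane marker's lateral coordinate changes
    sign between frames (the S01 heuristic)."""
    return prev_y * curr_y < 0

def completion_time(t_detect, settle=2.5):
    """Set Tc a fixed settle time after detection (S02); the text notes
    this could be adjusted by lane width and vehicle speed."""
    return t_detect + settle
```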
- In step S03, the travel path boundary estimation circuit 24 determines whether the lane change flag is set. If the lane change flag is set (YES in S03), it is determined that the lane change has started but has not been completed, and the process proceeds to step S04. On the other hand, if the lane change flag is not set (NO in S03), it is determined that a lane change is not in progress, and the process proceeds to step S05.
- In step S04, the travel path boundary estimation circuit 24 generates the third surrounding map (FIG. 4C (a)) by deleting the travel path feature points FP detected before the time (Tc) at which the lane change is completed from the second surrounding map of FIG. 4B.
- In step S05, the travel path boundary estimation circuit 24 votes the travel path feature points FP on the third surrounding map into a one-dimensional histogram along the y axis, as shown in FIG. 4C (b). If the lane change flag is not set, the histogram is created from the second surrounding map instead.
- In step S06, the travel path boundary estimation circuit 24 determines the continuity of the plurality of travel path feature points from the histogram. Specifically, the peaks (y coordinates) of the histogram are detected, and the travel path boundary point groups are extracted by grouping the travel path feature points FP for each peak.
- The travel path boundary estimation circuit 24 estimates the shape of each travel path boundary (SKa, SKb, SKc) by fitting a curve expressed by a road model function to each extracted travel path boundary point group in the first surrounding map.
- As described above, when the lane change of the vehicle is detected, the travel path boundary estimation circuit 24 determines, in consideration of the lane change amount (offset amount), the continuity of the travel path feature points FP detected before the lane change with the travel path feature points FP detected after the lane change is completed, and detects the travel path boundaries based on the continuity between the travel path feature points FP.
- In other words, the continuity between the travel path feature points FP before and after the lane change is evaluated in accordance with the amount of movement in the vehicle width direction (lane change amount) accompanying the lane change.
- In the first embodiment, the continuity between the travel path feature points FP is determined with the policy of "ignoring the travel path feature points FP detected before the lane change is completed". For this reason, the travel path can be detected without being affected by the change in the distance between the vehicle 51 and the travel path feature points FP accompanying the lane change.
- The travel path boundary estimation circuit 24 estimates the travel path boundaries based on the continuity between the travel path feature points FP detected after the lane change is completed. For this reason, even when the movement amount of the vehicle in the vehicle width direction due to the lane change (lane change amount) cannot be accurately estimated, the travel path boundaries can be estimated without being affected by the change in the lateral positions (y coordinates) of the travel path feature points FP before the lane change is completed.
- The travel path boundary estimation circuit 24 determines the continuity between the travel path feature points based on the frequency of their positions in the vehicle width direction.
- Specifically, the travel path boundary estimation circuit 24 expresses the travel path feature points on the surrounding map as a one-dimensional histogram accumulated along the y axis in the vehicle width direction of the vehicle.
- Each travel path boundary can then be regarded as a peak on the histogram, and the travel path shape can easily be estimated for each travel path boundary.
- The travel path boundary estimation circuit 24 sets the period from the start to the completion of the lane change based on the time at which the lane change of the vehicle 51 is detected.
- In the first embodiment, the time at which the lane change is completed is set based on the time at which the lane change of the vehicle 51 is detected. The travel path feature points FP detected after the lane change is completed can thereby be accurately identified.
- In the second embodiment, the travel path boundary estimation circuit 24 corrects the positions (y coordinates) in the vehicle width direction of the travel path feature points FP detected before the lane change started by the lane change amount, and estimates the travel path boundaries based on the continuity between the corrected travel path feature points. Specifically, the travel path boundary estimation circuit 24 detects the travel path boundaries by combining the corrected travel path feature points FP' with the travel path feature points FP detected after the lane change was completed. As in the first embodiment, the travel path boundary estimation circuit 24 excludes the travel path feature points FP detected between the start and the completion of the lane change.
- The block configuration of the travel path detection device 1 is the same as that in FIG. 1.
- FIG. 6A shows a second surrounding map generated from the first surrounding map shown in FIG. 4A, similarly to FIG. 4B.
- When the vehicle 51 changes lanes, the travel path boundary estimation circuit 24 generates the second surrounding map shown in FIG. 6A from the first surrounding map. Then, as shown in FIG. 6B (a), it moves the y coordinates of the travel path feature points FP detected before the time (Ts) at which the lane change started by the same amount as the lane change amount (OF), in the direction opposite to the lane change direction. The lane change amount (OF) included in the y coordinates of the travel path feature points FP detected before the time (Ts) at which the lane change started is thereby offset.
- In this way, the travel path boundary estimation circuit 24 generates the fourth surrounding map shown in FIG. 6B (a) from the second surrounding map of FIG. 6A.
- The offset amount (lane change amount) is the lane width of one lane and can be set in advance from an average lane width.
- The travel path boundary estimation circuit 24 performs the offset processing in order to group the travel path feature points. For this reason, some error is tolerated between the actual lane width and the offset amount (lane change amount), and the offset amount (lane change amount) can therefore be determined in advance.
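The offset correction of the second embodiment, dropping the points detected during the lane change and shifting the pre-change points by one preset lane width, can be sketched as follows; the (t, y) representation and the sign convention for `direction` are assumptions:

```python
def make_fourth_map(second_map, ts, tc, lane_width, direction):
    """Build the fourth surrounding map from (t, y) pairs.

    Points detected during the lane change (ts <= t < tc) are dropped;
    points detected before ts are shifted one lane width opposite to
    the lane change direction (direction = +1 for a change toward +y).
    """
    out = []
    for t, y in second_map:
        if t < ts:
            out.append((t, y - direction * lane_width))  # corrected FP'
        elif t >= tc:
            out.append((t, y))   # detected after completion: keep as-is
        # ts <= t < tc: discarded
    return out
```

Because a preset average lane width is used, the corrected FP' only approximately align with the post-change FP, which is exactly the tolerance the text describes.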
- As shown in FIG. 6B (b), the travel path boundary estimation circuit 24 extracts the travel path boundary point groups using the peaks (y coordinates) of the histogram generated from the fourth surrounding map.
- The y coordinates of the corrected travel path feature points FP' substantially coincide with the y coordinates of the travel path feature points FP detected after the lane change was completed. For this reason, the travel path boundary point groups can easily be extracted using the peaks (y coordinates) of the histogram, as in FIG. 3B.
- The travel path boundary estimation circuit 24 estimates the shape of each travel path boundary (SKa, SKb, SKc) by fitting a curve expressed by a road model function to each travel path boundary point group.
- The travel path boundary estimation circuit 24 sets the time (Ts) at which the lane change started based on the time at which the lane change detection circuit 25 detects the lane change. For example, the length of the lane change period is set in advance to 5 seconds, and the time a predetermined time (2.5 seconds) before the time at which the lane change detection circuit 25 detected the lane change may be set as the time (Ts) at which the lane change started. The lane change period and the predetermined time can be adjusted according to the lane width included in the map information and the vehicle speed.
- In the second embodiment, step S10 is executed instead of step S02 of FIG. 2, and steps S11 and S12 are executed instead of step S04 of FIG. 2. The other steps S01, S03, and S05 to S07 are the same as those in FIG. 2.
- In step S10, the travel path boundary estimation circuit 24 sets the point at which the lane change started and the point at which the lane change is completed based on the speed of the vehicle when the lane change was detected. Specifically, based on the time at which the lane change detection circuit 25 detected the lane change, the time (Ts) at which the lane change started and the time (Tc) at which the lane change is completed are set. For example, the travel path boundary estimation circuit 24 sets the time 2.5 seconds before the time at which the lane change detection circuit 25 detected the lane change as the time (Ts) at which the lane change started, and the time 2.5 seconds after the detection as the time (Tc) at which the lane change is completed.
- Ts time at which strict lane change is started
- The length of the period treated as the lane change may simply be set sufficiently long; it only needs to suppress the influence on the road boundary shape estimation in the first surrounding map, so the lane change start time (Ts) can be estimated with sufficient accuracy by this simple method.
- If the lane change flag is set (YES in step S03), the process proceeds to step S11.
- The y coordinate of the road feature point FP changes during the lane change, so there is a gap between the y coordinates of the road feature points FP detected before the lane change starts and those detected after it is completed.
- In step S11, the road boundary estimation circuit 24 deletes, from the second surrounding map of FIG. 6A, the road feature points FP detected between the time (Ts) at which the lane change starts and the time (Tc) at which it is completed.
- the lane change flag is raised.
- As shown in FIG. 6B(a), the road boundary estimation circuit 24 further moves the y coordinates of the road feature points FP detected before the time (Ts) at which the lane change starts in the opposite direction by the same amount as the lane change amount (OF). The lane change amount (OF) contained in the y coordinates of those road feature points FP is thereby offset.
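The deletion and offset steps can be sketched together. This is a minimal illustration, not the patented implementation; the function name, the `(t, x, y)` tuple layout, and the sign convention for the lane change amount OF are assumptions (the correction moves pre-change points opposite to the vehicle's movement in the vehicle width direction).

```python
def correct_feature_points(points, t_start, t_complete, lane_change_amount):
    """Sketch of the road feature point (FP) correction around a lane change.

    points: iterable of (t, x, y) road feature points.
    Points detected during the lane change (Ts <= t <= Tc) are deleted;
    points detected before Ts have their y coordinate moved opposite to
    the lane change amount OF so they line up with post-change points.
    """
    corrected = []
    for t, x, y in points:
        if t < t_start:
            # offset the pre-change point by -OF (sign convention assumed)
            corrected.append((t, x, y - lane_change_amount))
        elif t > t_complete:
            corrected.append((t, x, y))  # post-change points kept as-is
        # Ts <= t <= Tc: detected mid-change, discarded
    return corrected
```

After this correction the pre-change points FP′ and the post-change points FP share substantially the same y coordinates, which is what allows them to vote into the same histogram bins in the next step.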
- As shown in FIG. 6B(b), the road boundary estimation circuit 24 votes the corrected road feature points FP′ of FIG. 6B(a) and the road feature points FP detected after the time (Tc) at which the lane change is completed onto a one-dimensional histogram along the y axis.
- The road boundary estimation circuit 24 detects the peaks (y coordinates) of the histogram shown in FIG. 6B(b) and, for each peak, groups the road feature points (FP, FP′) on the fourth surrounding map to extract a road boundary point group.
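The histogram voting and peak grouping might look like the following sketch. The bin width and vote threshold are illustrative assumptions; the patent does not specify concrete values, and a real implementation would likely use proper local-maximum detection rather than a simple count threshold.

```python
from collections import Counter

def extract_boundary_groups(ys, bin_width=0.5, min_votes=3):
    """Vote the y coordinates of road feature points into a 1-D histogram
    along the y axis. Each bin whose vote count reaches min_votes is
    treated as a peak, and the indices of the points that voted for it
    form one road boundary point group."""
    bins = [round(y / bin_width) for y in ys]   # quantize y onto bins
    counts = Counter(bins)                       # histogram votes
    groups = {}
    for i, b in enumerate(bins):
        if counts[b] >= min_votes:               # bin is a peak
            groups.setdefault(b, []).append(i)
    return [groups[b] for b in sorted(groups)]   # one group per peak
```

Points whose bin falls below the threshold (isolated outliers) simply do not join any boundary point group.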
- In the first surrounding map, the road boundary estimation circuit 24 estimates the shape of each road boundary (SKa, SKb, SKc) by fitting a curve expressed by a road model function to each extracted road boundary point group.
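The curve fitting can be sketched as below, assuming a cubic polynomial as the road model function (the description elsewhere mentions a cubic function); `numpy.polyfit` stands in for whatever least-squares routine the circuit actually uses.

```python
import numpy as np

def fit_road_model(points, degree=3):
    """Fit y = a*x^3 + b*x^2 + c*x + d to one road boundary point group.

    points: iterable of (x, y) pairs belonging to a single boundary.
    Returns the polynomial coefficients, highest degree first.
    """
    xs = np.array([p[0] for p in points], dtype=float)
    ys = np.array([p[1] for p in points], dtype=float)
    return np.polyfit(xs, ys, degree)
```

Fitting one such curve per boundary point group yields the estimated shapes of the boundaries SKa, SKb, and SKc.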
- The road boundary estimation circuit 24 thus relates the road feature points FP detected after the lane change is completed to those detected before it by taking the lane change amount (offset amount) into account.
- The continuity of the detected road feature points FP is determined, and a road boundary is detected based on the continuity between the road feature points FP.
- The continuity between the road feature points FP before and after the lane change is judged in accordance with the amount of movement in the vehicle width direction (lane change amount) that accompanies the lane change.
- In other words, the continuity between the road feature points FP is determined while ignoring the road feature points FP detected between the start and completion of the lane change, and while moving the positions of the road feature points FP detected before the lane change starts by the offset amount. The travel lane can therefore be detected without being influenced by the change in the distance between the vehicle 51 and the road feature points FP that accompanies the lane change.
- When the vehicle changes lanes, the distance between the vehicle and a road feature point changes by the amount (OF) that the vehicle moves in the vehicle width direction. The distance between the vehicle and the road feature point is therefore corrected by that movement amount (OF).
- OF: amount of movement in the vehicle width direction (lane change amount)
- The road boundary estimation circuit 24 sets the period from the start of the lane change to its completion based on the time at which the lane change of the vehicle 51 is detected.
- That is, it sets the time (Ts) at which the lane change starts and the time (Tc) at which it is completed. The road feature points FP detected before the lane change starts and those detected after it is completed can thereby be identified accurately.
- The processing circuit includes a programmed processing device, such as a processing device including electrical circuitry.
- Processing devices also include devices such as application specific integrated circuits (ASICs) and conventional circuit components arranged to perform the functions described in the embodiments.
- ASICs: application-specific integrated circuits
- The embodiments exemplify the stand-alone travel lane detection device 1 including the movement amount detection sensor 10 and the target detection sensor 11, but the travel lane detection device may instead be realized as a client-server model using a computer network via a wireless communication network.
- In that case, the vehicle 51 (client) including the movement amount detection sensor 10 and the target detection sensor 11 is connected to the travel lane detection device (server) via the computer network.
- The server provided with the travel lane detection circuit 12 shown in FIG. 1 can thus be connected to the movement amount detection sensor 10 and the target detection sensor 11 via the computer network.
- In this configuration, the travel lane detection device mainly consists of the travel lane detection circuit 12 (server), and the movement amount detection sensor 10 and the target detection sensor 11 are not included in the travel lane detection device.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Traffic Control Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Embodiments will now be described in detail with reference to the drawings.
The road boundary estimation circuit 24 according to the second embodiment corrects the positions in the vehicle width direction (y coordinates) of the road feature points FP detected before the lane change starts by the lane change amount, and estimates the road boundary based on the continuity between the corrected road feature points. Specifically, the road boundary estimation circuit 24 detects the road boundary by combining the corrected road feature points FP′ with the road feature points FP detected after the lane change is completed. As in the first embodiment, the road boundary estimation circuit 24 excludes the road feature points FP detected between the start and completion of the lane change. The block configuration of the travel lane detection device 1 is the same as in FIG. 1, and its illustration and description are omitted.
10 movement amount detection sensor
11 target detection sensor
12 travel lane detection circuit
24 road boundary estimation circuit
25 lane change detection circuit
51 vehicle
FP, FP′ road feature points
OF lane change amount (offset amount)
Claims (6)
- A travel lane detection method using a travel lane detection circuit that accumulates a plurality of road feature points, detected by a target detection sensor mounted on a vehicle, based on an amount of movement of the vehicle, and detects a road boundary based on the accumulated road feature points, wherein, when a lane change of the vehicle is detected, the travel lane detection circuit determines, in consideration of a lane change amount, the continuity of the road feature points detected before the lane change is completed with respect to the road feature points detected after the lane change is completed, and detects the road boundary based on the continuity between the road feature points.
- The travel lane detection method according to claim 1, wherein the travel lane detection circuit corrects the positions in the vehicle width direction of the road feature points detected before the lane change starts by the amount of movement of the vehicle in the vehicle width direction due to the lane change, and estimates the road boundary based on the continuity between the corrected road feature points.
- The travel lane detection method according to claim 1, wherein the travel lane detection circuit estimates the road boundary based on the continuity between the road feature points detected after the lane change is completed.
- The travel lane detection method according to any one of claims 1 to 3, wherein the travel lane detection circuit determines the continuity between the road feature points based on the frequency of the positions of the road feature points in the vehicle width direction.
- The travel lane detection method according to any one of claims 1 to 4, wherein the travel lane detection circuit sets a period from the start to the completion of the lane change based on a time at which the lane change of the vehicle is detected.
- A travel lane detection device comprising: a surrounding map generation circuit that accumulates a plurality of road feature points, detected by a target detection sensor mounted on a vehicle, based on an amount of movement of the vehicle; and a road boundary estimation circuit that detects a road boundary based on the accumulated road feature points, wherein, when a lane change of the vehicle is detected, the road boundary estimation circuit determines, in consideration of a lane change amount, the continuity of the road feature points detected before the lane change is completed with respect to the road feature points detected after the lane change is completed, and detects the road boundary based on the continuity between the road feature points.
Priority Applications (10)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP16895404.8A EP3435353B1 (en) | 2016-03-24 | 2016-03-24 | Travel path detection method and travel path detection device |
JP2018506708A JP6658868B2 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
MX2018011510A MX369231B (es) | 2016-03-24 | 2016-03-24 | Metodo de deteccion de carril de circulacion y dispositivo de deteccion de carril de circulacion. |
BR112018069501-7A BR112018069501B1 (pt) | 2016-03-24 | 2016-03-24 | Método de detecção de pista de deslocamento e dispositivo de detecção de pista de deslocamento |
PCT/JP2016/059399 WO2017163368A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
RU2018137213A RU2695011C1 (ru) | 2016-03-24 | 2016-03-24 | Способ (варианты) и устройство обнаружения полос движения |
KR1020187028452A KR20180122382A (ko) | 2016-03-24 | 2016-03-24 | 주로 검출 방법 및 주로 검출 장치 |
CA3018663A CA3018663C (en) | 2016-03-24 | 2016-03-24 | Travel lane detection method and travel lane detection device |
US16/087,422 US10275666B2 (en) | 2016-03-24 | 2016-03-24 | Travel lane detection method and travel lane detection device |
CN201680083846.3A CN109074741B (zh) | 2016-03-24 | 2016-03-24 | 行进路检测方法及行进路检测装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/059399 WO2017163368A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017163368A1 true WO2017163368A1 (ja) | 2017-09-28 |
Family
ID=59900074
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/059399 WO2017163368A1 (ja) | 2016-03-24 | 2016-03-24 | 走路検出方法及び走路検出装置 |
Country Status (10)
Country | Link |
---|---|
US (1) | US10275666B2 (ja) |
EP (1) | EP3435353B1 (ja) |
JP (1) | JP6658868B2 (ja) |
KR (1) | KR20180122382A (ja) |
CN (1) | CN109074741B (ja) |
BR (1) | BR112018069501B1 (ja) |
CA (1) | CA3018663C (ja) |
MX (1) | MX369231B (ja) |
RU (1) | RU2695011C1 (ja) |
WO (1) | WO2017163368A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020087001A (ja) * | 2018-11-27 | 2020-06-04 | 株式会社デンソー | 車線位置情報出力装置 |
US11093761B2 (en) * | 2019-03-06 | 2021-08-17 | GM Global Technology Operations LLC | Lane position sensing and tracking in a vehicle |
US11584371B2 (en) | 2020-07-15 | 2023-02-21 | Toyota Research Institute, Inc. | Systems and methods for using R-functions and semi-analytic geometry for lane keeping in trajectory planning |
RU210463U1 (ru) * | 2021-11-17 | 2022-04-15 | федеральное государственное бюджетное образовательное учреждение высшего образования "Новгородский государственный университет имени Ярослава Мудрого" | Телевизионное устройство для измерения координат изображения объекта |
CN114396933B (zh) * | 2021-12-31 | 2024-03-08 | 广州小鹏自动驾驶科技有限公司 | 一种车道拓扑构建方法、装置、车辆和存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003203298A (ja) * | 2002-12-11 | 2003-07-18 | Honda Motor Co Ltd | 走行区分線認識装置を備えた自動走行車両 |
JP2005100000A (ja) * | 2003-09-24 | 2005-04-14 | Aisin Seiki Co Ltd | 路面走行レーン検出装置 |
JP2007241468A (ja) * | 2006-03-06 | 2007-09-20 | Toyota Motor Corp | 車線変更検出装置 |
JP2010102427A (ja) * | 2008-10-22 | 2010-05-06 | Nec Corp | 車線区画線検出装置、車線区画線検出方法、及び車線区画線検出プログラム |
JP2014076689A (ja) * | 2012-10-09 | 2014-05-01 | Toyota Motor Corp | 車両制御装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4819169A (en) * | 1986-09-24 | 1989-04-04 | Nissan Motor Company, Limited | System and method for calculating movement direction and position of an unmanned vehicle |
GB9317983D0 (en) * | 1993-08-28 | 1993-10-13 | Lucas Ind Plc | A driver assistance system for a vehicle |
JP3209671B2 (ja) | 1995-11-27 | 2001-09-17 | 富士通テン株式会社 | カーブ路判定装置 |
JP2002029347A (ja) | 2000-07-17 | 2002-01-29 | Honda Motor Co Ltd | 車両用走行区分線検出装置 |
US6894606B2 (en) * | 2000-11-22 | 2005-05-17 | Fred Forbes | Vehicular black box monitoring system |
US6819779B1 (en) * | 2000-11-22 | 2004-11-16 | Cognex Corporation | Lane detection system and apparatus |
JP2003322522A (ja) * | 2002-05-07 | 2003-11-14 | Daihatsu Motor Co Ltd | 車間距離検出装置及び検出方法 |
JP4822099B2 (ja) * | 2005-07-11 | 2011-11-24 | アイシン・エィ・ダブリュ株式会社 | ナビゲーション装置及びナビゲーション方法 |
JP5321497B2 (ja) * | 2010-02-22 | 2013-10-23 | 株式会社デンソー | 白線認識装置 |
US20120022739A1 (en) * | 2010-07-20 | 2012-01-26 | Gm Global Technology Operations, Inc. | Robust vehicular lateral control with front and rear cameras |
JP5591613B2 (ja) * | 2010-07-27 | 2014-09-17 | 株式会社小糸製作所 | 車両検出装置 |
DE102010048760A1 (de) * | 2010-09-17 | 2011-07-28 | Daimler AG, 70327 | Verfahren zur Erzeugung eines Straßenmodells |
RU2571871C2 (ru) * | 2012-12-06 | 2015-12-27 | Александр ГУРЕВИЧ | Способ определения границ дороги, формы и положения объектов, находящихся на дороге, и устройство для его выполнения |
JP6141788B2 (ja) * | 2014-04-14 | 2017-06-07 | 本田技研工業株式会社 | レーンマーク認識装置 |
CN103942960B (zh) * | 2014-04-22 | 2016-09-21 | 深圳市宏电技术股份有限公司 | 一种车辆变道检测方法及装置 |
US20170089717A1 (en) * | 2015-09-29 | 2017-03-30 | Garmin Switzerland Gmbh | Use of road lane data to improve traffic probe accuracy |
2016
- 2016-03-24 EP EP16895404.8A patent/EP3435353B1/en active Active
- 2016-03-24 KR KR1020187028452A patent/KR20180122382A/ko active Search and Examination
- 2016-03-24 MX MX2018011510A patent/MX369231B/es active IP Right Grant
- 2016-03-24 CA CA3018663A patent/CA3018663C/en active Active
- 2016-03-24 WO PCT/JP2016/059399 patent/WO2017163368A1/ja active Application Filing
- 2016-03-24 CN CN201680083846.3A patent/CN109074741B/zh active Active
- 2016-03-24 BR BR112018069501-7A patent/BR112018069501B1/pt active IP Right Grant
- 2016-03-24 JP JP2018506708A patent/JP6658868B2/ja active Active
- 2016-03-24 US US16/087,422 patent/US10275666B2/en active Active
- 2016-03-24 RU RU2018137213A patent/RU2695011C1/ru active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003203298A (ja) * | 2002-12-11 | 2003-07-18 | Honda Motor Co Ltd | 走行区分線認識装置を備えた自動走行車両 |
JP2005100000A (ja) * | 2003-09-24 | 2005-04-14 | Aisin Seiki Co Ltd | 路面走行レーン検出装置 |
JP2007241468A (ja) * | 2006-03-06 | 2007-09-20 | Toyota Motor Corp | 車線変更検出装置 |
JP2010102427A (ja) * | 2008-10-22 | 2010-05-06 | Nec Corp | 車線区画線検出装置、車線区画線検出方法、及び車線区画線検出プログラム |
JP2014076689A (ja) * | 2012-10-09 | 2014-05-01 | Toyota Motor Corp | 車両制御装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3435353A4 * |
Also Published As
Publication number | Publication date |
---|---|
JP6658868B2 (ja) | 2020-03-04 |
EP3435353A1 (en) | 2019-01-30 |
EP3435353B1 (en) | 2022-03-02 |
CN109074741B (zh) | 2021-08-10 |
CN109074741A (zh) | 2018-12-21 |
MX369231B (es) | 2019-11-01 |
JPWO2017163368A1 (ja) | 2019-03-14 |
CA3018663C (en) | 2019-04-30 |
RU2695011C1 (ru) | 2019-07-18 |
BR112018069501A2 (pt) | 2019-01-29 |
MX2018011510A (es) | 2019-01-10 |
EP3435353A4 (en) | 2019-03-27 |
CA3018663A1 (en) | 2017-09-28 |
KR20180122382A (ko) | 2018-11-12 |
US10275666B2 (en) | 2019-04-30 |
US20190102632A1 (en) | 2019-04-04 |
BR112018069501B1 (pt) | 2024-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017163368A1 (ja) | 走路検出方法及び走路検出装置 | |
JP4962581B2 (ja) | 区画線検出装置 | |
EP2040196A1 (en) | Road shape estimating device | |
JP6272204B2 (ja) | 道路区画線情報取得装置 | |
JP6617828B2 (ja) | 走路検出方法及び走路検出装置 | |
JP2012159469A (ja) | 車両用画像認識装置 | |
JP7024176B2 (ja) | 走路検出方法及び走路検出装置 | |
JP2017078607A (ja) | 車両位置推定装置及びプログラム | |
JP2013190842A (ja) | レーンマーク検出装置およびレーンマーク検出方法 | |
JP2012159470A (ja) | 車両用画像認識装置 | |
JP2019190847A (ja) | ステレオカメラ装置 | |
JP6243319B2 (ja) | 逸脱判定装置 | |
JP6666826B2 (ja) | 地図データ作成装置、地図データ作成方法およびプログラム | |
JP6416654B2 (ja) | 白線検出装置 | |
JP2015121954A (ja) | 輝度値算出装置及び車線検出システム | |
JP2010002334A (ja) | 路面勾配推定装置 | |
JP5959245B2 (ja) | レーンマーク検出装置およびレーンマークの信頼度算出方法 | |
JP2012089005A (ja) | 画像処理装置、及び画像処理方法 | |
JP2018072637A (ja) | オルソ画像作成装置、オルソ画像作成方法およびプログラム | |
JP6334773B2 (ja) | ステレオカメラ | |
KR20160107474A (ko) | 부품 이미지 보정 방법 | |
JP3329321B2 (ja) | 走行車両検出装置および走行車両検出方法 | |
JP2014186515A (ja) | 走行路検出装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018506708 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 3018663 Country of ref document: CA |
|
WWE | Wipo information: entry into national phase |
Ref document number: MX/A/2018/011510 Country of ref document: MX |
|
ENP | Entry into the national phase |
Ref document number: 20187028452 Country of ref document: KR Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112018069501 Country of ref document: BR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016895404 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016895404 Country of ref document: EP Effective date: 20181024 |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16895404 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 112018069501 Country of ref document: BR Kind code of ref document: A2 Effective date: 20180924 |