WO2018030159A1 - Recognition device and recognition method - Google Patents
Recognition device and recognition method
- Publication number
- WO2018030159A1 (PCT/JP2017/027125)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- moving body
- line
- correction
- range
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/025—Active steering aids, e.g. helping the driver by actively influencing the steering system after environment evaluation
- B62D15/0265—Automatic obstacle avoidance by steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D15/00—Steering not otherwise provided for
- B62D15/02—Steering position indicators ; Steering position determination; Steering aids
- B62D15/029—Steering assistants using warnings or proposing actions to the driver without influencing the steering system
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/588—Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/402—Type
- B60W2554/4029—Pedestrians
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/801—Lateral distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/802—Longitudinal distance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/80—Spatial relation or speed relative to objects
- B60W2554/804—Relative longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2710/00—Output or target parameters relating to a particular sub-units
- B60W2710/20—Steering systems
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present disclosure relates to a recognition device that recognizes a travel range in which a vehicle travels, and a recognition method that is executed by the recognition device.
- Conventionally, the shape of a travel range in which a vehicle travels is recognized by an imaging device, a radar device (e.g., a radar sensor or a laser scanner), or the like.
- Deviation from the traveling range of the host vehicle is suppressed by applying torque to the steering device so that the host vehicle moves toward the center of the traveling range.
- However, due to erroneous recognition, the shape of the recognized travel range may differ from the actual travel range. When control for suppressing departure from the travel range is based on a misrecognized shape, an unnecessary operation may occur in which the control intervenes even though the vehicle is not departing, or is unlikely to depart, from the travel range. Conversely, a non-operation may occur in which the control is not executed even though the vehicle is departing, or is likely to depart, from the travel range.
- The recognition device described in Patent Document 1 suppresses such erroneous recognition of the travel range.
- In the recognition device of Patent Document 1, the road shape recognized by the imaging device, the radar device, or the like is corrected in order to suppress erroneous recognition of the road shape.
- Specifically, in addition to the road shape, the recognition device of Patent Document 1 acquires the position of a stationary object existing ahead of the host vehicle in the traveling direction, and corrects the road shape based on the detected position of the stationary object. By correcting the road shape in this way, both the non-operation and the unnecessary operation of the control that suppresses departure from the travel range are suppressed.
- However, the scenes in which the road shape can be corrected in this way are limited.
- A stationary object is not necessarily present at the end of the travel range, and if none is present, the end of the travel range cannot be corrected.
- Moreover, if a stationary object is present at a position separated from the end of the travel range and the end is corrected based on that object's position, the corrected end may deviate from the actual travel range. In that case, the above-described problems with control for suppressing departure from the travel range may remain difficult to solve.
- The present disclosure has been made to solve the above-described problems, and its main purpose is to provide a recognition device that can appropriately recognize the travel range in which the vehicle travels.
- The present disclosure provides a recognition device that is mounted on a vehicle and recognizes a travel range in which the vehicle travels. The device includes: an end detection unit that detects an end line defining one end of the travel range in a lateral direction perpendicular to the traveling direction of the vehicle; an object detection unit that detects the position of a moving body existing ahead of the vehicle in the traveling direction; and a correction unit that, when the moving body is positioned between the vehicle and the end line in the lateral direction, corrects the position of the end line detected by the end detection unit to a lateral position between the vehicle and the moving body or within the width of the moving body.
- When a moving body exists ahead of the vehicle in the traveling direction, the region on the vehicle side of the moving body's lateral position is a range in which the vehicle can overtake or pass the moving body, and the position of the moving body itself is a range in which the vehicle can travel by following the moving body. That is, both the vehicle side of the moving body's position and that position itself lie within the range in which the vehicle can travel. In the above configuration, when the end line forming one end of the travel range is detected, the correction unit corrects its position to between the vehicle and the moving body or to the position of the moving body. Thereby, even if a positional shift has occurred between the detected end line and the actual end line, one end of the range in which the vehicle can travel can be appropriately defined.
- FIG. 1 is a configuration diagram of a driving assistance ECU that functions as a recognition device.
- FIG. 2 is a diagram for explaining processing when there is no preceding vehicle in the first embodiment.
- FIG. 3 is a diagram for explaining processing when a preceding vehicle exists in the first embodiment.
- FIG. 4 is a diagram illustrating processing when the road is a curve in the first embodiment.
- FIG. 5 is a flowchart showing the processing according to the first embodiment.
- FIG. 6 is a diagram for explaining processing according to the second embodiment.
- FIG. 7 is a diagram for explaining processing according to the third embodiment.
- FIG. 8 is a diagram for explaining processing according to the fourth embodiment.
- FIG. 9 is a diagram illustrating another example of processing according to the fourth embodiment.
- The recognition device is mounted on a vehicle (own vehicle) and recognizes a travel range in which the vehicle travels.
- The structure of the system including the driving support ECU 20, which serves as the recognition device, will be described.
- The imaging device 11 includes a monocular camera or a stereo camera using, for example, a CCD image sensor, a CMOS image sensor, or a near-infrared sensor.
- The imaging device 11 is attached, for example, near the upper end of the windshield of the own vehicle and near the center in the vehicle width direction, and photographs a region that spreads over a predetermined angle range ahead of the own vehicle. The imaging device 11 transmits the captured image to the driving support ECU 20.
- the radar device 12 is, for example, a known millimeter wave radar that uses a high frequency signal in the millimeter wave band as a transmission wave.
- The radar device 12 is provided at the front end of the own vehicle, and detects the position of a target within a region spanning a predetermined detection angle. Specifically, an exploration wave is transmitted at a predetermined period, and the reflected wave is received by a plurality of antennas. The distance to the target is calculated from the transmission time of the exploration wave and the reception time of the reflected wave, and the relative velocity is calculated from the frequency of the wave reflected by the target, which is shifted by the Doppler effect.
- The azimuth of the target is calculated from the phase difference of the reflected waves received by the plurality of antennas. If the distance and azimuth of the target can be calculated, the relative position of the target with respect to the host vehicle can be specified.
- Every predetermined period, the radar device 12 transmits an exploration wave, receives the reflected wave, and calculates the reflection position and relative speed, then transmits the calculated reflection position and relative speed to the driving support ECU 20.
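The range, relative-speed, and relative-position calculations described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the function names and the coordinate convention (x lateral, y longitudinal) are assumptions.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def target_range(t_tx: float, t_rx: float) -> float:
    """Distance to the target from the round-trip time of flight."""
    return C * (t_rx - t_tx) / 2.0

def relative_speed(f_tx: float, f_rx: float) -> float:
    """Relative radial speed from the Doppler shift of the reflected wave.
    Positive means the target is approaching (assumed sign convention)."""
    return C * (f_rx - f_tx) / (2.0 * f_tx)

def relative_position(r: float, azimuth_rad: float) -> tuple:
    """Range plus azimuth (from the inter-antenna phase difference)
    gives the target position relative to the host vehicle."""
    return (r * math.sin(azimuth_rad), r * math.cos(azimuth_rad))
```

For example, a reflection received 2 microseconds after transmission corresponds to a target roughly 300 m ahead.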
- The driving support ECU 20 is a computer including a CPU, ROM, RAM, I/O, and the like, and implements each function by the CPU executing a program installed in the ROM.
- the image recognition unit 21 of the driving assistance ECU 20 extracts feature points indicating moving bodies, road structures, travel lane markings, and the like existing around the host vehicle from the image acquired from the imaging device 11. Specifically, an edge point is extracted based on the luminance information of the image acquired from the imaging device, and Hough transform is performed on the extracted edge point.
- By the Hough transform, for example, points on a straight line along which a plurality of edge points are continuously arranged, or points where such straight lines intersect orthogonally, are extracted as feature points.
- The object detection unit 22 performs pattern matching between a feature point group composed of a plurality of feature points acquired from the image recognition unit 21 and feature point group patterns stored in advance, and extracts the object corresponding to the feature point group.
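The edge-point extraction and Hough voting described above might be sketched as follows. This is a minimal illustrative version (gradient thresholding plus a (rho, theta) accumulator), not the image recognition unit's actual algorithm; thresholds and bin counts are assumed values.

```python
import numpy as np

def edge_points(gray: np.ndarray, thresh: float = 50.0) -> np.ndarray:
    """Extract (row, col) edge points where the horizontal luminance
    gradient exceeds a threshold."""
    gx = np.abs(np.diff(gray.astype(float), axis=1))
    rows, cols = np.nonzero(gx > thresh)
    return np.stack([rows, cols], axis=1)

def hough_lines(points: np.ndarray, n_theta: int = 180,
                n_rho: int = 200, rho_max: float = 400.0):
    """Vote each edge point into a (rho, theta) accumulator; peaks
    correspond to straight runs of edge points, i.e. candidate lines."""
    acc = np.zeros((n_rho, n_theta), dtype=int)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    for r, c in points:
        rho = c * np.cos(thetas) + r * np.sin(thetas)
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)
        acc[idx[ok], np.arange(n_theta)[ok]] += 1
    return acc, thetas
```

A vertical luminance step in the image produces a column of edge points, and the accumulator peak then collects one vote from every point on that line.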
- the reflection position acquired from the radar apparatus 12 is converted into coordinates in the image acquired from the imaging apparatus 11, and the reflection position and relative velocity acquired from the radar apparatus 12 are associated with an object in the image.
- the information acquired from the radar device 12 includes the relative speed with the own vehicle in addition to the reflection position. Therefore, it is determined whether the object extracted from the image by the image recognition unit 21 moves in the same direction as the own vehicle or moves in the opposite direction to the own vehicle.
- If the object is a vehicle traveling in the same direction as the own vehicle, it is treated as a preceding vehicle or a parallel-running vehicle; if it is a vehicle traveling in the direction opposite to the own vehicle, it is treated as an oncoming vehicle.
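A possible sketch of this classification, assuming the radar reports relative speed as (object speed minus host speed) along the travel direction; the sign convention and function name are assumptions, not taken from the patent:

```python
def classify_vehicle(v_ego: float, v_rel: float) -> str:
    """Recover the object's ground speed from the radar's relative speed
    and classify it by direction relative to the host vehicle."""
    v_obj = v_ego + v_rel  # ground speed of the detected vehicle
    if v_obj * v_ego > 0:
        return "same_direction"   # preceding or parallel-running vehicle
    if v_obj * v_ego < 0:
        return "oncoming"         # traveling opposite to the host
    return "stationary"
```

For a host at 20 m/s, a relative speed of -5 m/s implies a slower vehicle ahead moving the same way, while -40 m/s implies an oncoming vehicle.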
- the edge detection unit 23 of the driving assistance ECU 20 extracts a feature point group extending from the own vehicle side to the far side along the traveling direction of the own vehicle from the feature point group of the image acquired from the image recognition unit 21.
- the feature point group is defined as an end line indicating the end of the road on which the vehicle is traveling. This end line is obtained based on, for example, a travel lane line drawn to divide the roadway and the sidewalk, a curb provided at the end of the road, a guardrail, and the like.
- the correction unit 24 corrects the end line of the travel range acquired from the end detection unit 23 based on the position of the moving body detected by the object detection unit 22. The processing executed by the correction unit 24 will be described later.
- The support control unit 25 transmits a control command to at least one of the notification device 41 and the steering control device 42 when the own vehicle is likely to depart from the travel range or has departed from the travel range. Specifically, it determines, based on the center of the image acquired from the image recognition unit 21, whether the own vehicle straddles the end line, and determines, based on the relative angle of the end line with respect to the center line of the image, whether the own vehicle may straddle the end line if it continues straight ahead.
- In making these determinations, the support control unit 25 acquires a control signal from the turn signal 31. If the control signal is acquired from the turn signal 31, no control command is transmitted to the notification device 41 or the steering control device 42, even if the own vehicle is likely to depart, or has departed, from the travel range. This is to respect the driver's intention to change lanes.
- The notification device 41 is a speaker, display, or buzzer installed in the passenger compartment of the vehicle. On receiving a control command from the driving support ECU 20, the notification device 41 outputs an alarm indicating that the vehicle has departed, or may depart, from the road.
- the steering control device 42 is a device that performs steering control of the host vehicle based on the control command transmitted from the driving support ECU 20 so that the host vehicle keeps traveling on the track.
- the steering control device 42 receives a control command from the support control unit 25, the steering control device 42 performs steering control so that the vehicle moves away from the end of the travel range and moves to the vicinity of the center of the travel range.
- Alternatively, the steering control device 42 may apply a torque to the steering of the vehicle that is small enough not to change the traveling direction, thereby informing the driver that the vehicle may depart from the road.
- lateral direction: the direction orthogonal to the traveling direction of the host vehicle 50
- lateral distance L: the distance in the lateral direction from the lateral end of the host vehicle 50
- When the support control unit 25 detects that the host vehicle 50 is approaching an end of the travel range 71, that is, the left end 61 or the right end 62 of the road, it transmits a control command to the notification device 41 and the steering control device 42.
- the other vehicle is the preceding vehicle 51 and is traveling in the same direction as the host vehicle 50, that is, toward the upper side of the drawing.
- the preceding vehicle 51 is traveling closer to the left end 61 of the road than the own vehicle 50 in the lateral direction and spaced apart from the own vehicle 50 in the lateral direction. That is, it is assumed that the preceding vehicle 51 exists between the own vehicle 50 and the left end 61 of the road in the lateral direction.
- The correction unit 24 obtains the lateral distance L between the end of the preceding vehicle 51 on the own-vehicle side and the end of the own vehicle 50 on the preceding-vehicle side. The correction unit 24 then translates the left end 61 of the road, which is the end line detected by the end detection unit 23, to the right, and sets a correction line 63 at a position separated to the left of the host vehicle 50 by the lateral distance L.
- the relative angle between the traveling direction of the host vehicle 50 and the correction line 63 is equal to the relative angle between the traveling direction of the host vehicle 50 and the left end 61. Since the correction unit 24 sets the correction line 63 in this way, the travel range 72 of the host vehicle 50 is a range between the right end 62 of the road and the correction line 63.
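The parallel translation of the end line by the lateral distance L can be sketched as follows. The coordinate convention (lateral x growing to the left, end line sampled as lateral positions with index 0 at the host vehicle) and the function name are assumptions for illustration.

```python
def set_correction_line(left_end_x: list,
                        ego_left_edge_x: float,
                        lead_right_edge_x: float) -> list:
    """Translate the detected left end line (sampled lateral positions)
    so that it lies the lateral distance L from the host vehicle.
    A uniform shift preserves the line's shape and relative angle."""
    L = lead_right_edge_x - ego_left_edge_x      # lateral distance L
    shift = (ego_left_edge_x + L) - left_end_x[0]
    return [x + shift for x in left_end_x]
```

On a straight road with the detected left end at x = 5, a host left edge at x = 1, and the preceding vehicle's near edge at x = 3, the whole line shifts to x = 3, i.e. the travel range is narrowed to the lead vehicle's near edge.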
- FIG. 3 shows an example in which the road is a straight line and the travel range 71 defined by the left end 61 and the right end 62 of the road is a straight line.
- the road may be a curve. The correction of the end line when the road is a curve will be described with reference to FIG.
- The position of the preceding vehicle 51 during a predetermined past period, for example its position in the traveling direction, is temporarily stored in a memory provided in the driving support ECU 20 until the host vehicle 50 reaches that past position of the preceding vehicle 51.
- the correction unit 24 reads the lateral position of the preceding vehicle 51 at the time when the position in the traveling direction of the preceding vehicle 51 is equal to the position in the traveling direction of the host vehicle 50.
- the correcting unit 24 obtains a lateral distance L that is a difference between the read lateral position, that is, the past lateral position of the preceding vehicle 51 and the current lateral position of the host vehicle 50.
- The correction unit 24 then translates the left end 61 of the road, which is the end line detected by the end detection unit 23, to the right, and sets the correction line 63 at a position separated to the left of the host vehicle 50 by the lateral distance L.
- Alternatively, the course of the own vehicle 50 may be predicted based on the detected left end 61 or the position of the preceding vehicle 51, and the lateral distance L between the preceding-vehicle-side end of the own vehicle 50 and the preceding vehicle 51 may be calculated based on the predicted course.
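The curve-handling idea, buffering the preceding vehicle's positions and reading out its lateral position once the host reaches the same longitudinal position, might be sketched like this. The class name, tolerance, and buffer layout are illustrative assumptions.

```python
from collections import deque

class LeadTrack:
    """Buffer of (longitudinal, lateral) positions of the preceding
    vehicle. When the host reaches a stored longitudinal position, the
    lateral gap at that position gives the lateral distance L."""
    def __init__(self):
        self.buf = deque()

    def store(self, y_lead: float, x_lead: float) -> None:
        """Record the lead vehicle's position each control cycle."""
        self.buf.append((y_lead, x_lead))

    def lateral_distance(self, y_ego: float, x_ego: float,
                         tol: float = 0.5):
        """Lateral distance L between the host's current lateral position
        and the lead's lateral position recorded near the host's current
        longitudinal position; None if no matching sample is stored."""
        # discard samples the host has already passed
        while len(self.buf) > 1 and self.buf[1][0] <= y_ego + tol:
            self.buf.popleft()
        if not self.buf or abs(self.buf[0][0] - y_ego) > tol:
            return None
        return abs(self.buf[0][1] - x_ego)
```

Comparing positions at the same longitudinal station, rather than at the same instant, is what keeps the lateral distance meaningful on a curve.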
- In step S101, an end line is acquired, and in step S102, it is determined whether a moving body is detected. If an affirmative determination is made in step S102, that is, if a moving body is detected, the process proceeds to step S103.
- In step S103, the lateral distance L between the moving-body-side end of the own vehicle 50 and the own-vehicle-side end of the moving body is obtained. The process then proceeds to step S104, where, using the lateral distance L obtained in step S103, the position of the end line is changed to a position separated from the own vehicle 50 by the lateral distance L, which becomes the correction line 63. The series of processing then ends.
- If a negative determination is made in step S102, that is, if no moving body is detected, the process proceeds to step S105, where it is determined whether the end line was corrected in the previous control cycle. If an affirmative determination is made in step S105, the process proceeds to step S106, and the correction of the end line is terminated; the travel range is thereafter defined by the end line instead of the correction line 63. The series of processing then ends.
- If a negative determination is made in step S105, that is, if the end line was not corrected in the previous cycle, the series of processing ends, and the travel range continues to be defined by the end line instead of the correction line 63.
- Since the end line is corrected when a moving body is detected, step S105 may instead determine whether a moving body was detected in the previous cycle, rather than whether the end line was corrected.
- The above assumes that an end line is acquired in step S101; if no end line can be acquired, the processing for determining whether the vehicle has departed from the travel range may simply not be performed.
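One control cycle of the flow described above (S101 through S106) can be sketched as follows. The `state` dictionary, its field names, and the list-of-lateral-positions representation of the end line are illustrative assumptions, not the patent's data structures.

```python
def update_travel_range(state: dict, end_line, moving_body):
    """One simplified control cycle:
    S101 acquire end line; S102 check for a moving body;
    S103/S104 compute L and set the correction line;
    S105/S106 end the correction once the moving body is gone."""
    if end_line is None:                      # S101 failed: skip this cycle
        return state.get("boundary")
    if moving_body is not None:               # S102: moving body detected
        L = moving_body["edge_x"] - state["ego_edge_x"]   # S103
        shift = (state["ego_edge_x"] + L) - end_line[0]   # S104
        boundary = [x + shift for x in end_line]          # correction line 63
        state["corrected"] = True
    else:
        if state.get("corrected"):            # S105: corrected last cycle?
            state["corrected"] = False        # S106: end the correction
        boundary = end_line                   # travel range uses the end line
    state["boundary"] = boundary
    return boundary
```

Called once per period, the function switches the travel-range boundary between the detected end line and the correction line exactly as the flowchart describes.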
- The driving support ECU 20 according to the present embodiment has the following effects.
- In the lateral direction, which is orthogonal to the traveling direction of the host vehicle 50, the region on the host-vehicle side of the position where the preceding vehicle 51 exists is a range in which the preceding vehicle 51 can be overtaken. That is, the host-vehicle side of the preceding vehicle 51's position lies within the range in which the host vehicle 50 can travel. When the left end 61, which is one end of the road on which the host vehicle 50 travels, is detected, the correction unit 24 corrects the position of the left end 61 to a position between the host vehicle 50 and the preceding vehicle 51, namely the correction line 63. Thereby, even if a positional shift has occurred between the detected left end 61 and the actual left end 61, one end of the travel range 71 in which the vehicle can travel can be appropriately defined.
- When the correction unit 24 corrects the left end 61 to set the correction line 63, the accuracy of the control for suppressing departure from the travel range is lowered unless the shape of the correction line 63 matches the actual shape of the left end 61. In this respect, because the left end 61 is translated in parallel to a position between the own vehicle 50 and the preceding vehicle 51 to form the correction line 63, the shape of the correction line 63 follows the actual road shape. Thereby, departure-suppression control can be performed appropriately, particularly in curved sections of the road.
- Second Embodiment: The first embodiment dealt with the case where the end of the own vehicle 50 and the end of the preceding vehicle 51 are spaced apart in the lateral direction. However, the lateral positions of the own vehicle 50 and the preceding vehicle 51 may partially overlap. In this embodiment, processing for the case where the lateral positions of the own vehicle 50 and the preceding vehicle 51 partially overlap is added.
- the first correction line 63 is equivalent to the correction line 63 in the first embodiment.
- A straight line or curve passing through the end of the preceding vehicle 51 opposite to the own vehicle 50, that is, its end on the left end 61 side, is defined as a second correction line 64.
- a follow-up range 73 that is an area between the first correction line 63 and the second correction line 64 is set.
- the driving assistance ECU performs control to keep the distance from the preceding vehicle 51 constant.
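The two correction lines and the follow-up range between them might be computed as in the sketch below; the coordinate convention (lateral x growing to the left) and the function name are assumptions for illustration.

```python
def follow_range(ego_edge_x: float,
                 lead_near_edge_x: float,
                 lead_far_edge_x: float) -> tuple:
    """First correction line 63 lies on the host side of the preceding
    vehicle (host edge shifted by the lateral distance L); second
    correction line 64 lies on its far, left-end side. The band between
    them is the follow-up range 73."""
    L = lead_near_edge_x - ego_edge_x   # lateral distance L
    line63 = ego_edge_x + L             # first correction line 63
    line64 = lead_far_edge_x            # second correction line 64
    return (min(line63, line64), max(line63, line64))
```

With the host's edge at x = 1 and the preceding vehicle spanning x = 3 to x = 5, the follow-up range is the band from 3 to 5; anything to the right of line 63 remains available for overtaking.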
- The driving support ECU 20 according to the present embodiment has the following effects in addition to those of the driving support ECU 20 according to the first embodiment.
- the first correction line 63 provided on the own vehicle 50 side of the preceding vehicle 51 and the second correction line 64 provided on the left end 61 side of the preceding vehicle 51 are set.
- the range between the first correction line 63 and the second correction line 64 is a range in which the vehicle should travel following the preceding vehicle 51.
- the right end 62 side of the first correction line 63 is a range where the vehicle may travel so as to pass the preceding vehicle 51. Therefore, by setting the first correction line 63 and the second correction line 64 as in the present embodiment, it is possible to appropriately define the range in which the host vehicle 50 can travel.
- In this embodiment, part of the control is added to the control executed by the driving support ECU 20 according to the first embodiment. Specifically, the correction-line setting process when the moving body is a person is partially changed.
- the end line correction processing executed by the driving support ECU 20 according to the present embodiment will be described with reference to FIG.
- the correction unit 24 obtains a first distance Lx1 that is a lateral distance from the pedestrian 52.
- The first distance Lx1 corresponds to the lateral distance L in the first embodiment and is obtained from the difference in lateral position between the own vehicle 50 and the pedestrian 52. Since the lateral width of the pedestrian 52 is smaller than that of a vehicle, the lateral position of the pedestrian 52 may be set at the center of the pedestrian 52 or, as in the first embodiment, at the pedestrian's end on the own-vehicle side.
- the correction unit 24 obtains a longitudinal distance Ly that is a distance between the own vehicle 50 and the pedestrian 52 in the traveling direction of the own vehicle 50.
- the longitudinal distance Ly is obtained as a difference between the end of the own vehicle 50 on the traveling direction side and the position of the pedestrian 52.
- The correction unit 24 sets the correction line 65 using the first distance Lx1 and the longitudinal distance Ly obtained as described above.
- The correction line 65 is obtained by translating the left end 61 of the road to a position separated by the first distance Lx1 from the end of the own vehicle 50 on the pedestrian 52 side, and by deforming it within a predetermined range 65a near the position where the pedestrian 52 exists.
- Within the predetermined range 65a, the correction line 65 protrudes toward the own vehicle 50 side. More specifically, over the range extending a distance b in the longitudinal direction on either side of the position separated from the own vehicle 50 by the longitudinal distance Ly, the correction line 65 is set at a position separated from the own vehicle 50 by the second distance Lx2.
- the second distance Lx2 is set to a value smaller than the first distance Lx1 by a predetermined value (for example, 1 m).
- Within the predetermined range 65a, the lateral distance from the own vehicle 50 is held at the second distance Lx2 up to the distance b from the longitudinal center, and increases gradually between the distances b and a from the center.
- Beyond the distance a from the center, the lateral distance is maintained at the first distance Lx1.
- Because the correction line 65 protrudes toward the host vehicle 50 within the predetermined range 65a in this way, the travel range 74 of the host vehicle 50, defined by the right end 62 of the road and the correction line 65, becomes narrower in the vicinity of the pedestrian 52.
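The profile described above — held at Lx2 within the distance b of the pedestrian's longitudinal position, returning to Lx1 beyond the distance a — can be sketched as a piecewise function of longitudinal position. This Python sketch assumes a linear ramp between b and a (the patent does not specify the exact transition shape), and all names are hypothetical:

```python
def correction_line_offset(y, Ly, Lx1, Lx2, a, b):
    """Lateral offset of correction line 65 from the own vehicle at
    longitudinal position y (sketch; linear ramp assumed).

    Ly: longitudinal distance to the pedestrian 52.
    Lx1: first distance; Lx2: second distance (Lx2 < Lx1, e.g. Lx1 - 1 m).
    a > b: longitudinal extents of the ramp and the flat region.
    """
    d = abs(y - Ly)              # longitudinal distance from the pedestrian
    if d <= b:                   # within +/-b: held at the second distance
        return Lx2
    if d <= a:                   # between b and a: ramp from Lx2 back to Lx1
        return Lx2 + (Lx1 - Lx2) * (d - b) / (a - b)
    return Lx1                   # beyond a: the first distance again
```

For example, with Lx1 = 3.0 m, Lx2 = 2.0 m, Ly = 20 m, b = 2 m, and a = 6 m, the offset is 2.0 m at the pedestrian's position, 2.5 m at 4 m before or after it, and 3.0 m elsewhere.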
- In the present embodiment, the second distance Lx2 and the longitudinal length of the predetermined range 65a are set to predetermined values, but they may instead be changed according to the first distance Lx1 and the longitudinal distance Ly, or according to the speed of the own vehicle 50.
- In the present embodiment, the predetermined range 65a is longitudinally symmetric about the position of the pedestrian 52, but it may have a different, asymmetric shape.
- Beyond the position of the pedestrian 52, the lateral distance between the correction line 65 and the host vehicle 50 may be returned to the first distance Lx1. This is because once the own vehicle 50 has moved beyond the position of the pedestrian 52, the risk of contact between the own vehicle 50 and the pedestrian 52 is reduced.
- As in the second embodiment, the lateral positions of the own vehicle 50 and the pedestrian 52 may overlap.
- In that case, too, the first distance Lx1 is obtained.
- The first distance Lx1 then takes a negative value.
- The correction line 65 is set based on that value of Lx1, so that the host vehicle 50 is positioned on the correction line 65. Therefore, if the correction line 65 is set while the lateral positions of the host vehicle 50 and the pedestrian 52 overlap, the operating conditions of the notification device 41 and the steering control device 42 are satisfied as soon as the line is set, and the notification device 41 and the steering control device 42 are operated.
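The overlap case reduces to the sign of Lx1. This hedged Python illustration assumes a one-dimensional lateral coordinate, positive toward the pedestrian side; the function names and the reduction of the device-activation conditions to a sign test are simplifications for illustration:

```python
def first_distance(ego_edge_x, ped_x):
    """First distance Lx1: lateral gap between the pedestrian-side edge of
    the own vehicle 50 and the pedestrian 52 (sketch; coordinate positive
    toward the pedestrian side). Negative when the lateral positions overlap.
    """
    return ped_x - ego_edge_x

def should_operate_devices(Lx1):
    """A negative Lx1 places the correction line 65 at or inside the own
    vehicle's lateral position, so the operating conditions of the
    notification device 41 and steering control device 42 are met at once.
    """
    return Lx1 < 0
```

For example, a pedestrian at 1.0 m with the vehicle edge at 1.5 m gives Lx1 = -0.5 m (overlap, devices operate), while a pedestrian at 2.0 m with the edge at 1.0 m gives Lx1 = 1.0 m (no immediate activation).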
- The driving support ECU according to the present embodiment has the following effects in addition to the effects exhibited by the driving support ECU according to the first embodiment.
- Within the predetermined range 65a, the correction line 65 protrudes toward the own vehicle 50 side.
- As a result, in the vicinity of the pedestrian 52, only positions further separated from the pedestrian 52 are included in the travel range of the host vehicle 50.
- In the present embodiment, the oncoming vehicle 53 is employed as the moving body in the correction line setting process performed by the correction unit 24. The processing executed by the driving support ECU 20 in the present embodiment will be described with reference to FIG. 8, which shows an example of the processing in a left-hand-traffic country.
- the end detection unit 23 detects the center line 67 as an end line based on the feature points in the image acquired from the image recognition unit 21.
- This center line 67 can also be called the right end of the lane in which the host vehicle 50 travels.
- The correction unit 24 corrects the center line 67 based on the position of the end of the oncoming vehicle 53 on the own vehicle 50 side. Specifically, the lateral distance L between the end of the own vehicle 50 on the oncoming vehicle 53 side and the end of the oncoming vehicle 53 on the own vehicle 50 side is obtained, and the position of the center line 67 is corrected to the position separated by the lateral distance L from the end of the own vehicle 50 on the oncoming vehicle 53 side, yielding the correction line 68.
- The end detection unit 23 detects the left end 61 and the right end 62 of the road as end lines, as in the first embodiment. At this time, if the object detection unit 22 detects the oncoming vehicle 53, the correction unit 24 obtains the lateral distance L between the end of the own vehicle 50 on the oncoming vehicle 53 side, that is, on the side of the right end 62 serving as the end line, and the end of the oncoming vehicle 53 on the own vehicle 50 side. Using the obtained lateral distance L, the correction unit 24 then corrects the position of the right end 62 of the road to the position separated from the own vehicle 50 by the lateral distance L, obtaining the correction line 68.
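In both variants the correction reduces to the same computation: measure the lateral gap L between the facing edges of the two vehicles and place the corrected end line at that distance from the own vehicle. A minimal Python sketch, with hypothetical coordinates increasing toward the oncoming lane:

```python
def corrected_right_end(ego_right_x, oncoming_left_x):
    """Correction line 68 (sketch): the right end of the travel range is
    moved to the position separated from the own vehicle 50's
    oncoming-vehicle-side edge by the lateral distance L, where L is the
    gap to the near edge of the oncoming vehicle 53.
    """
    L = oncoming_left_x - ego_right_x   # lateral distance L
    return ego_right_x + L              # lies between the two vehicles
```

Because the result depends only on the measured positions of the two vehicles, a deviation in the detected center line 67 or right end 62 does not propagate into the correction line 68.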
- The control performed by the support control unit 25 using the correction line 68 set as in FIGS. 8 and 9 to suppress deviation from the travel range is the same as in the first embodiment, and its description is therefore omitted.
- the driving support ECU 20 in the present embodiment has the following effects in addition to the effects exhibited by the driving support ECU 20 according to the first embodiment.
- In the lateral direction, which is the direction orthogonal to the traveling direction of the own vehicle 50, the region beyond the position where the oncoming vehicle 53 exists is the range through which the oncoming vehicle 53 passes.
- In other words, the own vehicle 50 side of the position where the oncoming vehicle 53 exists can be regarded as the range in which the own vehicle 50 can travel.
- Therefore, the correction unit 24 sets the correction line 68 between the host vehicle 50 and the oncoming vehicle 53. As a result, even if the detected center line 67 or right end 62 deviates from the actual position, one end of the travel range 75 in which the vehicle can travel is defined properly.
- In the above embodiments, when the moving body is no longer detected, the end of the travel range is switched immediately from the correction line back to the end line.
- Instead, the correction line may be changed gradually into the end line.
- Alternatively, when detection of the moving body lapses only temporarily, for example for one to several control cycles, the correction line may be kept, and the road edge may be switched from the correction line to the end line only after the moving body has remained undetected for a predetermined period, for example several control cycles.
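The switching strategy with a hold-off for temporary detection dropouts could be sketched as a small selection function. In this Python sketch, `hold_cycles` is an assumed tuning value, not a figure from the patent:

```python
def select_edge(end_line, correction_line, missed_cycles, hold_cycles=3):
    """Choose the edge of the travel range (sketch).

    missed_cycles: consecutive control cycles without detecting the
    moving body. While the dropout is brief, the correction line is
    retained; after hold_cycles consecutive misses, the edge reverts to
    the detected end line.
    """
    if missed_cycles < hold_cycles:
        return correction_line   # brief dropout: keep the correction line
    return end_line              # persistent loss: revert to the end line
```

This avoids the travel range snapping outward on a single missed detection, at the cost of keeping a slightly conservative range for a few extra cycles.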
- In the above embodiments, the lateral distance L between the end of the own vehicle 50 on the road end side and the end of the moving body on the own vehicle side is obtained, and the correction line is set at the position separated by the lateral distance L from the end of the own vehicle 50 on the road end side.
- The correction line may, however, be set in any other manner as long as it is positioned between the own vehicle 50 and the moving body.
- In the above embodiments, the end line indicating the road edge is acquired based on the image captured by the imaging device 11.
- Alternatively, the radar device 12 may detect a shoulder step at the road edge, a guardrail provided at the road edge, and the like, and the end line may be acquired based on the positions of the shoulder step, the guardrail, and the like.
- Only one of the first correction line 63 and the second correction line 64 may be obtained.
- In that case, the same processing as in the first embodiment is otherwise performed.
- The travel range whose left end is defined by the second correction line 64 is a range in which the preceding vehicle 51 has already traveled. Therefore, even if the travel range is defined by the second correction line 64 and the right end 62, the host vehicle 50 can be said to be able to travel within that travel range.
- the driver is assisted using the correction line, but the present invention can also be applied to a system in which part or all of the driving operation is automatically performed by the ECU.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/323,870 US20190176887A1 (en) | 2016-08-11 | 2017-07-26 | Recognition device and recognition method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2016-158303 | 2016-08-11 | ||
| JP2016158303A JP2018026023A (ja) | 2016-08-11 | Recognition device and recognition method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018030159A1 (ja) | 2018-02-15 |
Family
ID=61162917
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2017/027125 Ceased WO2018030159A1 (ja) | Recognition device and recognition method | 2016-08-11 | 2017-07-26 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20190176887A1 (en) |
| JP (1) | JP2018026023A (ja) |
| WO (1) | WO2018030159A1 (ja) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10926760B2 (en) * | 2018-03-20 | 2021-02-23 | Kabushiki Kaisha Toshiba | Information processing device, information processing method, and computer program product |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7150245B2 (ja) * | 2018-06-01 | 2022-10-11 | Mazda Motor Corporation | Vehicle alarm system |
| JP7150246B2 (ja) * | 2018-06-01 | 2022-10-11 | Mazda Motor Corporation | Vehicle alarm system |
| JP7150247B2 (ja) * | 2018-06-01 | 2022-10-11 | Mazda Motor Corporation | Vehicle alarm system |
| CN113688653B (zh) * | 2020-05-18 | 2024-06-28 | Fujitsu Limited | Road centerline recognition device and method, and electronic device |
| KR20210150922A (ko) * | 2020-06-04 | 2021-12-13 | Hyundai Mobis Co., Ltd. | Vehicle travel control system and method |
| CN114694116A (zh) * | 2022-03-24 | 2022-07-01 | SenseTime Group Limited | Road boundary detection method and device, electronic device, and storage medium |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2004078333A (ja) * | 2002-08-12 | 2004-03-11 | Nissan Motor Co Ltd | Travel route generation device |
| JP2004149035A (ja) * | 2002-10-31 | 2004-05-27 | Honda Motor Co Ltd | Follow-up travel control device for vehicle |
| JP2011134071A (ja) * | 2009-12-24 | 2011-07-07 | Denso Corp | Virtual white line setting method, virtual white line setting device, and course change support device using the same |
2016
- 2016-08-11 JP JP2016158303A patent/JP2018026023A/ja active Pending

2017
- 2017-07-26 US US16/323,870 patent/US20190176887A1/en not_active Abandoned
- 2017-07-26 WO PCT/JP2017/027125 patent/WO2018030159A1/ja not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| US20190176887A1 (en) | 2019-06-13 |
| JP2018026023A (ja) | 2018-02-15 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17839232; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17839232; Country of ref document: EP; Kind code of ref document: A1 |