WO2019003603A1 - Collision Estimation Device and Collision Estimation Method - Google Patents
Collision Estimation Device and Collision Estimation Method
- Publication number
- WO2019003603A1 (PCT/JP2018/016260)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- moving object
- vehicle
- collision
- trajectory
- traveling direction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q9/00—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling
- B60Q9/008—Arrangement or adaptation of signal devices not provided for in one of main groups B60Q1/00 - B60Q7/00, e.g. haptic signalling for anti-collision purposes
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/06—Direction of travel
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/14—Yaw
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/18—Steering angle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/05—Type of road, e.g. motorways, local streets, paved or unpaved roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
- B60W2554/40—Dynamic objects, e.g. animals, windblown objects
- B60W2554/404—Characteristics
- B60W2554/4048—Field of view, e.g. obstructed view or direction of gaze
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2555/00—Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
- B60W2555/60—Traffic rules, e.g. speed limits or right of way
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/09—Taking automatic action to avoid collision, e.g. braking and steering
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
Definitions
- the present disclosure relates to the estimation of a collision between a vehicle and a moving object different from the vehicle.
- Conventionally, various methods have been proposed that estimate the movement trajectory of a moving object and, from the estimated trajectories of the host vehicle and the moving object, the possibility of a collision between them.
- In general, the movement trajectory of a moving object can be estimated more accurately by measuring its position over a longer time using a camera or millimeter-wave radar, which improves the accuracy of the collision-possibility estimate.
- On the other hand, the longer the measurement takes, the more a response operation such as an alarm output may be delayed.
- Japanese Patent No. 5729416 (Patent Document 1)
- Such a problem is not limited to the case where, as in Patent Document 1, a predetermined region is used to decide whether to relax the reference condition used for collision estimation (collision determination); it may also occur when the region is used to decide whether to perform collision estimation at all. That is, the problem can arise even in a configuration in which collision estimation is performed when a moving object is found inside a predetermined region and is not performed when the moving object is found outside it. There is therefore a need for a technology that can perform collision estimation with high accuracy even when the traveling direction of the vehicle changes.
- the present disclosure has been made to solve at least a part of the problems described above, and can be implemented as the following modes.
- According to one mode of the present disclosure, a collision estimation device is provided that is mounted on a vehicle and estimates a collision between the vehicle and a moving object different from the vehicle.
- The collision estimation device comprises: a moving object trajectory estimation unit that estimates the trajectory of the moving object based on information obtained in time series from a first sensor used to recognize the moving object; a shield identification unit that identifies the position and size of a shield present on the traveling-direction side of the vehicle; a direction change information acquisition unit that acquires direction change information indicating a change in the traveling direction; a moving object extraction area setting unit that sets a moving object extraction area based on the identified position and size of the shield and the acquired direction change information; and a collision estimation unit that, when the moving object is recognized in the moving object extraction area based on the information obtained from the first sensor, estimates the presence or absence of a collision between the vehicle and the moving object using the estimated trajectory of the vehicle, the trajectory of the moving object, and the acquired direction change information. The moving object extraction area setting unit sets, within the area near the shield, the region near the outer peripheral surface of the shield facing the trajectory of the vehicle after the change in traveling direction.
- With the collision estimation device of this mode, the region near the outer peripheral surface of the shield that faces the trajectory of the vehicle after the change in traveling direction indicated by the direction change information is set as the moving object extraction area, so collision estimation can be performed accurately even when the traveling direction of the vehicle changes.
- The present disclosure can also be implemented in various forms other than the collision estimation device.
- For example, it can be realized as a collision estimation method, a computer program implementing such a method, a storage medium storing such a program, a vehicle equipped with the collision estimation device, and the like.
- FIG. 1 is a block diagram showing the configuration of a collision estimation apparatus according to an embodiment of the present disclosure
- FIG. 2 is a flowchart showing the procedure of the collision estimation process
- FIG. 3 is an explanatory view showing an example of the collision estimation process
- FIG. 4 is a flowchart showing the procedure of moving object extraction area setting processing
- FIG. 5 is an explanatory view showing a setting example of a moving object extraction area
- FIG. 6 is an explanatory view showing an example of the collision estimation process
- FIG. 7 is an explanatory view showing a setting example of a moving object extraction area
- FIG. 8 is an explanatory view showing an example of the trajectory of the host vehicle when the traveling direction changes in the second embodiment
- FIG. 9 is an explanatory view showing a setting example of a moving object extraction area
- FIG. 10 is an explanatory view showing a setting example of the moving object extraction area.
- A "moving object different from the vehicle" means a movable object or creature such as a pedestrian, a bicycle, a motorcycle, or an unmanned vehicle.
- In the following, the vehicle on which the collision estimation device 10 is mounted is referred to as the "host vehicle".
- the collision estimation device 10 is configured by an ECU (Electronic Control Unit) equipped with a microcomputer and a memory.
- The collision estimation device 10 is electrically connected to various devices mounted on the vehicle and exchanges data with them. Specifically, as shown in FIG. 1, it is connected to the millimeter wave radar 21, the imaging device 22, the yaw rate sensor 23, the steering angle sensor 24, the vehicle speed sensor 25, the brake ECU 201, and the alarm ECU 202 so as to exchange data with each.
- The millimeter wave radar 21 uses radio waves in the millimeter wave band to detect, on the traveling-direction side of the vehicle (forward when the vehicle is moving forward), the presence or absence of an object, the distance from the object to the vehicle, the position of the object, the size of the object, the shape of the object, and the relative velocity of the object with respect to the vehicle.
- the “object” detected by the millimeter wave radar 21 is more accurately a set of a plurality of detection points (targets).
- When the ignition of the host vehicle is turned on, the millimeter wave radar 21 repeatedly emits millimeter-wave radio waves, receives their reflections, and detects objects (targets).
- The imaging device 22 consists of an imaging camera provided with a light-collecting lens and a light receiving element, and captures images of the traveling-direction side of the host vehicle.
- When the ignition of the host vehicle is turned on, the imaging device 22 repeatedly acquires captured images (frame images); for example, 30 frames are acquired per second.
- the frame rate at the time of imaging is not limited to 30 frames per second, and may be an arbitrary value.
- the yaw rate sensor 23 detects the yaw rate (rotational angular velocity) of the host vehicle.
- the yaw rate sensor 23 repeatedly executes detection of the yaw rate when the ignition of the host vehicle is turned on.
- the steering angle sensor 24 detects the steering wheel steering angle of the host vehicle.
- the steering angle sensor 24 repeatedly executes the detection of the steering angle when the ignition of the host vehicle is turned on.
- the vehicle speed sensor 25 detects the speed of the host vehicle.
- the vehicle speed sensor 25 repeatedly executes the detection of the speed of the host vehicle when the ignition of the host vehicle is turned on.
- the brake ECU 201 is an ECU for brake control, and is electrically connected to the collision estimation device 10 and the brake mechanism 211.
- The brake ECU 201 determines the timing and amount of braking, and controls the brake mechanism 211.
- the brake mechanism 211 is composed of a sensor, a motor, a valve, a pump, various actuators, and the like involved in the brake control.
- the alarm ECU 202 is an ECU for alarm output, and is electrically connected to the collision estimation device 10 and the alarm mechanism 212.
- The alarm ECU 202 determines the timing and content of alarm output and controls the alarm mechanism 212. In the present embodiment, the alarm is output as a sound alerting the driver to a collision with a moving object.
- the alarm mechanism 212 is a device related to audio output such as a speaker and an amplifier.
- When a collision is predicted, the brake ECU 201 and the alarm ECU 202 control the brake mechanism 211 and the alarm mechanism 212, respectively, to execute the response operations for collision avoidance; specifically, the automatic brake is applied and an alarm is issued.
- The collision estimation device 10 includes a vehicle trajectory estimation unit 11, a moving object trajectory estimation unit 12, a shield identification unit 13, a moving object extraction area setting unit 14, a direction change information acquisition unit 15, and a collision estimation unit 16. These functional units 11 to 16 are realized by the microcomputer of the collision estimation device 10 executing a control program stored in advance in the device.
- The vehicle trajectory estimation unit 11 estimates the trajectory of the host vehicle based on values obtained periodically in time series from the yaw rate sensor 23, the steering angle sensor 24, and the vehicle speed sensor 25. Specifically, the periodically obtained yaw rate, steering angle, and vehicle speed are stored as a history, and the planned passing positions and planned passing times of the vehicle are estimated as its trajectory based on the history over a predetermined period.
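- As a rough illustration of this kind of history-based estimation, the sketch below integrates the latest yaw rate and speed forward in time to produce planned passing positions and times. The function name, the fixed time step, and the constant-turn model are illustrative assumptions; the patent does not prescribe an integration scheme.

```python
import math

def estimate_vehicle_trajectory(yaw_rates, speeds, dt=0.1, horizon_s=3.0):
    """Predict planned passing positions/times of the host vehicle.

    yaw_rates: yaw-rate history in rad/s (newest last).
    speeds: vehicle-speed history in m/s (newest last).
    Returns (t, x, y) points in a vehicle-fixed frame whose origin is the
    current pose, with +x along the current traveling direction.
    """
    yaw_rate = yaw_rates[-1]          # constant-turn, constant-speed model
    speed = speeds[-1]
    x = y = heading = 0.0
    trajectory = []
    steps = int(horizon_s / dt) + 1
    for i in range(steps):
        trajectory.append((i * dt, x, y))
        heading += yaw_rate * dt      # dead-reckoning update
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return trajectory
```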
- The moving object trajectory estimation unit 12 estimates the trajectory of a moving object based on values obtained periodically in time series from the millimeter wave radar 21 and values (frame image data) obtained periodically from the imaging device 22. Specifically, by combining the position and distance of each target obtained from the millimeter wave radar 21 with the image data obtained from the imaging device 22, it estimates the type, position, size, moving direction, and moving speed of the moving object, and then estimates the planned passing positions and planned passing times of the moving object as its trajectory.
- the type of moving object may be estimated, for example, by pattern matching based on the shape in the frame image.
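- A minimal sketch of this fusion-and-prediction step, assuming a constant-velocity model and hypothetical input shapes (the patent specifies neither), might look like:

```python
def estimate_moving_object_trajectory(radar_track, object_type,
                                      dt=0.1, horizon_s=3.0):
    """radar_track: time-ordered (t, x, y) target positions from the radar.
    object_type: type from camera pattern matching, e.g. "pedestrian".
    Returns the type together with (t, x, y) points extrapolated at the
    velocity estimated from the last two positions."""
    (t0, x0, y0), (t1, x1, y1) = radar_track[-2], radar_track[-1]
    vx = (x1 - x0) / (t1 - t0)        # estimated moving direction and speed
    vy = (y1 - y0) / (t1 - t0)
    steps = int(horizon_s / dt) + 1
    predicted = [(t1 + i * dt, x1 + vx * i * dt, y1 + vy * i * dt)
                 for i in range(steps)]
    return object_type, predicted
```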
- The shield identification unit 13 identifies the position and size of a shield located on the traveling-direction side of the host vehicle.
- A shield is an object that may hide moving objects from the millimeter wave radar 21 and the imaging device 22; it means a non-moving object such as a stopped or parked vehicle, a utility pole, or a signboard.
- "Not moving" here may include, in addition to the stopped state, moving at low speed (forward or backward). For example, a vehicle moving at less than 20 km/h in the same direction as the host vehicle is treated as not moving and thus corresponds to a shield. Alternatively, "not moving" may mean only the stopped state.
- The shield identification unit 13 identifies the position and size of the shield based on values obtained periodically in time series from the millimeter wave radar 21 and values (frame image data) obtained periodically from the imaging device 22.
- The moving object extraction area setting unit 14 sets a moving object extraction area based on the position and size of the shield identified by the shield identification unit 13 and the direction change information (described later) acquired by the direction change information acquisition unit 15.
- The moving object extraction area is an area treated under a special condition: when a moving object is recognized (extracted) within it, the collision estimation uses a trajectory of the moving object estimated from the values of the millimeter wave radar 21 and the imaging device 22 obtained over a shorter time than when the moving object is recognized in a region different from the area. Details of the moving object extraction area will be described later.
- the direction change information acquisition unit 15 acquires direction change information indicating a change in the traveling direction of the host vehicle.
- In the present embodiment, the direction change information means a change in the steering wheel steering angle obtained from the steering angle sensor 24: when the steering angle changes, the traveling direction of the host vehicle changes.
- The collision estimation unit 16 estimates the presence or absence of a collision between the host vehicle and the moving object using the trajectory of the host vehicle estimated by the vehicle trajectory estimation unit 11, the trajectory of the moving object estimated by the moving object trajectory estimation unit 12, and the direction change information (that is, the steering angle) obtained by the direction change information acquisition unit 15.
- By executing the collision estimation process described later, the collision estimation device 10 configured as above can perform collision estimation with high accuracy even when the traveling direction of the host vehicle changes.
- The millimeter wave radar 21 and the imaging device 22 described above correspond to a subordinate concept of the first sensor in the means for solving the problem. The steering angle sensor 24 corresponds to a subordinate concept of the second sensor, and the imaging device 22 corresponds to a subordinate concept of the imaging unit.
- Collision estimation process: The collision estimation process shown in FIG. 2 is executed in the collision estimation device 10 when the ignition of the host vehicle is turned on.
- the vehicle locus estimation unit 11 estimates the locus of the host vehicle (step S100).
- In the example of FIG. 3, the vehicle VL1, which is the host vehicle, has been traveling straight in the second lane Ln2 of a two-lane road and is about to enter the intersection CR.
- As the trajectory Tr0 of the vehicle VL1, a trajectory that goes straight along the second lane Ln2 is estimated.
- Next, the shield identification unit 13 identifies the position and size of any shield located on the traveling-direction side of the host vehicle (step S105).
- In the example of FIG. 3, the shield identification unit 13 identifies the vehicles VL2 and VL3 as shields and identifies their positions and sizes.
- A person m2 is walking across the first lane Ln1 in the vicinity of the vehicle VL2, and a person m3 is walking on the sidewalk in the vicinity of the vehicle VL3.
- Next, the moving object extraction area setting unit 14 and the direction change information acquisition unit 15 execute the moving object extraction area setting process (step S110).
- In the moving object extraction area setting process shown in FIG. 4, the direction change information acquisition unit 15 first acquires direction change information (step S205).
- The direction change information acquisition unit 15 then determines whether there is a change in the traveling direction of the host vehicle (step S210).
- If there is no change in the traveling direction, the moving object extraction area setting unit 14 sets the area near the side surface of the shield facing the trajectory of the host vehicle as a moving object extraction area (step S215).
- In the example of FIG. 3, a moving object extraction area Ar1 is set for the vehicle VL2, which is a shield, and a moving object extraction area Ar2 is set for the vehicle VL3, which is also a shield.
- The details of the moving object extraction area set in step S215, that is, the area set when there is no change in the traveling direction, will be described with reference to FIG. 5, which shows the moving object extraction area Ar2.
- The moving object extraction area Ar2 is set as a rectangular area in plan view: one side runs from the center C1 of the front end surface S1 of the vehicle VL3 (the shield), taken along the direction orthogonal to the traveling direction D1, to a first point P1 separated by a predetermined first distance L1 toward the side closer to the trajectory Tr0; another side runs from the center C1 along the traveling direction D1 to a second point P2 separated by a predetermined second distance L2 from the rear end E1 of the vehicle VL3.
- a moving object extraction area Ar1 shown in FIG. 3 is also set by the same method.
- the first distance L1 may be, for example, 1.5 m (meters).
- the second distance L2 may be, for example, 5.0 m.
- the values of the first distance L1 and the second distance L2 are not limited to these values, and may be set to arbitrary values.
- The rear end E1 means the end of the vehicle VL3 determined from the farthest target along the traveling direction D1 among the targets obtained by the millimeter wave radar 21; therefore, the actual rear end of the vehicle VL3 (its deepest point along the traveling direction D1) is not necessarily set as the rear end E1.
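- In code, Ar2 can be represented as an axis-aligned rectangle in a shield-aligned frame. The sketch below is one reading of the geometry above (in particular, that P2 lies the second distance L2 beyond the rear end E1); the coordinate convention is an assumption.

```python
def extraction_area_ar2(c1_x, c1_y, rear_end_x, l1=1.5, l2=5.0):
    """Rectangle Ar2 as (x_min, y_min, x_max, y_max).

    Frame: +x along the traveling direction D1, +y pointing from the
    shield toward the host vehicle's trajectory Tr0.
    c1_x, c1_y: center C1 of the shield's front end surface S1.
    rear_end_x: x coordinate of the shield's rear end E1.
    """
    return (c1_x,                # rectangle starts at the front end surface
            c1_y,                # ...at the height of the center C1
            rear_end_x + l2,     # second point P2: L2 beyond the rear end E1
            c1_y + l1)           # first point P1: L1 toward Tr0

def contains(area, point):
    """True if an (x, y) point lies inside a rectangular area."""
    x_min, y_min, x_max, y_max = area
    x, y = point
    return x_min <= x <= x_max and y_min <= y <= y_max
```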
- If it is determined that there is a change in the traveling direction (step S210: YES), the direction change information acquisition unit 15 determines whether the change indicated by the direction change information obtained in step S205 is a change toward a direction passing in front of the front end face of the shield (step S220).
- The direction change information acquisition unit 15 may make the determination in step S220 using the position and size of the shield identified in step S105, the position of the host vehicle (for example, the current position identified when estimating its trajectory in step S100), and the steering wheel steering angle obtained from the steering angle sensor 24.
- If it is determined that the change in the traveling direction is not a change toward a direction passing in front of the front end face of the shield (step S220: NO), step S215 described above is executed. If, on the other hand, it is determined that the change is toward a direction passing in front of the front end face of the shield (step S220: YES), the moving object extraction area setting unit 14 sets an area near the front end face of the shield facing the trajectory of the host vehicle as a moving object extraction area (step S225).
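- The patent names the inputs for the step S220 determination but not the computation itself. One plausible geometric test, assuming the post-turn path is approximated as a straight ray in the same shield-aligned frame as the sketches above, is:

```python
def passes_in_front_of_shield(veh_pos, new_heading, face_center, eps=1e-6):
    """Rough test for step S220: does the host vehicle's post-turn path
    cross the shield's lateral position on the near side of the front
    end face S1?

    veh_pos: (x, y) current position of the host vehicle.
    new_heading: (dx, dy) direction of travel after the steering change.
    face_center: (x, y) of the front-end-face center C1.
    """
    px, py = veh_pos
    dx, dy = new_heading
    cx, cy = face_center
    if abs(dy) < eps:
        return False            # no lateral motion: never reaches the shield
    t = (cy - py) / dy          # ray parameter where the path reaches C1's y
    if t <= 0:
        return False            # the shield's lateral line is behind the path
    crossing_x = px + t * dx
    return crossing_x < cx      # crosses on the near side of the front face
```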
- In the example of FIG. 6, the positional relationship of the vehicles VL1 to VL3 and of the persons m2 and m3 described later is the same as in the example of FIG. 3.
- The example of FIG. 6 differs from the example of FIG. 3 in that the vehicle VL1, having traveled straight in the second lane Ln2, has started steering to the right in order to turn right.
- In this case, the change ΔD in the traveling direction is acquired in step S205, and it is determined in step S220 that the change ΔD is a change toward a direction passing in front of the front end face S1 of the vehicle VL3, as shown in FIG. 6.
- Accordingly, step S225 is executed and, as shown in FIG. 6, a moving object extraction area Ar3 is set in the vicinity of the vehicle VL3.
- The moving object extraction area Ar3 differs in position and size from the moving object extraction area Ar2 shown in FIG. 3.
- The details of the moving object extraction area set in step S225, that is, the area set when there is a change in the traveling direction and the change is toward a direction passing in front of the front end face of the shield, will be described with reference to FIG. 7, which shows the moving object extraction area Ar3.
- The moving object extraction area Ar3 is set as a rectangular area in plan view defined by three points: a third point P3, separated by a predetermined third distance L3 on the near side along the traveling direction D1 from the center C1 of the front end surface S1 (taken along the direction orthogonal to D1); a fourth point P4, separated from the third point P3 by a predetermined fourth distance L4 in the orthogonal direction on the side away from the trajectory Tr0 of the vehicle; and a fifth point P5, separated from the fourth point P4 by a predetermined fifth distance L5 along the traveling direction D1.
- the third distance L3 may be, for example, 4.0 m (meters).
- the fourth distance L4 may be, for example, 5.0 m.
- the fifth distance L5 may be, for example, 9.0 m.
- the values of the third distance L3, the fourth distance L4, and the fifth distance L5 are not limited to these values, and may be set to arbitrary values.
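- Using the same shield-aligned frame as the Ar2 sketch, Ar3 can be written as follows; treating "the near side" as the negative-x side of the front end surface and "away from Tr0" as negative y is an assumption about the figure.

```python
def extraction_area_ar3(c1_x, c1_y, l3=4.0, l4=5.0, l5=9.0):
    """Rectangle Ar3 as (x_min, y_min, x_max, y_max); +x along D1,
    +y pointing from the shield toward the trajectory Tr0."""
    p3 = (c1_x - l3, c1_y)      # third point P3: L3 on the near side of S1
    p4 = (p3[0], p3[1] - l4)    # fourth point P4: L4 away from Tr0
    p5 = (p4[0] + l5, p4[1])    # fifth point P5: L5 along D1
    return (p3[0], p4[1], p5[0], p3[1])
```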
- After the moving object extraction area setting process (step S110), the moving object trajectory estimation unit 12 determines whether a moving object exists on the traveling-direction side (the forward side along the traveling direction D1) (step S115). If it is determined that there is no moving object on the traveling-direction side (step S115: NO), the process returns to step S100 described above.
- If it is determined that there is a moving object on the traveling-direction side (step S115: YES), the moving object trajectory estimation unit 12 determines whether the moving object exists in a moving object extraction area (step S120). If it is determined in step S115 that there are multiple moving objects, the processing from step S120 onward is executed for each moving object.
- If the moving object does not exist in a moving object extraction area (step S120: NO), the moving object trajectory estimation unit 12 estimates the trajectory of the moving object based on image data of the standard number of frames and the measurement results of the millimeter wave radar 21 obtained during the period in which that image data was acquired (step S125).
- the standard number of frames is five.
- the number of frames is not limited to five and may be an arbitrary number of frames.
- If the moving object exists in a moving object extraction area (step S120: YES), the moving object trajectory estimation unit 12 estimates the trajectory of the moving object based on image data of the reduced number of frames and the measurement results of the millimeter wave radar 21 obtained during the period in which that image data was acquired (step S135).
- the reduced number of frames means the number of frames smaller than the “standard number of frames” in step S125 described above, and is “3 frames” in the present embodiment.
- the number of frames is not limited to three, and may be an arbitrary number of frames smaller than the standard number of frames.
- In step S135, unlike step S125, the trajectory of the moving object is estimated based on image data of the reduced number of frames and the measurement results of the millimeter wave radar 21 obtained during the corresponding period, so the time required to estimate the trajectory is shorter than in step S125.
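- The effect of the reduced frame count can be sketched as follows: the trajectory is fitted over only the most recent observations, so fewer frames mean less collection time before an estimate is available. The constant-velocity fit is an illustrative assumption.

```python
STANDARD_FRAMES = 5   # step S125
REDUCED_FRAMES = 3    # step S135, first embodiment

def estimate_object_trajectory(observations, in_extraction_area):
    """observations: time-ordered (t, x, y) fused camera/radar positions.
    Returns a function t -> (x, y) extrapolating the object's motion."""
    n = REDUCED_FRAMES if in_extraction_area else STANDARD_FRAMES
    window = observations[-n:]          # only the newest n frames
    t0, x0, y0 = window[0]
    t1, x1, y1 = window[-1]
    vx = (x1 - x0) / (t1 - t0)          # constant-velocity fit over the window
    vy = (y1 - y0) / (t1 - t0)
    return lambda t: (x1 + vx * (t - t1), y1 + vy * (t - t1))
```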
- After step S125 or step S135, the collision estimation unit 16 estimates the presence or absence of a collision between the host vehicle and the moving object based on the trajectory of the host vehicle estimated in step S100 and the trajectory of the moving object estimated in step S125 or step S135 (step S140).
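- Step S140 can be sketched as checking whether the planned passing positions of vehicle and object coincide at matching planned passing times; the sampling grid and the collision radius below are illustrative assumptions.

```python
import math

def estimate_collision(vehicle_traj, object_traj,
                       horizon_s=3.0, dt=0.1, collision_radius=1.0):
    """vehicle_traj, object_traj: functions t -> (x, y) in a common frame.
    Returns (True, t) at the first sampled time the two predicted positions
    come within collision_radius of each other, else (False, None)."""
    steps = int(horizon_s / dt) + 1
    for i in range(steps):
        t = i * dt
        vx, vy = vehicle_traj(t)
        ox, oy = object_traj(t)
        if math.hypot(vx - ox, vy - oy) < collision_radius:
            return True, t
    return False, None
```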
- Steps S115 to S140 will now be described concretely using the examples of FIG. 3 and FIG. 6.
- In the example of FIG. 3, the moving object trajectory estimation unit 12 recognizes the person m2 as a moving object in the moving object extraction area Ar1, and recognizes the person m3 as a moving object on the sidewalk beside the oncoming lane Ln10. In step S115 it is therefore determined that there are multiple moving objects (the two persons m2 and m3) on the traveling direction D1 side. Since the person m2 is inside the moving object extraction area Ar1, step S135 is executed and the trajectory Tr2 of the person m2 is estimated using frame images of three frames.
- For the person m2, it is determined in step S140 that a collision will occur.
- Since the person m3 is outside the moving object extraction areas, step S125 is executed and the trajectory Tr3 of the person m3 is estimated using frame images of five frames; for the person m3, it is determined in step S140 that no collision will occur.
- In the example of FIG. 6, on the other hand, the person m3 is inside the moving object extraction area Ar3, so step S135 is executed for the person m3 as it was for the person m2, and the trajectory Tr4 of the person m3 is estimated using frame images of three frames.
- The trajectory Tr4 of the person m3 estimated in the example of FIG. 6 is the same as the trajectory Tr3 estimated in the example of FIG. 3; therefore, also in the example of FIG. 6, it is determined in step S140 that no collision will occur.
- After step S140 is executed, the process returns to step S100 described above.
- When the result of step S140 is that a collision will occur, the collision estimation device 10 notifies the brake ECU 201 and the alarm ECU 202 of the predicted collision and of information on the collision position, and based on that information the response operations for collision avoidance described above are performed.
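- Putting steps S100 through S140 together, the loop of FIG. 2 has roughly the following shape. This is a schematic only: the sensor and ECU interfaces are injected as callables with hypothetical names, and the helpers are the sketches above (including contains() from the Ar2 sketch).

```python
def collision_estimation_loop(ignition_on, sense, set_extraction_areas,
                              estimate_vehicle_traj, estimate_object_traj,
                              estimate_collision, notify_ecus):
    """Schematic of the FIG. 2 loop; all arguments are injected callables."""
    while ignition_on():
        vehicle_traj = estimate_vehicle_traj()              # step S100
        scene = sense()                                     # radar + camera
        areas = set_extraction_areas(scene.shields,         # steps S105-S110
                                     scene.direction_change)
        for obj in scene.moving_objects:                    # step S115
            in_area = any(contains(a, obj.position)         # step S120
                          for a in areas)
            obj_traj = estimate_object_traj(obj.history, in_area)   # S125/S135
            hit, t_hit = estimate_collision(vehicle_traj, obj_traj)  # S140
            if hit:
                notify_ecus(t_hit)   # brake ECU 201 and alarm ECU 202
```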
- As described above, the collision estimation device 10 of the first embodiment sets, within the area near the shield (vehicle VL3), the region near the outer peripheral surface (the front end surface S1) of the shield facing the trajectory of the vehicle after the change ΔD in the traveling direction D1 indicated by the direction change information as the moving object extraction area Ar3, so collision estimation can be performed with high accuracy even when the traveling direction of the vehicle changes.
- When there is no change in the traveling direction, a rectangular region in plan view, with one side extending to the second point P2 separated by the second distance L2, is set as the moving object extraction area Ar2.
- When the change in the traveling direction D1 indicated by the direction change information is a change ΔD toward a direction passing in front of the front end face S1 of the shield (vehicle VL3), a rectangular area in plan view is set as the moving object extraction area Ar3: one side runs from the third point P3, separated by the predetermined third distance L3 along the traveling direction D1 from the center C1 of the front end face S1 (taken along the direction orthogonal to D1), to the fourth point P4, separated from the third point P3 by the predetermined fourth distance L4 in the orthogonal direction on the side away from the trajectory Tr0 of the vehicle; another side runs from the fourth point P4 to the fifth point P5, separated by the predetermined fifth distance L5 along the traveling direction D1. The moving object extraction area is thus set in accordance with the trajectory of the vehicle VL1 after its traveling direction changes.
- When a moving object (the person m2 or the person m3) is recognized in the moving object extraction area, the presence or absence of a collision is estimated using a trajectory of the moving object estimated from information obtained from the first sensor (the millimeter wave radar 21 and the imaging device 22) over a shorter time than when the moving object is recognized in a region different from the moving object extraction area.
- Therefore, even in a situation such as a moving object (the person m2 or m3) appearing from behind a shield (the vehicle VL2 or VL3) and heading toward the trajectory Tr0 of the vehicle or the trajectory after the change in the traveling direction D1, where relatively little time remains before the response operation for collision avoidance must be performed, collision estimation can be completed in a short time.
- The direction change information acquisition unit 15 acquires the direction change information based on the value obtained from the steering angle sensor 24, that is, the steering wheel steering angle of the host vehicle, so the change in the traveling direction D1 of the vehicle VL1 can be identified with high accuracy.
- In the first embodiment, the "reduced number of frames" in step S135 is fixed at three. In the second embodiment, the reduced number of frames in step S135 is determined according to the distance between the trajectory of the vehicle after the change in traveling direction and the shield.
- The device configuration of the collision estimation device 10 of the second embodiment and the rest of the collision estimation process are the same as in the first embodiment.
- As shown in FIG. 8, the reduced number of frames in step S135 differs between the case where the trajectory of the host vehicle VL1 after the change in traveling direction is the trajectory Tr11 and the case where it is the trajectory Tr12: for the trajectory Tr11 the reduced number of frames is two, and for the trajectory Tr12 it is three. The difference comes from the distance between each trajectory and the front end surface S1 of the vehicle VL3. The distance L11 between the trajectory Tr11 and the front end surface S1 is smaller than a predetermined threshold distance Lth, so the reduced number of frames is set to two.
- By contrast, the distance L12 between the trajectory Tr12 and the front end surface S1 is larger than the threshold distance Lth, so the reduced number of frames is set to three.
- In the present embodiment, "the distance between the trajectory and the vehicle VL3" means the distance between each trajectory and the center C1 of the front end surface S1 along the direction orthogonal to the traveling direction D1; it may instead mean the distance between each trajectory and the end of the front end surface S1 closest to the initial trajectory Tr0 of the vehicle VL1.
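- In code, the second embodiment's selection reduces to a threshold comparison; the numeric value of Lth below is an assumption, since the patent does not give one.

```python
def reduced_frame_count(distance_to_shield, lth=3.0):
    """Pick the reduced frame count for step S135 (second embodiment).

    distance_to_shield: distance between the host vehicle's post-turn
    trajectory and the shield's front end surface S1 (e.g. L11 or L12).
    lth: threshold distance Lth; the 3.0 m default is illustrative only.
    """
    return 2 if distance_to_shield < lth else 3
```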
- In this way, the number of frame images used to estimate the trajectory of the person m3 is set smaller the closer the trajectory of the host vehicle VL1 after the change in traveling direction is to the vehicle VL3, the shield.
- The time required for collision estimation can therefore be shortened as the trajectory of the host vehicle VL1 comes closer to the shield, which helps prevent the subsequent collision-avoidance response from being too late.
- the collision estimation device of the second embodiment described above has the same effect as the collision estimation device 10 of the first embodiment.
- In addition, the smaller the distance between the trajectory of the host vehicle VL1 after the change in traveling direction and the shield, the smaller the number of frames of information from the first sensor (the millimeter wave radar 21 and the imaging device 22) used to estimate the trajectory of the moving object, and the presence or absence of a collision is estimated using the trajectory estimated from that information.
- Collision estimation can therefore be completed in a shorter time even in situations where little time remains for the collision-avoidance response, improving the possibility of avoiding a collision.
- In the first and second embodiments, the moving object extraction areas used when there is no change in the traveling direction (the moving object extraction areas Ar1 and Ar2) are each set based on the position and size of a single vehicle (vehicle VL2 or vehicle VL3), but the present disclosure is not limited thereto.
- A moving object extraction area may be set based on the positions and sizes of multiple shields. Specifically, for example, as shown in FIG. 9, suppose a vehicle VL4, stopped at a distance ΔL1 from the vehicle VL3 along the traveling direction, is recognized as a shield together with the vehicle VL3 described above.
- In this case, the moving object extraction area Ar4 may be set as a rectangular area with one side running from the center C1 to a point P6 separated by a predetermined sixth distance L6 in the orthogonal direction toward the trajectory Tr0 of the vehicle VL1, and another side running along the traveling direction D1 to a point P7 separated by a predetermined seventh distance L7 from the rear end E2 of the innermost of the shields (the vehicle VL4 of the two vehicles VL3 and VL4).
- the sixth distance L6 may be, for example, 1.5 m as in the above-described first distance L1.
- the seventh distance L7 may be, for example, 5.0 m as in the case of the second distance L2 described above.
- the values of the sixth distance L6 and the seventh distance L7 are not limited to these values, and may be set to arbitrary values.
- Likewise, the moving object extraction area set when there is a change in the traveling direction and the change is toward a direction passing in front of the front end face of the shield may be set based on the positions and sizes of multiple shields. Specifically, for example, as shown in FIG. 10, suppose a vehicle VL5, stopped beside the vehicle VL3 in the direction orthogonal to the traveling direction D1 on the side away from the trajectory Tr0, is recognized as a shield together with the vehicle VL3.
- In this case, the moving object extraction area Ar5 may be set as a rectangular area in plan view having as vertices: the third point P3, on the near side of the front end face by the third distance L3 described above; a ninth point P9, separated by the fourth distance L4 described above in the direction away from the trajectory Tr0 from the shield farthest from the trajectory Tr0 among the shields (the vehicle VL5 of the two vehicles VL3 and VL5); and a tenth point P10, separated from the ninth point P9 by the fifth distance L5 along the traveling direction D1.
- In the embodiments, the moving object extraction area is used to determine the number of frame images used when estimating the trajectory of a moving object: the reduced number of frames when the moving object is recognized inside the area, and the standard number of frames when it is recognized outside the area. However, the present disclosure is not limited thereto. The area may instead be used to determine whether to estimate the trajectory of a moving object and perform collision estimation at all: the trajectory is estimated and collision estimation is performed when the moving object is recognized inside the area, and neither is performed when the moving object is recognized only in a region different from the area.
- The moving object extraction area used when there is no change in the traveling direction of the host vehicle and there is a single shield is not limited to the positions and shapes of the moving object extraction areas Ar1 and Ar2 of the first and second embodiments.
- For example, a rectangular area having as a vertex the end point of the shield's front end face (end face S1) closest to the trajectory Tr0 of the host vehicle (vehicle VL1), with sides parallel to the traveling direction D1, may be used as the moving object extraction area.
- Alternatively, a rectangular area having as a vertex a point a predetermined distance, in the direction orthogonal to the traveling direction D1, from that closest end point of the front end face, with sides parallel to the traveling direction D1, may be used as the moving object extraction area.
- Alternatively, a circular area of predetermined radius may be used as the moving object extraction area, centered on the intersection of a virtual line parallel to the traveling direction D1 passing through the center of the shield's front end face (taken along the direction orthogonal to D1) and a virtual line orthogonal to D1 passing through the center (taken along D1) of the side face of the shield closer to the trajectory Tr0 of the host vehicle (vehicle VL1).
- The moving object extraction areas in all of these examples can be set using the position and size of the shield; in general, any area determined with reference to the position and size of the shield may be set as the moving object extraction area.
- In the embodiments, the direction change information means a change in the steering wheel steering angle obtained from the steering angle sensor 24, but the present disclosure is not limited to this.
- For example, it may mean a change in the tire steering angle obtained from a tire steering angle sensor (not shown) mounted on the host vehicle.
- it may mean a change in the yaw rate obtained from the yaw rate sensor 23.
- it may mean information indicating the operating state of the direction indication device (turn indicator) mounted on the host vehicle.
- If the operation state of the turn indicator indicates a right turn, the direction change information indicates that the change in the traveling direction of the host vehicle is to the right; if it indicates a left turn, the direction change information indicates that the change is to the left.
- When the host vehicle has map information for a navigation device (not shown), the direction change information may mean information obtained by identifying, based on the map information, the type of road on which the vehicle is traveling. For example, if the road type identified from the map information is a left-turn lane, the information indicates that the change in the traveling direction of the vehicle is to the left.
- The direction change information may also mean the type of marking painted on the road in the captured image obtained by the imaging device 22; for example, if the marking indicates a left turn, the information indicates that the change in the traveling direction of the host vehicle is to the left.
- In the embodiments, the vehicle trajectory estimation unit 11 estimates the trajectory of the host vehicle based on values obtained periodically from the yaw rate sensor 23, the steering angle sensor 24, and the vehicle speed sensor 25, but the present disclosure is not limited thereto.
- For example, when the host vehicle is equipped with a GPS device, its trajectory may be estimated based on the history of position information obtained chronologically from the GPS device.
- In the embodiments, the moving object trajectory estimation unit 12 estimates the trajectory of a moving object based on values obtained periodically from the millimeter wave radar 21 and values (frame image data) obtained periodically from the imaging device 22, but the trajectory may instead be estimated based only on values obtained periodically from the millimeter wave radar 21.
- In that configuration, the millimeter wave radar 21 alone corresponds to a subordinate concept of the first sensor in the means for solving the problem.
- Alternatively, the trajectory of the moving object may be estimated based only on values (frame image data) obtained periodically from the imaging device 22; in that configuration, the imaging device 22 corresponds to a subordinate concept of the first sensor.
- In the second embodiment, a single threshold distance Lth is set, but multiple threshold distances may be set. In that case, the time required for collision estimation can be adjusted more finely according to the distance between the trajectory of the host vehicle VL1 and the shield (vehicle VL3), making it possible to more reliably prevent the subsequent collision-avoidance response from being too late.
- In the embodiments, part of the configuration implemented by hardware may be replaced by software, and conversely, part of the configuration implemented by software may be replaced by hardware.
- The "computer-readable recording medium" is not limited to portable recording media such as flexible disks and CD-ROMs; it also includes internal storage devices in the computer, such as various RAMs and ROMs, and external storage devices fixed to the computer, such as hard disks. That is, "computer-readable recording medium" has a broad meaning that includes any recording medium on which data can be fixed non-temporarily.
- the present disclosure is not limited to the above-described embodiments, and can be implemented with various configurations without departing from the scope of the present disclosure.
- The technical features in the embodiments that correspond to the technical features in the modes described in the summary section can be replaced or combined as appropriate to solve some or all of the problems described above, or to achieve some or all of the effects described above. Unless a technical feature is described as essential in this specification, it can be omitted as appropriate.
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201880042560.XA CN110809792B (zh) | 2017-06-29 | 2018-04-20 | Collision estimation device and collision estimation method |
| US16/727,536 US11351997B2 (en) | 2017-06-29 | 2019-12-26 | Collision prediction apparatus and collision prediction method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017127191A JP6747389B2 (ja) | 2017-06-29 | 2017-06-29 | Collision estimation device and collision estimation method |
| JP2017-127191 | 2017-06-29 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/727,536 Continuation US11351997B2 (en) | 2017-06-29 | 2019-12-26 | Collision prediction apparatus and collision prediction method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019003603A1 true WO2019003603A1 (ja) | 2019-01-03 |
Family
ID=64740588
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/016260 Ceased WO2019003603A1 (ja) | 2018-04-20 | Collision estimation device and collision estimation method |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11351997B2 (en) |
| JP (1) | JP6747389B2 (en) |
| CN (1) | CN110809792B (en) |
| WO (1) | WO2019003603A1 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020166338A1 (ja) * | 2019-02-12 | 2020-08-20 | 株式会社デンソー | Driving assistance device |
| CN115432007A (zh) * | 2022-09-28 | 2022-12-06 | 深圳海星智驾科技有限公司 | Collision detection method and device for vehicle automated driving systems, and electronic apparatus |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10643084B2 (en) | 2017-04-18 | 2020-05-05 | nuTonomy Inc. | Automatically perceiving travel signals |
| US10816985B2 (en) * | 2018-04-17 | 2020-10-27 | Baidu Usa Llc | Method on moving obstacle representation for trajectory planning |
| US10960886B2 (en) * | 2019-01-29 | 2021-03-30 | Motional Ad Llc | Traffic light estimation |
| DE102020214033A1 (de) * | 2020-11-09 | 2022-05-12 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method and device for controlling a safety device of a vehicle, and safety system for a vehicle |
| DE102020214031A1 (de) * | 2020-11-09 | 2022-05-12 | Robert Bosch Gesellschaft mit beschränkter Haftung | Method and device for controlling a safety device of a vehicle, and safety system for a vehicle |
| CN116080635B (zh) * | 2021-11-08 | 2025-12-05 | 北京智行者科技股份有限公司 | Collision detection method, electronic device, and mobile apparatus |
| CN115009272B (zh) * | 2022-07-08 | 2023-01-31 | 北京捷升通达信息技术有限公司 | Fully autonomous vehicle obstacle-crossing method based on lidar roadblock classification and recognition |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008187347A (ja) * | 2007-01-29 | 2008-08-14 | Toshiba Corp | In-vehicle navigation device, road marking identification program, and road marking identification method |
| JP2013024709A (ja) * | 2011-07-20 | 2013-02-04 | Nissan Motor Co Ltd | Navigation device and navigation method |
| JP5729416B2 (ja) * | 2013-04-26 | 2015-06-03 | 株式会社デンソー | Collision determination device and collision mitigation device |
| JP2015203922A (ja) * | 2014-04-11 | 2015-11-16 | 株式会社デンソー | Recognition support system |
| WO2016104198A1 (ja) * | 2014-12-25 | 2016-06-30 | クラリオン株式会社 | Vehicle control device |
Family Cites Families (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102006007173A1 (de) * | 2006-02-08 | 2007-08-09 | Valeo Schalter Und Sensoren Gmbh | Vehicle surroundings detection system, in particular for detecting objects approaching the vehicle from the side and/or approaching cross traffic, and method therefor |
| KR101141874B1 (ko) * | 2008-06-04 | 2012-05-08 | 주식회사 만도 | Apparatus and method for detecting dangerous areas, and pedestrian detection apparatus using the same |
| CN101645204A (zh) * | 2009-07-30 | 2010-02-10 | 杨迎春 | Inter-vehicle hazard-avoidance communication device and data processing method therefor |
| DE102009041556A1 (de) * | 2009-09-15 | 2010-06-17 | Daimler Ag | Vehicle with blind spot assistance device |
| CN102096803B (zh) * | 2010-11-29 | 2013-11-13 | 吉林大学 | Machine-vision-based pedestrian safety state recognition system |
| RU2564268C1 (ru) * | 2011-08-10 | 2015-09-27 | Тойота Дзидося Кабусики Кайся | Driving assistance device |
| JP5863481B2 (ja) * | 2012-01-30 | 2016-02-16 | 日立マクセル株式会社 | Vehicle collision risk prediction device |
| DE102013113054B4 (de) * | 2012-12-03 | 2022-01-27 | Denso Corporation | Target detection device for avoiding a collision between a vehicle and a target detected by a sensor mounted on the vehicle |
| JP2014151861A (ja) * | 2013-02-13 | 2014-08-25 | Fuji Heavy Ind Ltd | Vehicle approach notification device |
| KR101843073B1 (ко) * | 2013-05-31 | 2018-03-28 | 도요타 지도샤(주) | Vehicle driving assistance device and on-board computer |
| DE102013010235A1 (de) * | 2013-06-18 | 2014-12-18 | Volkswagen Aktiengesellschaft | Motor vehicle with a detection device for other road users |
| CN103680208B (zh) * | 2013-12-16 | 2016-05-04 | 宁波工程学院 | Oncoming vehicle warning system |
| DE102014205014A1 (de) * | 2014-03-18 | 2015-09-24 | Ford Global Technologies, Llc | Method and device for detecting moving objects in the surroundings of a vehicle |
| CN104036275B (zh) * | 2014-05-22 | 2017-11-28 | 东软集团股份有限公司 | Method and device for detecting target objects in a vehicle's blind zone |
| US9649148B2 (en) | 2014-07-24 | 2017-05-16 | Arthrocare Corporation | Electrosurgical system and method having enhanced arc prevention |
| JP2017097681A (ja) * | 2015-11-26 | 2017-06-01 | マツダ株式会社 | Sign recognition system |
| US10479373B2 (en) * | 2016-01-06 | 2019-11-19 | GM Global Technology Operations LLC | Determining driver intention at traffic intersections for automotive crash avoidance |
| CN106530825B (zh) * | 2016-11-16 | 2019-01-22 | 淮阴工学院 | Method for detecting traffic conflicts between electric bicycles and cars based on an ST-MRF model |
- 2017
  - 2017-06-29: JP JP2017127191A patent/JP6747389B2/ja active Active
- 2018
  - 2018-04-20: WO PCT/JP2018/016260 patent/WO2019003603A1/ja not_active Ceased
  - 2018-04-20: CN CN201880042560.XA patent/CN110809792B/zh active Active
- 2019
  - 2019-12-26: US US16/727,536 patent/US11351997B2/en active Active
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2008187347A (ja) * | 2007-01-29 | 2008-08-14 | Toshiba Corp | In-vehicle navigation device, road marking identification program, and road marking identification method |
| JP2013024709A (ja) * | 2011-07-20 | 2013-02-04 | Nissan Motor Co Ltd | Navigation device and navigation method |
| JP5729416B2 (ja) * | 2013-04-26 | 2015-06-03 | 株式会社デンソー | Collision determination device and collision mitigation device |
| JP2015203922A (ja) * | 2014-04-11 | 2015-11-16 | 株式会社デンソー | Recognition support system |
| WO2016104198A1 (ja) * | 2014-12-25 | 2016-06-30 | クラリオン株式会社 | Vehicle control device |
Cited By (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2020166338A1 (ja) * | 2019-02-12 | 2020-08-20 | 株式会社デンソー | Driving assistance device |
| JP2020134981A (ja) * | 2019-02-12 | 2020-08-31 | 株式会社デンソー | Driving assistance device |
| JP7275623B2 (ja) | 2019-02-12 | 2023-05-18 | 株式会社デンソー | Driving assistance device |
| CN115432007A (zh) * | 2022-09-28 | 2022-12-06 | 深圳海星智驾科技有限公司 | Collision detection method and device for vehicle automated driving systems, and electronic apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| US20200130683A1 (en) | 2020-04-30 |
| JP6747389B2 (ja) | 2020-08-26 |
| JP2019012314A (ja) | 2019-01-24 |
| US11351997B2 (en) | 2022-06-07 |
| CN110809792B (zh) | 2022-05-24 |
| CN110809792A (zh) | 2020-02-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019003603A1 (ja) | Collision estimation device and collision estimation method | |
| CN109891262B (zh) | Object detection device | |
| JP6084192B2 (ja) | Object recognition device | |
| US20180182247A1 (en) | Apparatus and method for supporting collision avoidance of vehicle | |
| EP3608635A1 (en) | Positioning system | |
| JP7119428B2 (ja) | Driving assistance device | |
| US10885789B2 (en) | Device and method for lateral guidance assistance for a road vehicle | |
| EP3007149B1 (en) | Driving assistance device for vehicles and onboard computer | |
| JP2017010498A (ja) | Vehicle control device | |
| WO2017014080A1 (ja) | Driving support system | |
| WO2014033957A1 (ja) | Collision determination device and collision determination method | |
| JP2014228943A (ja) | External-environment sensing device for a vehicle, and axis-misalignment correction program and method therefor | |
| US9540002B2 (en) | Target recognition apparatus | |
| JP2012089114A (ja) | Obstacle recognition device | |
| WO2018207782A1 (ja) | Parking space detection device | |
| JP2010078387A (ja) | Lane determination device | |
| JP6722084B2 (ja) | Object detection device | |
| JP6411933B2 (ja) | Vehicle state determination device | |
| CN107408346A (zh) | Vehicle control device and vehicle control method | |
| JP2019012314A5 (en) | | |
| WO2019003602A1 (ja) | Collision estimation device and collision estimation method | |
| KR20140076866A (ko) | Method and apparatus for measuring the curvature of a driving lane using the behavior of a preceding vehicle | |
| US7057502B2 (en) | Vehicle drive assist apparatus | |
| JP7323356B2 (ja) | Parking assistance device and parking assistance method | |
| JP5166975B2 (ja) | Vehicle periphery monitoring device and vehicle periphery monitoring method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18823192; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18823192; Country of ref document: EP; Kind code of ref document: A1 |