JP2014241036A - Vehicle driving assist system - Google Patents

Vehicle driving assist system

Info

Publication number
JP2014241036A
JP2014241036A (Application JP2013122865A)
Authority
JP
Japan
Prior art keywords
target
vehicle
unit
calculated
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2013122865A
Other languages
Japanese (ja)
Inventor
Yasushi Obata (小幡 康)
Hiroshi Kameda (亀田 洋志)
Original Assignee
Mitsubishi Electric Corp (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp (三菱電機株式会社)
Priority to JP2013122865A priority Critical patent/JP2014241036A/en
Publication of JP2014241036A publication Critical patent/JP2014241036A/en
Pending legal-status Critical Current


Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/80 — Technologies aiming to reduce greenhouse gasses emissions common to all road transportation technologies
    • Y02T 10/84 — Data processing systems or methods, management, administration

Abstract

PROBLEM TO BE SOLVED: To provide a vehicle driving assist system that decides, before the own vehicle begins steering, whether the steering is permitted.
SOLUTION: A steering permission deciding unit 3 includes: an own vehicle motion prediction block 31 that calculates the future predicted moving range of the own vehicle in the case where it is steered; a target motion prediction block 32 that uses the target tracks and target types recorded in a tracked-track identification result 24 to extract targets that could collide with the own vehicle when it is steered, and calculates the future predicted moving range of each extracted target over the steering period; and a collision probability evaluation block 33 that detects the possibility of a collision from the predicted moving range of the own vehicle calculated by block 31 and the predicted moving ranges of the targets calculated by block 32. Whether steering is permitted can thus be decided before the own vehicle is steered.

Description

  The present invention relates to a vehicle driving support device that observes surrounding automobiles, two-wheeled vehicles, pedestrians, or stationary targets using sensors such as radars and cameras mounted on the automobile, and supports smooth driving so that collisions with them are avoided.

  Many papers and patent documents describe technologies for assisting driving based on information obtained from vehicle-mounted sensors, and various devices and methods for realizing them have been proposed.

A typical example is collision prevention.
This technology observes a target with a radar or an optical sensor, determines that a collision is possible when the distance and closing speed to the own vehicle reach threshold values, and then issues a warning or controls the driving of the car itself.
Implementation examples of this technology are described in the "collision avoidance notification system" of Patent Document 1 and the "collision risk estimation device and driver support device" of Patent Document 2 below.

Patent Document 1: JP 2006-31443 A
Patent Document 2: JP 2008-282097 A
Patent Document 3: Japanese Patent Application Laid-Open No. 08-271617

Since the conventional apparatus is configured as described above, it only guards against imminent danger arising after the own vehicle has begun to steer; the decision to start steering itself must still be made by the driver.
There is therefore a problem in that it cannot be determined, before the own vehicle is steered, whether or not steering is possible.

  The present invention provides a vehicle driving support device that determines whether or not steering is possible before steering the host vehicle.

  The vehicle driving support device according to the present invention comprises: an own vehicle motion prediction unit that calculates a predicted movement range of the host vehicle when it is steered; a target motion prediction unit that uses the target tracks estimated by a target tracking unit and the target types identified by a target identification unit to extract targets that may collide with the host vehicle during steering, and calculates a future predicted movement range for each extracted target over the steering period; and a collision probability evaluation unit that detects the possibility of a collision using the predicted movement range of the host vehicle calculated by the own vehicle motion prediction unit and the predicted movement ranges of the targets calculated by the target motion prediction unit, and determines whether or not steering is possible.

According to the present invention, the possibility of a collision is detected using the predicted movement range of the host vehicle calculated by the own vehicle motion prediction unit and the predicted movement ranges of the targets calculated by the target motion prediction unit, and whether or not steering is possible is determined from this.
As a result, it can be determined whether or not steering is possible before the host vehicle is steered.

FIG. 1 is a block diagram showing a vehicle driving support device according to Embodiment 1 of the present invention. FIG. 2 is a flowchart showing the target recognition procedure. FIG. 3 is an explanatory diagram showing an example of a target detection result. FIG. 4 is an explanatory diagram showing an example of a target tracking result. FIG. 5 is an explanatory diagram showing an example of the probability density of attributes for each target type. FIG. 6 is an explanatory diagram showing an example of a tracking target identification result. FIG. 7 is an explanatory diagram showing an example of the straight-line approximation of a tracking track and its residual. FIG. 8 is a flowchart showing the steering availability determination procedure. FIG. 9 is an explanatory diagram showing an example of a motion prediction result of the own vehicle. FIG. 10 is an explanatory diagram showing an example of an extraction result of collision candidate targets. FIG. 11 is an explanatory diagram showing an example of a target motion prediction result.

Embodiment 1.
FIG. 1 is a block diagram showing a vehicle driving support apparatus according to Embodiment 1 of the present invention.
In the figure, a sensor group 1 is a radar and a camera mounted on an automobile.
The target recognition unit 2 estimates the states of targets around the own vehicle.
The steering availability determination unit 3 uses the target states obtained from the target recognition unit 2 to determine whether the host vehicle can be steered.
The user 4 is the driver of the own vehicle.

In the target recognition unit 2, the target detection unit 21 extracts observed values of target positions from the signals of the sensor group 1 and stores them as accumulated observation values 22.
The target tracking unit 23 tracks the time series of the observation values extracted by the target detection unit 21, estimates the target track including the target position and the target speed, and stores it as the tracking track / identification result 24.
The target identifying unit 25 identifies a target type using the target track estimated by the target tracking unit 23 and stores it as a tracking track / identification result 24.

In the steering possibility determination unit 3, the host vehicle motion prediction unit 31 calculates a future host vehicle predicted movement range when the host vehicle is steered.
The target motion prediction unit 32 uses the target track and target type stored in the tracking track / identification result 24 to extract a target that may cause a collision by steering the host vehicle.
Further, a future target predicted movement range is calculated for the extracted target while the vehicle is steered.
The collision probability evaluation unit 33 detects the possibility of a collision using the predicted movement range of the vehicle calculated by the vehicle motion prediction unit 31 and the predicted movement ranges of the targets calculated by the target motion prediction unit 32, and determines whether or not steering is possible.

  In the example of FIG. 1, it is assumed that the target recognition unit 2 and the steering availability determination unit 3, which are components of the vehicle driving support device, are implemented in hardware, for example on a semiconductor circuit board on which a microcomputer or the like is mounted.

Next, the operation will be described.
The track extraction processing procedure of the first embodiment, for the observation values of one sampling time, is shown in FIGS. 2 and 8.
FIG. 2 is a flowchart showing a target recognition procedure.
The operation of the target recognition unit 2 will be described with reference to FIG.

In step ST1, "target detection", the target detection unit 21 computes information on target positions from the signals of the sensor group 1 by conventional signal processing.
This yields static information, based on target positions, at the current time.

FIG. 3 is an explanatory diagram showing an example of the target detection result.
FIG. 3 shows the result obtained by observing the surroundings of the vehicle with two sensors A and B and detecting the target.
Sensor A and sensor B detect substantially the same position for the same target, although the numbers of detected targets differ.
The target detection unit 21 stores this target position as the accumulated observation value 22.

In step ST2, "target tracking", the target tracking unit 23 estimates accurate target positions and speeds from the time series of the target position information obtained in step ST1, by means of tracking processing.
This time-series processing adds speed to the target motion information, making it dynamic, and makes the information obtained in step ST1 more accurate.
An example of processing when a Kalman filter is used for this estimation is shown below.

First, the observation value at the latest observation time is read from the sensor group 1.
Next, the gates of the existing tracks are calculated, it is checked whether or not the input observation values fall inside those gates, and it is determined with which track each incoming observation value can be correlated.
Here, a track is assumed to be a vector with the following four elements, consisting of the target position and velocity in the two-dimensional x-y space.








When the vector of observation values is obtained in polar coordinates of distance and azimuth, the corresponding quantities are expressed by the following equations (7) to (9).




Whether an observation value is inside or outside the gate is determined by whether the inequality of the following equation (10) holds.

Using the observation values determined to be inside the gate by the above gate test, the estimated motion state of each track at the latest time is calculated.
However, when there are multiple existing tracks and a given observation value falls inside the gates of several of them, a correlation determination process is required to associate observation values with existing tracks one-to-one.
Since there are often multiple targets around an automobile, this correlation problem is particularly important.
Several methods that determine this correlation while generating multiple hypotheses have been proposed; one example is the "target tracking device" of Patent Document 3.
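The patent gives the gate test only as equation (10), which is not reproduced in this text. As an illustrative aid (not part of the original disclosure), the standard form of such a test is a chi-square gate on the Mahalanobis distance between an observation and a track's predicted observation; the function name and the threshold value below are assumptions.

```python
import numpy as np

def in_gate(z, z_pred, S, gate_threshold=9.21):
    """Chi-square gate test (illustrative): accept observation z for a track
    if the Mahalanobis distance to the predicted observation z_pred, under
    innovation covariance S, is below the threshold.
    9.21 is roughly the 99% point of chi-square with 2 degrees of freedom."""
    d = z - z_pred
    d2 = float(d @ np.linalg.inv(S) @ d)  # squared Mahalanobis distance
    return d2 <= gate_threshold

# A nearby observation falls inside the gate; a distant one does not.
S = np.diag([1.0, 1.0])
print(in_gate(np.array([0.5, -0.5]), np.zeros(2), S))
print(in_gate(np.array([5.0, 5.0]), np.zeros(2), S))
```

Observations passing this test for more than one track are then handed to the correlation (assignment) step described above.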

When observation values have been assigned to existing tracks by the correlation determination, the smoothed states of those tracks at the observation time are calculated, and for each updated track the likelihood of the track corresponding to the correlation result is also calculated.
The smoothed vector is calculated by the following equation (12).

The smoothed error covariance matrix is calculated by the following equation (14).
Further, the likelihood of the track is calculated by the following equation (15), on the assumption that the probability distribution of the observed value is a Gaussian distribution centered on the predicted observed value.
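Equations (12)-(15) are not reproduced in this text; as an illustrative aid (not the patent's exact formulas), a generic Kalman filter measurement update produces the same three quantities named above: the smoothed state, the smoothed error covariance, and a Gaussian observation likelihood. All names are assumptions.

```python
import numpy as np

def kalman_update(x_pred, P_pred, z, H, R):
    """One Kalman measurement update (illustrative sketch): returns the
    smoothed state (cf. eq. (12)), the smoothed error covariance
    (cf. eq. (14)), and the Gaussian likelihood of the observation under
    the predicted observation distribution (cf. eq. (15))."""
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    nu = z - H @ x_pred                   # innovation (residual)
    x_smooth = x_pred + K @ nu
    P_smooth = (np.eye(len(x_pred)) - K @ H) @ P_pred
    m = len(z)
    lik = float(np.exp(-0.5 * nu @ np.linalg.inv(S) @ nu)
                / np.sqrt((2 * np.pi) ** m * np.linalg.det(S)))
    return x_smooth, P_smooth, lik

# 4-element track [x, y, vx, vy] as in the text, position-only observation.
H = np.array([[1., 0., 0., 0.], [0., 1., 0., 0.]])
x, P, lik = kalman_update(np.array([0., 0., 1., 0.]), np.eye(4),
                          np.array([0.2, -0.1]), H, 0.5 * np.eye(2))
```

The smoothed position moves part of the way toward the observation, and the likelihood feeds the multi-hypothesis correlation step described above.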

FIG. 4 is an explanatory diagram showing an example of the target tracking result.
FIG. 4 shows results obtained by the tracking process: when target speed is added to the detected target positions, moving-target tracks and stationary-target tracks are obtained as the tracking result.
The target tracking unit 23 stores the target track as a tracking track / identification result 24.

  In the next step ST3, "target identification", the target identification unit 25 applies a Bayesian estimation method or the like and classifies each tracked target, from attribute information such as target speed, into automobile, two-wheeled vehicle, pedestrian, or guardrail (stationary object).

FIG. 5 is an explanatory diagram showing an example of the probability density of attributes for each target type.
FIG. 5 is an example of the probability density functions of the attributes for each target type, where the attributes are of two kinds: the speed and the target size (spread) on the image obtained from the optical sensor.


In the identification, the likelihoods of each attribute are multiplied for each target type, and the type with the maximum product is taken as the type of the tracked target.
That is, the target type that maximizes the following equation (16) is set as the candidate type of the tracked target.

Whether the candidate type can be confirmed is determined as follows.
For example, if the target type that achieves the maximum in equation (16) is a person, the type of the tracked target is confirmed as "person" when the following equation (17) is satisfied.

Here, the posterior probability on the left side is calculated by the following equation (18).


FIG. 6 is an explanatory diagram showing an example of the tracking target identification result.
FIG. 6 is an example of the information obtained by the above identification processing: the tracked targets are separated into automobile, two-wheeled vehicle, pedestrian, and guardrail, and the position and speed of each tracked target are also shown.
The target identification unit 25 stores this identification result as a tracking track / identification result 24.

In addition to the conventional identification result, the target identification unit 25 of the first embodiment also identifies whether the motion of each target is stable or unstable.
This stability reflects, for example, the risk of a collision when the vehicle turns right, as follows:
- A motorcycle that forcibly cuts in between other cars is unstable and more likely to collide.
- An automobile that rarely changes lanes is more stable and less likely to collide.

For unstable targets, more caution is needed during a right turn than for stable targets.
The movement change of the following equation (19) is calculated from the tracking trajectory formed by connecting the smoothed track values at past sampling times.

FIG. 7 is an explanatory diagram showing an example of the straight-line approximation of a tracking trajectory and its residual.
A target with a large movement change is judged to be an "unstable target".
If the following equation (20) holds, the target is identified as an unstable target; if the following equation (21) holds, it is identified as a stable target.
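Equations (19)-(21) are not reproduced in this text. As an illustrative aid (not the patent's exact formulas), the idea of FIG. 7 can be sketched as the mean residual of the tracked positions from a least-squares straight line, thresholded into unstable/stable; the function names and threshold are assumptions.

```python
import numpy as np

def motion_instability(track_xy):
    """Mean absolute residual of tracked positions from a least-squares
    straight line (illustrative stand-in for the movement change of
    eq. (19))."""
    xs, ys = track_xy[:, 0], track_xy[:, 1]
    a, b = np.polyfit(xs, ys, 1)          # fit y = a*x + b
    return float(np.abs(ys - (a * xs + b)).mean())

def is_unstable(track_xy, threshold=0.5):
    """Large residual -> 'unstable target' (cf. eq. (20)); small residual
    -> stable target (cf. eq. (21)). Threshold value is an assumption."""
    return motion_instability(track_xy) > threshold

straight = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])   # lane-keeping
weaving  = np.array([[0., 0.], [1., 2.], [2., -2.], [3., 2.]])  # cutting in
print(is_unstable(straight), is_unstable(weaving))
```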

The above is the processing of the target recognition unit 2, which estimates the following for each tracked track:
- estimated position and speed of the tracked target;
- estimated error covariance matrix (smoothed error covariance matrix);
- target type and stable/unstable distinction.
These pieces of information are input to the steering availability determination unit 3 and used to determine whether steering is possible.

FIG. 8 is a flowchart showing a procedure for determining whether or not steering is possible.
The operation of the steering availability determination unit 3 will be described with reference to FIG.
While the target recognition unit 2 operates whenever observation values are obtained from the sensor group 1, the steering availability determination unit 3 is activated when the driver indicates an intention to turn right, for example with the turn signal.

In step ST4, "own vehicle motion prediction", the own vehicle motion prediction unit 31 calculates, with a right-turn motion model (steering motion model), the future positions and speeds of the own vehicle assuming it starts the right-turn motion at the current time, and calculates the predicted movement range up to completion of the right turn from the prediction error covariance matrix.
The prediction times are multiple discrete times, set at a fixed sampling interval, from the current time until completion of the right turn.

Here, the motion of the own vehicle at each sampling time is a vector with the four elements of the following equation (22), consisting of the position and speed of the vehicle in the two-dimensional x-y space.



A single representative value may be used, or several typical values may be prepared, a prediction made for each, and the results weighted and integrated.



FIG. 9 is an explanatory diagram illustrating an example of a motion prediction result of the own vehicle.
The ellipses in the figure indicate the predicted movement ranges from time t1 to time t4; each is centered on the future position predicted by the right-turn motion model and has a spread corresponding to the standard deviation of the prediction error covariance matrix.
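The patent does not reproduce its right-turn motion model or equation (22) here. As an illustrative aid (not the patent's disclosure), a common choice is a constant-turn-rate arc, with a position covariance that widens with the prediction horizon like the ellipses of FIG. 9; the linear covariance growth q*t and all names below are assumptions.

```python
import numpy as np

def predict_right_turn(x0, y0, speed, yaw0, yaw_rate, q, times):
    """Future own-vehicle positions on a constant-turn-rate arc, plus a
    position covariance growing with prediction time (illustrative
    stand-in for the steering motion model and prediction error
    covariance). yaw_rate must be nonzero; negative = right turn."""
    out = []
    for t in times:
        yaw = yaw0 + yaw_rate * t
        # closed-form position on a circular arc of radius speed/|yaw_rate|
        x = x0 + speed / yaw_rate * (np.sin(yaw) - np.sin(yaw0))
        y = y0 - speed / yaw_rate * (np.cos(yaw) - np.cos(yaw0))
        P = q * t * np.eye(2)   # spread of the FIG. 9 ellipse at time t
        out.append(((x, y), P))
    return out

# Four sampling times up to completion of the turn (cf. t1..t4 in FIG. 9),
# heading +x initially and turning right (negative y).
preds = predict_right_turn(0., 0., 5.0, 0.0, -0.3, 0.2, [1., 2., 3., 4.])
```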

In step ST5, "collision candidate target extraction", the target motion prediction unit 32 extracts targets in the vicinity of the right-turn motion trajectory of the host vehicle.
Specifically, only those moving targets are extracted whose maximum-movement region from the current tracked position (obtained by integrating, over the time required for the right turn, the speed with the target's maximum acceleration added) intersects the vehicle's right-turn trajectory.

FIG. 10 is an explanatory view showing an example of the result of extracting a collision candidate target.
The inside of the broken line frame in the figure is the collision candidate target.
Note that target A is excluded from the collision candidates because even its maximum-speed movement range does not intersect the vehicle's right-turn trajectory.

The target identification unit 25 also determines, from optical image information from a camera or the like in the sensor group 1, whether there is any possibility of a collision between a target and the host vehicle.
For example, when it is clear from the operation of its blinker that an oncoming vehicle cannot collide with the own vehicle, the target identification unit 25 makes this determination.
The target motion prediction unit 32 excludes the target determined by the target identification unit 25 as having no possibility of a collision with the host vehicle from the collision candidate target.
In this way, useless calculation objects can be reduced.
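Step ST5 can be sketched as follows, as an illustrative aid only: each target's maximum-reach disc (speed plus maximum acceleration integrated over the turn time) is intersected with the planned right-turn trajectory, and only intersecting targets are kept, as in the FIG. 10 example where target A drops out. The data layout and field names are assumptions.

```python
import numpy as np

def max_reach_radius(speed, a_max, turn_time):
    """Largest distance the target can cover during the right turn:
    its speed, with maximum acceleration added, integrated over the time
    required for the turn."""
    return speed * turn_time + 0.5 * a_max * turn_time ** 2

def collision_candidates(targets, own_trajectory, turn_time):
    """Keep only targets whose maximum-reach disc touches the own
    vehicle's planned right-turn trajectory (illustrative sketch of
    step ST5; target dict fields are hypothetical)."""
    traj = np.asarray(own_trajectory)
    keep = []
    for t in targets:
        r = max_reach_radius(t["speed"], t["a_max"], turn_time)
        dists = np.linalg.norm(traj - np.asarray(t["pos"]), axis=1)
        if dists.min() <= r:
            keep.append(t["id"])
    return keep

traj = [(0., 0.), (3., -1.), (5., -3.), (6., -6.)]   # sampled right-turn path
targets = [
    {"id": "A", "pos": (60., 40.), "speed": 2., "a_max": 1.},   # far away
    {"id": "B", "pos": (8., -8.), "speed": 10., "a_max": 2.},   # oncoming
]
print(collision_candidates(targets, traj, turn_time=3.0))
```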

In step ST6, "target motion prediction", the target motion prediction unit 32 calculates the future predicted movement range of each target, up to completion of the right turn, using a prediction error covariance matrix based on the current position and speed of the target calculated by the target tracking unit 23 and their smoothed error covariance matrix.
The prediction times are multiple discrete times, set at a fixed sampling interval, from the current time until completion of the right turn.
These sampling times are made to coincide exactly with the sampling times set in step ST4, "own vehicle motion prediction".
An element representing possible acceleration for each target type is added to the prediction error covariance matrix.

The target future predicted position is calculated by the following equation (27).






This parameter is set larger for targets whose movement is more uncertain, i.e. for higher-mobility targets (for example, automobiles and motorcycles are treated as having the same mobility, and pedestrians as having lower mobility).



The more unstable the target, the larger this parameter is set.
As the set value, several discrete values corresponding to the degree of instability may be prepared; alternatively, as shown in the following equation (33), it may be a value proportional to the residual average calculated in step ST3, "target identification".
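Equations (27)-(33) are not reproduced in this text. As an illustrative aid (not the patent's exact formulas), the covariance construction described above can be sketched as the extrapolated smoothed covariance plus two additive "future motion uncertainty" terms, one scaled by target-type mobility and one proportional to the tracking residual mean; the additive form and all parameter names are assumptions.

```python
import numpy as np

def target_prediction_covariance(P_smooth, F, q_mobility, residual_mean,
                                 c_instability=1.0):
    """Target prediction error covariance, sketched as extrapolation of
    the smoothed covariance plus a mobility term (per target type) and an
    instability term proportional to the residual mean of step ST3
    (cf. eq. (33))."""
    n = P_smooth.shape[0]
    return (F @ P_smooth @ F.T
            + q_mobility * np.eye(n)                       # mobility term
            + c_instability * residual_mean * np.eye(n))   # instability term

# Constant-velocity transition over one sampling interval (dt = 1) and a
# unit smoothed covariance, purely for illustration.
F = np.array([[1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])
P = np.eye(4)
ped = target_prediction_covariance(P, F, q_mobility=0.1, residual_mean=0.2)
moto = target_prediction_covariance(P, F, q_mobility=1.0, residual_mean=1.4)
```

A high-mobility, unstable motorcycle thus gets a wider predicted movement range than a low-mobility, stable pedestrian.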

FIG. 11 is an explanatory diagram illustrating an example of a target motion prediction result.
The ellipses in the figure indicate the predicted movement ranges of the automobile, the two-wheeled vehicle, and the pedestrian from time t1 to time t4; each is centered on the future predicted position of the target and has a spread corresponding to the standard deviation of the prediction error covariance matrix.

In step ST7, "steerability determination", the collision probability evaluation unit 33 determines, according to the following equation (34), whether there is any possibility that the predicted movement ranges of the own vehicle and of a target overlap before the own vehicle reaches its destination.
If there is a possibility of a collision, it is determined that a right turn is not possible.


The left-hand side is the probability that the vehicle and the target occupy the same position; as shown in the following equation (35), it is the integral over the entire position space of the simultaneous existence probability density of the vehicle and the target at a given position.

This is approximated by numerical calculation.
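Equations (34) and (35) are not reproduced in this text. As an illustrative aid (not the patent's disclosure), when both predicted positions are Gaussian the integral of the product of the two position densities has the closed form N(mu_own - mu_tgt; 0, P_own + P_tgt), which the sketch below evaluates directly in place of a numerical grid; the threshold value and all names are assumptions.

```python
import numpy as np

def collision_probability(mu_own, P_own, mu_tgt, P_tgt):
    """Simultaneous existence density: integral over position space of the
    product of the two Gaussian position densities (cf. eq. (35)),
    evaluated via its closed form for Gaussians."""
    d = np.asarray(mu_own) - np.asarray(mu_tgt)
    S = np.asarray(P_own) + np.asarray(P_tgt)
    return float(np.exp(-0.5 * d @ np.linalg.inv(S) @ d)
                 / (2 * np.pi * np.sqrt(np.linalg.det(S))))

def right_turn_allowed(pred_own, pred_tgt, threshold=1e-3):
    """'Right turn impossible' if at any shared sampling time the
    simultaneous existence density exceeds the threshold (cf. eq. (34);
    threshold value is an assumption)."""
    for (mo, Po), (mt, Pt) in zip(pred_own, pred_tgt):
        if collision_probability(mo, Po, mt, Pt) > threshold:
            return False
    return True

# Predicted (position, covariance) pairs at two shared sampling times.
pred_own = [((0., 0.), np.eye(2)), ((2., -2.), 2 * np.eye(2))]
pred_tgt_near = [((0.5, 0.), np.eye(2)), ((2., -2.5), 2 * np.eye(2))]
pred_tgt_far = [((100., 100.), np.eye(2)), ((100., 100.), 2 * np.eye(2))]
print(right_turn_allowed(pred_own, pred_tgt_near),
      right_turn_allowed(pred_own, pred_tgt_far))
```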

  In step ST8, "operation is possible?", if it is determined that a right turn is possible, a message is displayed to the user (driver) 4 indicating that it is safe to make the right turn at the present time.

If it is determined that a right turn is not possible, the series of determination processes is performed again after step ST9, "a certain time has elapsed".
This is repeated until it is determined that a right turn is possible.

As described above, according to the first embodiment, the steering availability determination unit 3 comprises: the own vehicle motion prediction unit 31 that calculates the future predicted movement range of the own vehicle when it is steered; the target motion prediction unit 32 that uses the target tracks and target types stored in the tracking track/identification result 24 to extract targets that may collide with the vehicle during steering and calculates the future predicted movement range of each extracted target over the steering period; and the collision probability evaluation unit 33 that detects the possibility of a collision using the predicted movement range of the own vehicle calculated by unit 31 and the predicted movement ranges of the targets calculated by unit 32, and determines whether or not steering is possible.
Thus, it is possible to determine whether or not steering is possible before steering the host vehicle.

According to the first embodiment, the target motion prediction unit 32 calculates the target predicted movement range using the first prediction error covariance matrix.
Therefore, the target predicted movement range can be widened by the standard deviation of the first prediction error covariance matrix, and can thus be calculated within a range that is reasonable for avoiding the risk of collision.

According to the first embodiment, the target identification unit 25 determines the instability of the target motion using the target track estimated by the target tracking unit 23, and the target motion prediction unit 32 determines the first prediction error covariance. The matrix is calculated using a first future motion uncertainty matrix based on the instability of the target motion determined by the target identification unit 25.
Therefore, the target predicted movement range can be calculated in consideration of the instability of the target motion, and it is possible to make a more reliable determination as to whether or not steering is possible.

According to the first embodiment, the target motion prediction unit 32 calculates the first prediction error covariance matrix using a second future motion uncertainty matrix based on the mobility of the target motion, which is computed using the target type identified by the target identification unit 25.
Therefore, the target predicted movement range can be calculated in consideration of the mobility of the target motion, and it is possible to make a more reliable determination as to whether or not steering is possible.

According to the first embodiment, the host vehicle motion prediction unit 31 calculates the host vehicle predicted travel range using the steering motion model and the second prediction error covariance matrix.
Therefore, the own vehicle predicted movement range can be given a spread, corresponding to the standard deviation of the second prediction error covariance matrix, centered on the future position predicted by the steering motion model, and can thus be calculated within a range that is reasonable for avoiding the risk of collision.

According to the first embodiment, the collision probability evaluation unit 33 determines whether or not steering is possible based on the simultaneous existence probability density calculated using the own vehicle predicted movement range and the target predicted movement range.
Therefore, it is possible to make a more reliable determination as to whether or not steering is possible.

According to the first embodiment, the target identification unit 25 determines, from the optical image information from the sensor group 1, whether there is any possibility of a collision between a target and the host vehicle, and the target motion prediction unit 32 excludes targets that the target identification unit 25 has determined cannot collide with the vehicle from the calculation of target predicted movement ranges.
Therefore, useless calculation objects can be reduced and calculation efficiency can be increased.

  In the present invention, within its scope, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.

  DESCRIPTION OF SYMBOLS: 1 sensor group, 2 target recognition unit, 3 steering availability determination unit, 4 user, 21 target detection unit, 22 accumulated observation values, 23 target tracking unit, 24 tracking track/identification result, 25 target identification unit, 31 own vehicle motion prediction unit, 32 target motion prediction unit, 33 collision probability evaluation unit.

Claims (9)

  1. A target detection unit that extracts an observed value of the target position from a signal from the sensor;
    A target tracking unit that estimates a target track including a target position and a target speed by tracking a time series of observation values extracted by the target detection unit;
    A target identification unit for identifying a target type using the target track estimated by the target tracking unit;
    A vehicle motion prediction unit that calculates a future vehicle predicted movement range when the vehicle is steered;
    a target motion prediction unit that uses the target track estimated by the target tracking unit and the target type identified by the target identification unit to extract targets that may collide with the host vehicle due to its steering, and calculates a future predicted movement range for each extracted target during the steering of the host vehicle; and
    A collision probability evaluation unit that detects the possibility of collision using the vehicle predicted movement range calculated by the vehicle motion prediction unit and the target predicted movement range calculated by the target motion prediction unit, and determines whether steering is possible; A vehicle driving support apparatus provided.
  2. The target motion prediction unit
    The vehicle driving support apparatus according to claim 1, wherein the target predicted movement range is calculated by the first prediction error covariance.
  3. The target identification unit
    Using the target track estimated by the target tracking unit, determine the instability of the target motion,
    The first prediction error covariance is
    calculated by a first future motion uncertainty based on the instability of the target motion determined by the target identification unit. The vehicle driving support device according to claim 2.
  4. The first prediction error covariance is
    a first estimated error covariance of the target motion state at the current time, calculated by extrapolating the smoothed error covariance that the target tracking unit computes when estimating the target track. The vehicle driving support device according to claim 2.
  5. The first prediction error covariance is
    calculated by a second future motion uncertainty based on the mobility of the target motion, which is computed using the target type identified by the target identification unit. The vehicle driving support device according to claim 2.
  6. The own vehicle motion prediction unit
    The vehicle driving support device according to claim 1, wherein the own vehicle predicted movement range is calculated by a steering motion model and a second prediction error covariance.
  7. The second prediction error covariance is
    a second estimated error covariance of the own vehicle motion state at the current time, calculated by extrapolating the observation error of the vehicle motion observed by a vehicle sensor or GPS. The vehicle driving support device according to claim 6.
  8. The collision probability evaluation unit
    The vehicle driving support apparatus according to claim 1, wherein whether or not steering is possible is determined from a simultaneous existence probability calculated using the predicted movement range of the host vehicle and the predicted movement range of the target.
  9. The vehicle driving support device according to claim 1, wherein the target identification unit
    determines the possibility of a collision between the target and the own vehicle based on optical image information from the sensor, and
    the target motion prediction unit
    excludes any target that the target identification unit has determined to have no possibility of collision with the own vehicle from the targets for which a predicted movement range is calculated.
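As an illustrative sketch only (not part of the claims), the covariance extrapolation described in claim 4 — propagating a smoothing error covariance forward to the current time — can be expressed with a standard Kalman-filter prediction step. The constant-velocity transition model `F` and process-noise matrix `Q` below are assumptions for illustration; the patent does not specify the motion model.

```python
import numpy as np

def extrapolate_covariance(P_smooth, dt, q=0.5):
    """Propagate a 2x2 [position, velocity] error covariance forward by dt."""
    # Constant-velocity state transition (assumed model).
    F = np.array([[1.0, dt],
                  [0.0, 1.0]])
    # White-noise-acceleration process noise with intensity q (assumed).
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])
    # Standard prediction step: P_pred = F P F^T + Q.
    return F @ P_smooth @ F.T + Q

# Smoothing error covariance from the tracker, extrapolated by 0.1 s.
P_smooth = np.diag([1.0, 0.25])
P_pred = extrapolate_covariance(P_smooth, dt=0.1)
```

The predicted covariance is always at least as large as the smoothed one, reflecting the growing uncertainty of the target's future motion.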
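For the "steering motion model" of claim 6, one common choice is a kinematic bicycle model; the patent does not name a specific model, so the following sketch and all its parameters (wheelbase, step size) are assumptions for illustration.

```python
import math

def predict_pose(x, y, heading, speed, steer_angle, wheelbase, dt, steps):
    """Integrate a kinematic bicycle model forward; returns (x, y, heading)."""
    for _ in range(steps):
        x += speed * math.cos(heading) * dt          # forward motion
        y += speed * math.sin(heading) * dt
        # Yaw rate from steering angle and wheelbase.
        heading += speed / wheelbase * math.tan(steer_angle) * dt
    return x, y, heading

# Own vehicle at the origin heading east, 10 m/s, slight left steer, 2 s horizon.
x, y, h = predict_pose(0.0, 0.0, 0.0, 10.0, 0.05, 2.7, 0.1, 20)
```

Sampling such predictions over a range of feasible steering angles, and inflating each by the second prediction error covariance, would yield a predicted movement range of the kind the claim describes.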
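The "simultaneous existence probability" of claim 8 can be understood as the probability that the own vehicle and the target occupy the same region at the same time. A minimal sketch, assuming both predicted positions are Gaussian and using Monte Carlo sampling with an assumed collision radius (the patent does not specify the computation):

```python
import math
import random

def collision_probability(own_mean, own_std, tgt_mean, tgt_std,
                          radius=2.0, n=20000, seed=0):
    """Estimate P(|own - target| < radius) for two 2-D Gaussian positions."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        ox = rng.gauss(own_mean[0], own_std)
        oy = rng.gauss(own_mean[1], own_std)
        tx = rng.gauss(tgt_mean[0], tgt_std)
        ty = rng.gauss(tgt_mean[1], tgt_std)
        if math.hypot(ox - tx, oy - ty) < radius:
            hits += 1
    return hits / n

p_near = collision_probability((0, 0), 1.0, (1, 0), 1.0)   # overlapping ranges
p_far = collision_probability((0, 0), 1.0, (50, 0), 1.0)   # well separated
```

A steering maneuver would then be judged possible when this probability stays below a threshold along the candidate trajectory.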
JP2013122865A 2013-06-11 2013-06-11 Vehicle driving assist system Pending JP2014241036A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013122865A JP2014241036A (en) 2013-06-11 2013-06-11 Vehicle driving assist system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013122865A JP2014241036A (en) 2013-06-11 2013-06-11 Vehicle driving assist system

Publications (1)

Publication Number Publication Date
JP2014241036A true JP2014241036A (en) 2014-12-25

Family

ID=52140253

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013122865A Pending JP2014241036A (en) 2013-06-11 2013-06-11 Vehicle driving assist system

Country Status (1)

Country Link
JP (1) JP2014241036A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016162270A (en) * 2015-03-03 2016-09-05 三菱電機株式会社 Proximity detection device and proximity detection method
US10504370B2 (en) 2015-04-02 2019-12-10 Denso Corporation Collision avoidance apparatus, collision avoidance system, and driving support method
WO2017002441A1 (en) * 2015-07-02 2017-01-05 三菱電機株式会社 Route prediction device
WO2017002258A1 (en) * 2015-07-02 2017-01-05 三菱電機株式会社 Route prediction device
JPWO2017002441A1 (en) * 2015-07-02 2017-09-14 三菱電機株式会社 Route prediction device

Similar Documents

Publication Publication Date Title
Mukhtar et al. Vehicle detection techniques for collision avoidance systems: A review
US10407060B2 (en) Driver assistance apparatus and method for operating the same
JP6017044B2 (en) Driver assist system and method of operating driver assist system
US9583003B2 (en) Vehicle danger notification control apparatus
Keller et al. Active pedestrian safety by automatic braking and evasive steering
US20170032675A1 (en) Vehicular environment estimation device
CN104118382B (en) Collision determines that equipment, collision mitigate equipment and collision determination method
EP2463843B1 (en) Method and system for forward collision warning
US9889858B2 (en) Confidence estimation for predictive driver assistance systems based on plausibility rules
DE102013113619A1 (en) Probabilistic target selection and hazard assessment procedures and application to an intersection collision warning system
Nguyen et al. Stereo-camera-based urban environment perception using occupancy grid and object tracking
US10007854B2 (en) Computer vision based driver assistance devices, systems, methods and associated computer executable code
EP2549456B1 (en) Driving assistance device
Gandhi et al. Pedestrian protection systems: Issues, survey, and challenges
EP3101641A1 (en) Collision avoidance assistance device for a vehicle
Polychronopoulos et al. Sensor fusion for predicting vehicles' path for collision avoidance systems
US7436982B2 (en) Vehicle surroundings monitoring apparatus
US8949018B2 (en) Driving assistance device and driving assistance method
DE102018101125A1 (en) Recurrent deep neuronal convolution network for the detection of objects
CN102765365B (en) Pedestrian detection method based on machine vision and pedestrian anti-collision warning system based on machine vision
JP3846494B2 (en) Moving obstacle detection device
JP5297078B2 (en) Method for detecting moving object in blind spot of vehicle, and blind spot detection device
DE102004035842B4 (en) Dual disparate sensing object detection and detection system
US10246030B2 (en) Object detection apparatus and driving assistance apparatus
Schulz et al. Pedestrian intention recognition using latent-dynamic conditional random fields