WO2011125135A1 - Collision prevention support device

Collision prevention support device

Info

Publication number
WO2011125135A1
WO2011125135A1 (PCT/JP2010/002620)
Authority
WO
WIPO (PCT)
Prior art keywords
device
moving object
vehicle
position
image
Prior art date
Application number
PCT/JP2010/002620
Other languages
French (fr)
Japanese (ja)
Inventor
田崎豪
森屋彰久
佐々木隆
金野晃
Original Assignee
Kabushiki Kaisha Toshiba
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kabushiki Kaisha Toshiba
Priority to PCT/JP2010/002620 priority Critical patent/WO2011125135A1/en
Publication of WO2011125135A1 publication Critical patent/WO2011125135A1/en

Links

Images

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/16Type of information
    • B60K2370/179Distances to obstacles or vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • B60W30/0956Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0097Predicting future conditions

Abstract

Disclosed is a collision prevention support device including a display device that is installed on a first moving body and superimposes an image on the actual scene; an information acquisition device that measures the position and speed of a second moving body existing in the vicinity of the first moving body; and a size measuring device that measures the size of the second moving body. Further included is a path position predicting device comprising a first predicting device that predicts the position and path of the first moving body after a set time, and a second predicting device that predicts the position and path of the second moving body after the set time. An image creation device creates a vehicle image of the first moving body at the predicted position of the first moving body, creates a vehicle image of the second moving body, sized according to the output of the size measuring device, at the predicted position of the second moving body, and outputs these images to the display device. This makes it possible to display a safe position of the first moving body after the set time.

Description

Collision avoidance support device

The present invention relates to a collision avoidance support device.

For a person driving a moving object such as a car (hereinafter, the own vehicle), it is important for safe travel to detect the moving objects around the own vehicle (hereinafter, other vehicles) and to understand their predicted positions. Japanese Patent Application Laid-Open Nos. 2007-323178 and 2007-27350 therefore disclose techniques in which the predicted position of a surrounding other vehicle after a predetermined time is indicated by a flat mark on a display device (head-up display) mounted on the vehicle.

However, when the future predicted position of a surrounding other vehicle is displayed as a flat mark, there is a problem that the driver cannot instantaneously grasp the distance relationship between the predicted position of the own vehicle and the predicted position of the surrounding other vehicle. In addition, the flat mark displayed on the head-up display may itself become an obstacle to the driver.

JP 2007-323178 A; JP 2007-27350 A

Therefore, a display device that allows the driver to instantaneously grasp the positional relationship between the host vehicle and surrounding other vehicles several seconds ahead is desired. It is also desirable that the image displayed on the head-up display not become an obstacle to the driver.

An object of the present invention is to provide a collision avoidance support device that supports avoidance behavior with respect to moving objects that may collide.

In order to achieve the above object, a collision avoidance assistance device according to the present invention comprises: a display device that is mounted on a first moving object and superimposes an image on a real scene; an information acquisition device that measures the position and velocity of a second moving object existing around the first moving object; a size measuring device that measures the size of the second moving object; a path position prediction device having a first prediction device that predicts the position and path of the first moving object after a set time, and a second prediction device that predicts the position and path of the second moving object after the set time; and an image creating device that creates a vehicle image of the first moving object at the predicted position of the first moving object, creates a vehicle image of the second moving object, obtained from the size measuring device, at the predicted position of the second moving object, and outputs these images to the display device.

According to the present invention, the positional relationship with a surrounding moving object, even one entering a blind spot, can be grasped intuitively. In addition, since the positional relationship with surrounding moving objects is known concretely, the driver's safety can be enhanced.

FIG. 1 is a block diagram showing the configuration of a collision avoidance assistance device according to an embodiment of the present invention. FIG. 2 is a view showing the operation of the size measuring device of the embodiment. FIG. 3 is a view showing the principle of size measurement by the size measuring device of the embodiment. FIG. 4 is a flowchart showing the operation of the embodiment. FIG. 5 is an example of the image, seen from the driver's seat, displayed on the head-up display of the embodiment. FIG. 6 is an example of the predicted image after the set time when the own vehicle of the embodiment attempts to change lanes.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.

(Example)
FIG. 1 is a block diagram showing the configuration of a collision avoidance assistance device according to an embodiment of the present invention. In the following embodiment, the description assumes that the moving object is a car. The present invention is applicable not only to automobiles but also to other maneuverable moving objects, such as vessels and remotely operated probes.

The collision avoidance assistance device 100 shown in FIG. 1 is mounted on a moving object (own vehicle) driven by a driver. The driver can turn the collision avoidance assistance device 100 on and off as desired.

The collision avoidance support device 100 includes an information acquisition device 110 that acquires information on moving objects (other vehicles) existing around the host vehicle, a size measurement device 120 that measures the sizes of the other vehicles, a route position prediction device 130 that predicts the positions of the host vehicle and the other vehicles several seconds to several tens of seconds in the future, an image creation device 140 that generates predicted images of the host vehicle and the other vehicles several seconds to several tens of seconds ahead, and a display device 150 that displays the predicted image superimposed on the view through the windshield of the host vehicle.

The information acquisition device 110 includes a distance measurement device such as a laser sensor 111 and an inter-vehicle communication device 112. Although a laser sensor 111 is used here, a millimeter-wave sensor may be used instead. When the information acquisition device 110 detects another vehicle in the traveling direction of the own vehicle using the laser sensor 111, it measures the speed of the surrounding other vehicle and its position relative to the own vehicle from the signal detected by the laser sensor 111. In addition, the inter-vehicle communication device 112 can communicate wirelessly with other vehicles and obtain vehicle information such as their speed and size.

The size measuring device 120 has a distance sensor 121 and a vehicle data file 122. The vehicle data file 122 stores data indicating the sizes of various commercially available vehicles. The size measuring device 120 detects the size of another vehicle existing around the host vehicle from the response to the signal emitted by the distance sensor 121. The distance sensor 121 may be shared with the laser sensor 111 (or millimeter-wave sensor) of the information acquisition device 110.

FIG. 2 is a diagram showing the operation of the size measuring device 120. For example, the size measuring device 120 measures the distance between the own vehicle and the surrounding other vehicles using distance sensors 121a and 121b provided on the left and right of the own vehicle. As shown in FIG. 2, the distance sensors 121a and 121b emit detection signals toward the other vehicles present in the surroundings. The distance sensors 121a and 121b then detect the signals reflected from the other vehicles and acquire point sequence data indicating the size of the other vehicles.

Next, as shown in FIG. 3, the size measuring apparatus 120 clusters the acquired point sequence data and detects one or more line segments 124 and 125 from each cluster by the least squares method. Another method, such as the Hough transform, may be used to detect the line segments 124 and 125.

At this time, as shown on the right side of FIG. 3, when two line segments 124 and 125 can be acquired (in the length direction and the width direction), the sizes of the other vehicle in the length direction and the width direction are estimated from the end points of the line segments 124 and 125.

Further, as shown on the left side of FIG. 3, when only one line segment 124 (for example, in the length direction) can be obtained, the other line segment (for example, in the width direction) is estimated from the length of the line segment 124 and the vehicle data recorded in advance in the vehicle data file 122. In this way, the sizes of the surrounding other vehicle in the length direction and the width direction are estimated. Alternatively, the estimated size (length and width) may be compared with the vehicle data in the vehicle data file 122, and the size of the closest matching vehicle may be used as the output.
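
As a minimal sketch of this size estimation step, the following Python fragment clusters the point sequence by gaps between consecutive points, fits a line segment to each cluster by a least-squares (principal-direction) fit, and falls back to a vehicle-data lookup when only one side is visible. The gap threshold and the entries of the vehicle data table are illustrative assumptions, not values from the publication.

    import numpy as np

    # Hypothetical stand-in for the vehicle data file 122: (length, width) in metres.
    VEHICLE_DATA = {"compact": (4.0, 1.7), "sedan": (4.8, 1.8), "truck": (8.0, 2.5)}

    def fit_segment(points):
        """Least-squares line segment through 2-D points; returns its two end points."""
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)    # principal direction of the cluster
        direction = vt[0]
        t = (points - centroid) @ direction            # scalar position along the line
        return centroid + t.min() * direction, centroid + t.max() * direction

    def estimate_size(points, gap=0.5):
        """Cluster the point sequence by gaps and estimate (length, width) of the vehicle."""
        d = np.linalg.norm(np.diff(points, axis=0), axis=1)
        clusters = np.split(points, np.where(d > gap)[0] + 1)
        lengths = sorted(
            (np.linalg.norm(np.subtract(*fit_segment(c))) for c in clusters if len(c) >= 2),
            reverse=True)
        if not lengths:
            return None
        if len(lengths) >= 2:                          # both sides visible: length and width
            return lengths[0], lengths[1]
        # Only one side visible: use the closest vehicle-data entry, as in FIG. 3 (left).
        _, (length, width) = min(VEHICLE_DATA.items(), key=lambda kv: abs(kv[1][0] - lengths[0]))
        return length, width

    # Example: points along one visible side of a vehicle roughly 4.8 m long.
    pts = np.array([[20.0 + 0.3 * i, 5.0] for i in range(17)])
    print(estimate_size(pts))                          # the width comes from the lookup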

Furthermore, when the distance sensor 121 is arranged so that distances in the vertical direction can also be measured, the highest point at the vehicle position among the obtained point data is reflected in the image as the height of the moving object. If height data cannot be obtained, the height may be estimated by comparing the obtained planar size of the moving object with the vehicle data in the vehicle data file 122, or the height may be set to a predetermined constant value. Similarly, the estimated size (height) may be compared with the vehicle data in the vehicle data file 122, and the height of the closest matching vehicle may be used as the output.

The route position prediction device 130 has a future position calculation unit 131 and a host vehicle information generation unit 132. The future position calculation unit 131 receives the relative position information and speed information of the surrounding other vehicles output from the information acquisition device 110, and the size information of the surrounding other vehicles output from the size measurement device 120. The future position calculation unit 131 predicts the route of each surrounding other vehicle over the set time (several seconds to several tens of seconds) and calculates its position after that time. The future position calculation unit 131 outputs the calculated future position information and the predicted route to the image creation device 140.
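
A minimal sketch of such a future position calculation, assuming a constant-velocity model in the host vehicle's frame; the sampling step and the Track fields are illustrative choices, not taken from the publication.

    from dataclasses import dataclass

    @dataclass
    class Track:
        x: float   # relative position of the other vehicle [m], forward of the host
        y: float   # relative position [m], to the left of the host
        vx: float  # relative velocity [m/s], forward
        vy: float  # relative velocity [m/s], left

    def predict_route(track: Track, horizon: float, step: float = 0.5):
        """Constant-velocity prediction of the other vehicle's route and position
        over the set time `horizon`, sampled every `step` seconds."""
        route = []
        t = step
        while t <= horizon + 1e-9:
            route.append((track.x + track.vx * t, track.y + track.vy * t))
            t += step
        return route                       # the last element is the predicted position

    # Example: a vehicle 20 m ahead, closing at 2 m/s, over a 5-second horizon.
    print(predict_route(Track(20.0, 0.0, -2.0, 0.0), 5.0)[-1])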

Meanwhile, the host vehicle information generation unit 132 has a speedometer 133, a steering angle sensor 134 for the steering wheel, a GPS 135 for detecting the position of the host vehicle, a map information file 136 that stores map information, a turn signal 137, and the like. The host vehicle information generation unit 132 predicts the route and position of the host vehicle after the set time (several seconds to several tens of seconds) using the speedometer 133 and the steering angle sensor 134. For example, when the turn signal 137 indicating a change of travel path is input, the host vehicle information generation unit 132 predicts the route and position of the own vehicle several seconds to several tens of seconds after the travel path is changed. The host vehicle information generation unit 132 can then output display information indicating whether safe driving is possible, based on the predicted position of the host vehicle and the predicted positions of the other vehicles.

Specifically, the host vehicle information generation unit 132 acquires the map information for the road currently being traveled from the map information file 136, and the candidate routes are calculated restricted to the travelable areas shown on the map. The host vehicle information generation unit 132 performs the route and position prediction with higher accuracy based on the speed information from the speedometer 133 and the position information obtained from the GPS 135.
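
One way to predict the host vehicle's route from the speedometer and steering angle is a kinematic bicycle model, sketched below; the wheelbase value and the assumption that speed and steering angle stay constant over the horizon are illustrative simplifications, not specified in the publication.

    import math

    def predict_host_route(speed, steering_angle, horizon, wheelbase=2.7, step=0.1):
        """Integrate a kinematic bicycle model for `horizon` seconds, assuming the
        speed [m/s] and steering angle [rad] stay constant.  Returns the route as
        (x, y, heading) tuples in the host vehicle's frame at time zero."""
        x = y = heading = 0.0
        route = []
        t = 0.0
        while t < horizon:
            x += speed * math.cos(heading) * step
            y += speed * math.sin(heading) * step
            heading += speed / wheelbase * math.tan(steering_angle) * step
            route.append((x, y, heading))
            t += step
        return route

    # Example: 25 m/s with a slight left steer over a 5-second horizon.
    print(predict_host_route(25.0, 0.02, 5.0)[-1])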

The prediction time may initially be set so as to decrease monotonically, between an upper limit and a lower limit, as the moving speed of the host vehicle increases. For example, with velocity v, upper limit time T, and lower limit time t, the initial time I(v) can be expressed by the following equation, where a is a positive constant.

[Equation 1]

Any I(v) other than the above equation may be used as long as it is a monotonically decreasing function with upper and lower limits. The initial time may also be a fixed value.
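
Equation 1 is reproduced only as an image in the published document, so the sketch below uses one assumed function that satisfies the stated properties, namely I(v) = t + (T - t) * exp(-a * v), which decreases monotonically from T toward t as the speed v grows; the functional form and the parameter values are assumptions for illustration only.

    import math

    def initial_time(v, T=10.0, t=3.0, a=0.05):
        """One monotonically decreasing choice of I(v), bounded above by T and
        below by t; the exponential form and the parameters are assumed values."""
        return t + (T - t) * math.exp(-a * v)

    # At standstill the horizon is near T; at high speed it approaches t.
    print(round(initial_time(0.0), 1), round(initial_time(30.0), 1))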

After the initial time is set, the prediction time can be changed by input from a fine adjustment button (not shown). Instead of the button, the fine adjustment of the prediction time may use an input method that can be operated easily while driving, such as voice recognition or gesture recognition.

The position prediction information of the host vehicle obtained by the host vehicle information generation unit 132 is thus output to the image creation device 140. When no prior information such as a map is available, the future position and route may instead be predicted by linearly extrapolating the position and the velocity.

The image creation device 140 has an image creation unit 141 and a vehicle image file 142. The vehicle image file 142 stores image data of various commercially available vehicles. The vehicle image data referred to here represents the external appearance (sized images) of various vehicles, including the own vehicle.

The image creation unit 141 receives the predicted position of the host vehicle and the predicted positions of the surrounding other vehicles output from the route position prediction device 130, and creates an image for display on the display device 150 using the vehicle image data from the vehicle image file 142. The display device 150 is configured as a head-up display that can superimpose the created image on the real view through the windshield of the host vehicle.

Next, the operation of the collision avoidance assistance device according to the embodiment will be described. FIG. 4 is a flowchart showing the operation of the collision avoidance assistance device.

The user activates the collision avoidance assistance device 100 while traveling on the road (step S200). For example, the collision avoidance assistance device 100 may be designed to operate in conjunction with a turn signal switch (not shown). That is, when changing lanes on the road, the driver is obliged to turn on the turn signal switch at least 5 seconds beforehand to notify the surrounding other vehicles of the intention to change lanes. The collision avoidance assistance device 100 is automatically activated in conjunction with the operation of the turn signal switch and draws the predicted image after the set time on the head-up display. As a result, the display of the collision avoidance assistance device 100 does not adversely affect the driving operation. Alternatively, a switch separate from the turn signal switch may be provided, and the display may be turned on and off by that switch. For moving objects other than automobiles, the device can likewise be interlocked with a switch corresponding to the turn signal.

When the collision avoidance assistance device 100 is activated, the information acquisition device 110 acquires information on the other vehicles existing around the host vehicle, and the size measuring device 120 measures the sizes of the surrounding other vehicles (step S210).

The future position calculation unit 131 of the route position prediction device 130 acquires the speed information and relative position information of the surrounding other vehicles output from the information acquisition device 110, as well as the size information of the surrounding other vehicles output from the size measurement device 120. From the speed information and relative position information, the future position calculation unit 131 then predicts the position of each surrounding other vehicle after the set time (for example, after 5 seconds) and the route to that position (step S220).

The host vehicle information generation unit 132 of the route position prediction device 130 uses the signals from the speedometer 133 and the steering angle sensor 134 to predict the route and position of the host vehicle after the set time (for example, after 5 seconds) (step S230). In this prediction of the host vehicle's future position, the position on the map after the set time, determined from the position information obtained from the GPS 135 and the map information from the map information file 136, is taken into account.

The image creation unit 141 of the image creation device 140 acquires the size information of the surrounding other vehicles output from the route position prediction device 130, the predicted position information and route information of the surrounding other vehicles after the set time output from the future position calculation unit 131, and the predicted position information and route information of the own vehicle after the set time output from the host vehicle information generation unit 132. The image creation unit 141 then reads a vehicle image corresponding to the size information of each other vehicle from the vehicle image file 142 (step S240).

If it is determined from the predicted position information of the own vehicle and the predicted position information of the other vehicle that the read vehicle image can be used as it is (Yes in step S250), the image creation unit 141 generates the vehicle image of the other vehicle according to the predicted distance, together with the predicted route, and outputs them to the display device 150 (step S260).

On the other hand, if it is determined from the predicted position information of the own vehicle and the predicted position information of the other vehicle that the read vehicle image cannot be used as it is (No in step S250), the image creation unit 141 processes the read vehicle image. Specifically, when the predicted positions show that the two vehicles are far apart, the vehicle image may not be usable as it is. In such a case, the read vehicle image (for example, the external view) is reshaped according to the viewing angle obtained from the distance, and a new vehicle image is created according to the predicted distance and the angle at which the vehicle will be seen after the set time. The image creation unit 141 outputs the created vehicle image of the other vehicle and the predicted route to the display device 150 (step S270).
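
A minimal sketch of this scaling step, using a simple pinhole projection to derive the apparent on-display size and the viewing angle from the predicted relative position; the focal length and display scale are assumed values for illustration only.

    import math

    def apparent_size(width_m, height_m, distance_m, focal_length=0.05, px_per_m=20000):
        """Apparent size in display pixels of a vehicle at `distance_m`, using a
        pinhole projection; focal length and display scale are assumptions."""
        scale = focal_length / distance_m * px_per_m
        return width_m * scale, height_m * scale

    def viewing_angle(dx, dy):
        """Horizontal angle [rad] at which the other vehicle is seen, measured
        from the driver's forward direction."""
        return math.atan2(dy, dx)

    # Example: a 1.8 m x 1.5 m vehicle predicted 40 m ahead and 3 m to the left.
    w, h = apparent_size(1.8, 1.5, math.hypot(40.0, 3.0))
    print(round(w), "x", round(h), "px,",
          round(math.degrees(viewing_angle(40.0, 3.0)), 1), "deg")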

In addition, the image creation unit 141 reduces the rear-view image of the own vehicle read from the vehicle image file 142 according to the predicted position information after the set time. The image creation unit 141 outputs the created vehicle image of the own vehicle and the predicted route to the display device 150 (step S280).

On the head-up display of the display device 150, the predicted routes and the vehicle images of the own vehicle and the other vehicles after the set time are composited and displayed (step S290).

FIG. 5 shows an example of the image, seen from the driver's seat, displayed on the head-up display of the host vehicle. The predicted route 300 of the own vehicle and the predicted routes 310 and 320 of the other vehicles are drawn in different colors (or patterns) so that they are easy to distinguish. The image superimposed on the real scene also reflects the sizes of the other vehicles. On the head-up display, interference with driving can be reduced by displaying the created vehicle images semi-transparently.

The vehicle images 330, 340, and 350 superimposed on the real scene are simplified three-dimensional representations of the vehicles: each is drawn as a rectangular solid of the measured size, rendered so that its apparent size and viewing angle change according to its position relative to the host vehicle and the driver's viewpoint.

The luminance of the image displayed in FIG. 5 is set low enough that the real view remains visible. The luminance of the vehicle image 330 of the own vehicle and the luminances of the vehicle images 340 and 350 of the surrounding other vehicles are set to be different; different colors may be used instead of different luminances. FIG. 5 shows the case where the luminance of the vehicle image 330 of the own vehicle is higher than that of the vehicle images 340 and 350 of the other vehicles, but the luminance of the own vehicle may instead be lower than that of the other vehicles. The created image is continuously updated according to the driver's viewpoint position, the state of the own vehicle, and changes in the state of the other vehicles existing around the own vehicle.
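
As a concrete illustration of these display conventions, the sketch below assigns each drawn object an RGBA style, keeping every vehicle image semi-transparent and giving the own-vehicle image a higher luminance than the other-vehicle images; the specific color and alpha values are assumptions, not taken from the publication.

    # RGBA styles (red, green, blue, alpha); alpha < 1.0 keeps the images
    # semi-transparent so that the real view through the windshield stays visible.
    STYLES = {
        "own_route":     (0.2, 0.8, 0.2, 0.6),   # predicted route 300 of the own vehicle
        "other_route":   (0.9, 0.6, 0.1, 0.6),   # predicted routes 310 and 320
        "own_vehicle":   (0.9, 0.9, 0.9, 0.5),   # brighter own-vehicle image 330
        "other_vehicle": (0.5, 0.5, 0.5, 0.5),   # dimmer other-vehicle images 340, 350
    }

    def style_for(kind):
        """Return the RGBA style for a drawn object, falling back to a dim gray."""
        return STYLES.get(kind, (0.5, 0.5, 0.5, 0.5))

    print(style_for("own_vehicle"))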

FIG. 6 shows an example of the predicted image after the set time (for example, 5 seconds) of the own vehicle and of another vehicle traveling in the driving lane when, for example, the own vehicle attempts to change lanes from the overtaking lane to the driving lane. The predicted image of the lane change is displayed when the host vehicle information generation unit 132 receives the turn signal 137. FIG. 6A shows the case where the route position prediction device 130 determines, for example from the predicted position of the host vehicle after 5 seconds and the predicted position of the other vehicle traveling in the driving lane, that the inter-vehicle distance is sufficient, indicating that safe driving is possible. However, as shown in FIG. 6B, when the route position prediction device 130 determines that the inter-vehicle distance is insufficient, a mark 400 prompting collision avoidance is displayed. Instead of the mark 400, characters warning of the danger may be displayed, or a voice message urging safe driving may be output.
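
A minimal sketch of this sufficiency check, comparing the predicted longitudinal gap after the set time with a threshold; the threshold value and the one-dimensional (along-lane) simplification are assumptions for illustration only.

    def needs_avoidance_warning(host_pos, other_pos, other_length, min_gap=10.0):
        """Decide whether to show the collision-avoidance mark 400, based on the
        predicted along-lane positions [m] after the set time.  `min_gap` is an
        assumed threshold for a sufficient inter-vehicle distance."""
        gap = abs(other_pos - host_pos) - other_length
        return gap < min_gap

    # Example: host vehicle predicted at 120 m along the lane, other vehicle at 128 m.
    print(needs_avoidance_warning(120.0, 128.0, other_length=4.5))   # True: show mark 400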

According to the present invention, the predicted positions of the moving object driven by the driver and of the surrounding moving objects can be grasped concretely, as if looking at the actual landscape from the driver's viewpoint, so the positional relationship with the surrounding moving objects can be understood intuitively. Since the positional relationship with the surrounding moving objects is known concretely, safety during driving can be enhanced. Further, since each vehicle image is represented as a simple three-dimensional shape and displayed at low luminance, the information can be presented without disturbing driving and without being confused with an actual vehicle.

The present invention is not limited to the embodiment described above, and design changes may be made without departing from the scope of the invention.

100: collision avoidance support device, 110: information acquisition device, 111: laser sensor, 112: inter-vehicle communication device, 120: size measurement device, 121: distance sensor, 122: vehicle data file, 130: route position prediction device, 131: future position calculation unit, 132: own vehicle information generation unit, 133: speedometer, 134: steering angle sensor, 135: GPS, 136: map information file, 137: turn signal, 140: image creation device, 141: image creation unit, 142: vehicle image file, 150: display device

Claims (7)

  1. A collision avoidance assistance device comprising:
    a display device mounted on a first moving object and superimposing an image on a real scene;
    an information acquisition device for measuring the position and velocity of a second moving object existing around the first moving object;
    a size measuring device for measuring the size of the second moving object;
    a path position prediction device having a first prediction device that predicts the position and route of the first moving object after a set time, and a second prediction device that predicts the position and route of the second moving object after the set time; and
    an image creating device that creates a vehicle image of the first moving object at the predicted position of the first moving object, creates a vehicle image of the second moving object, obtained from the size measuring device, at the predicted position of the second moving object, and outputs these images to the display device.
  2. The collision avoidance assistance device according to claim 1, wherein the predicted path of the first moving object is displayed in a color different from the predicted path of the second moving object.
  3. The collision avoidance assistance device according to claim 1, wherein the display device displays the created vehicle image semi-transparently.
  4. The collision avoidance assistance device according to claim 1, wherein the image creating device displays the vehicle image of the first moving object and the vehicle image of the second moving object at different luminances.
  5. The collision avoidance assistance device according to claim 1, further comprising a movement direction designation device for designating the movement direction of the first moving object,
    wherein the image created by the image creation device is superimposed on the display device in accordance with an output signal from the movement direction designation device.
  6. The collision avoidance assistance device according to claim 1, wherein the path position prediction device initializes the future time to be displayed according to the velocity of the first moving object, and allows the future time to be finely adjusted.
  7. The collision avoidance support device wherein the path position prediction device performs an output prompting collision avoidance when it judges, from the predicted position of the first moving object and the predicted position of the second moving object, that there is a danger.
PCT/JP2010/002620 2010-04-09 2010-04-09 Collision prevention support device WO2011125135A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/002620 WO2011125135A1 (en) 2010-04-09 2010-04-09 Collision prevention support device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/002620 WO2011125135A1 (en) 2010-04-09 2010-04-09 Collision prevention support device

Publications (1)

Publication Number Publication Date
WO2011125135A1 true WO2011125135A1 (en) 2011-10-13

Family

ID=44762124

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/002620 WO2011125135A1 (en) 2010-04-09 2010-04-09 Collision prevention support device

Country Status (1)

Country Link
WO (1) WO2011125135A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007034988A (en) * 2005-07-29 2007-02-08 Nissan Motor Co Ltd Obstacle avoidance warning device for vehicle
JP2007153307A (en) * 2005-11-09 2007-06-21 Nissan Motor Co Ltd Vehicle driving assist device and vehicle equipped with the same device
JP2007233646A (en) * 2006-02-28 2007-09-13 Toyota Motor Corp Object track prediction method, device, and program
JP2008037167A (en) * 2006-08-02 2008-02-21 Mazda Motor Corp Vehicular information display device
JP2008070998A (en) * 2006-09-13 2008-03-27 Hitachi Ltd Vehicle surroundings information display unit

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8970453B2 (en) 2009-12-08 2015-03-03 Kabushiki Kaisha Toshiba Display apparatus, display method, and vehicle
US8928983B2 (en) 2012-01-31 2015-01-06 Kabushiki Kaisha Toshiba Display apparatus, moving body, and method for mounting display apparatus
WO2014198551A1 (en) * 2013-06-11 2014-12-18 Robert Bosch Gmbh Method for operating a display device, computer program product, and display device
CN105264547A (en) * 2013-06-11 2016-01-20 罗伯特·博世有限公司 Method for operating a display device, computer program product, and display device
JP2015094683A (en) * 2013-11-13 2015-05-18 日産自動車株式会社 Travel guide device for vehicle, and travel guide method for vehicle
EP3095660A3 (en) * 2015-05-19 2016-11-30 Ford Global Technologies, LLC A method and system for increasing driver awareness by modifying the frequency of a visual system
CN106166989A (en) * 2015-05-19 2016-11-30 福特全球技术公司 A kind of method and system improving driver alertness
EP3154041A1 (en) * 2015-10-07 2017-04-12 LG Electronics Inc. Vehicle surround monitoring device
US10479274B2 (en) 2015-10-07 2019-11-19 Lg Electronics Inc. Vehicle and control method for the same
JP2017167974A (en) * 2016-03-17 2017-09-21 株式会社東芝 Estimation apparatus, method and program
WO2018088224A1 (en) * 2016-11-09 2018-05-17 ソニー株式会社 Information processing device, information processing method, program, and moving body
WO2018180756A1 (en) * 2017-03-31 2018-10-04 アイシン・エィ・ダブリュ株式会社 Drive assistance system
CN108609016A (en) * 2018-03-12 2018-10-02 上海伊控动力系统有限公司 A kind of Vehicular intelligent speed limiting system and method based on vehicle net

Similar Documents

Publication Publication Date Title
JP6346614B2 (en) Information display system
US9723243B2 (en) User interface method for terminal for vehicle and apparatus thereof
US10293748B2 (en) Information presentation system
EP2981077B1 (en) Periphery monitoring device and program
US8831867B2 (en) Device and method for driver assistance
JP6397934B2 (en) Travel control device
US9283963B2 (en) Method for operating a driver assist system of an automobile providing a recommendation relating to a passing maneuver, and an automobile
JP5316713B2 (en) Lane departure prevention support apparatus, lane departure prevention method, and storage medium
US9248796B2 (en) Visually-distracted-driving detection device
JP5160564B2 (en) Vehicle information display device
DE102008046038B4 (en) Display device and method for displaying an image
JP5492962B2 (en) Gaze guidance system
DE112014003145T5 (en) Head-up ad and program
JP2016506572A (en) Infotainment system
JP4893945B2 (en) Vehicle periphery monitoring device
US9880253B2 (en) Vehicle object monitoring system
EP1564703B1 (en) Vehicle driving assist system
JP2016132374A (en) Collision avoidance control system
JP4847178B2 (en) Vehicle driving support device
JP4075743B2 (en) Vehicle travel support device
US6411898B2 (en) Navigation device
US9586525B2 (en) Camera-assisted blind spot detection
JP5070809B2 (en) Driving support device, driving support method, and program
EP2080668B1 (en) Driving assist device, method and computer program product for a vehicle
EP2988098B1 (en) Driver assistance system with non-static symbol of fluctuating shape

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10849381

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10849381

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: JP