WO2015166558A1 - 周辺監視装置、周辺監視システムおよび周辺監視方法 - Google Patents
周辺監視装置、周辺監視システムおよび周辺監視方法 Download PDFInfo
- Publication number
- WO2015166558A1 (PCT/JP2014/061996)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- driver
- information
- notification
- visual field
- Prior art date
Links
- 238000012806 monitoring device Methods 0.000 title claims abstract description 59
- 238000000034 method Methods 0.000 title claims description 25
- 238000012544 monitoring process Methods 0.000 title claims description 17
- 230000002093 peripheral effect Effects 0.000 claims abstract description 27
- 230000000007 visual effect Effects 0.000 claims description 134
- 238000004891 communication Methods 0.000 claims description 40
- 239000002131 composite material Substances 0.000 claims description 6
- 238000010586 diagram Methods 0.000 description 11
- 238000012545 processing Methods 0.000 description 10
- 230000000694 effects Effects 0.000 description 4
- 230000002194 synthesizing effect Effects 0.000 description 4
- 230000004397 blinking Effects 0.000 description 2
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/123—Traffic control systems for road vehicles indicating the position of vehicles, e.g. scheduled vehicles; Managing passenger vehicles circulating according to a fixed timetable, e.g. buses, trains, trams
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/40—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components
- B60R2300/406—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the details of the power supply or the coupling to vehicle components using wireless transmission
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/50—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the display information being shared, e.g. external display, data transfer to other traffic participants or centralised traffic controller
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/802—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for monitoring and displaying vehicle exterior blind spot views
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/804—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for lane monitoring
Definitions
- the present invention relates to a periphery monitoring device, a periphery monitoring system, and a periphery monitoring method for monitoring the periphery of a vehicle such as an automobile and notifying a driver.
- Patent Document 1 proposes, as a technique for detecting the situation around the host vehicle in a vehicle such as an automobile and notifying the user, a device that detects the vehicle type of surrounding vehicles with a radar or a camera, estimates an out-of-view region from information corresponding to that vehicle type, and notifies the driver when the host vehicle is outside the field of view of a surrounding vehicle.
- JP 2009-187424 A Japanese Patent Laid-Open No. 2-224737
- In Patent Document 1, however, the blind spot is determined according to the vehicle type regardless of whether the driver actually looks in that direction. Therefore, even when the host vehicle is determined to be within the field of view of a surrounding vehicle, if the driver of that vehicle is not actually looking in the direction where the host vehicle is, the host vehicle is effectively traveling outside that driver's field of view, and there was a problem that no warning could be issued to the driver.
- The present invention has been made to solve the problems described above.
- An object of the present invention is to provide a periphery monitoring device, a periphery monitoring system, and a periphery monitoring method that monitor the periphery of a vehicle such as an automobile and notify the driver only when there is a surrounding vehicle in whose driver's field of view the host vehicle does not actually lie.
- To this end, the present invention provides a periphery monitoring device mounted on a host vehicle to which a notification device for notifying the driver of the host vehicle is connected. The device includes: a host vehicle position acquisition unit that acquires position information of the host vehicle; a line-of-sight acquisition unit that acquires line-of-sight information of the driver of the host vehicle; a visual field calculation unit that calculates the dynamic visual field range of the driver of the host vehicle based on the line-of-sight information acquired by the line-of-sight acquisition unit; a communication unit that transmits to other vehicles the position information of the host vehicle acquired by the host vehicle position acquisition unit and the dynamic visual field range of the driver of the host vehicle calculated by the visual field calculation unit, and that receives position information of the other vehicles and the dynamic visual field ranges of their drivers; a determination unit that determines whether the host vehicle is outside the dynamic visual field range of the driver of another vehicle, based on the position information of the host vehicle acquired by the host vehicle position acquisition unit and on the position information of the other vehicle and the dynamic visual field range of its driver received by the communication unit; and a notification control unit that instructs the notification device to output notification information when the determination unit determines that the host vehicle is outside the dynamic visual field range of the driver of the other vehicle.
- According to the present invention, visual field information of the drivers of vehicles surrounding the host vehicle is received, and whether the host vehicle is outside their fields of view is determined based on that information. The current dynamic field of view of a driver actually traveling nearby can thus be known, and a more accurate visual field range is obtained, so necessary notification information is delivered appropriately while notification of unnecessary information is suppressed.
- FIG. 1 is a block diagram illustrating an example of a peripheral monitoring device according to Embodiment 1 and peripheral devices connected thereto.
- FIG. 2 is a table showing an example of a visual field table in which vehicle types and line-of-sight directions are associated with each other.
- FIG. 3 is a flowchart illustrating the operation of the visual field calculation unit in the periphery monitoring device according to the first embodiment. FIGS. 4 to 7 are diagrams showing the ranges of the visual fields α1, β1, γ1, and δ1 held in the visual field calculation unit. FIG. 8 is a diagram showing the driver's current dynamic composite field of view.
- FIG. 9 is a flowchart illustrating the operation of the determination unit in the periphery monitoring device according to the first embodiment.
- FIG. 10 is a flowchart illustrating the operation of the notification control unit in the periphery monitoring device according to the first embodiment.
- FIG. 11 is a table showing an example of a notification pattern in which the collision possibility and the notification method of notification information by the HMI are associated in the first embodiment.
- FIG. 12 is a block diagram showing an example of the periphery monitoring device according to Embodiment 2 and peripheral devices connected to it.
- FIG. 13 is a flowchart illustrating the operation of the determination unit in the periphery monitoring device according to the second embodiment. FIG. 14 is a diagram showing the positional relationship between the visual field information of the surrounding vehicle A calculated by the determination unit in Embodiment 2 and the host vehicle. FIG. 15 is a diagram showing the positional relationship between the visual field information of the surrounding vehicle B calculated by the determination unit in Embodiment 2 and the host vehicle. FIG. 16 is a diagram showing the visual field information of the surrounding vehicles A and B combined by the determination unit in Embodiment 2.
- FIG. 17 is a flowchart illustrating the operation of the notification control unit in the periphery monitoring device according to the second embodiment.
- FIG. 18 is a diagram illustrating a display example of the notification pattern in the second embodiment. FIG. 19 is a block diagram showing an example of the periphery monitoring device according to Embodiment 3 and peripheral devices connected to it.
- FIG. 20 is a flowchart illustrating the operation of the visual field calculation unit in the periphery monitoring device according to the third embodiment.
- FIG. 21 is a flowchart illustrating the operation of the determination unit in the periphery monitoring device according to the third embodiment.
- The periphery monitoring device of the present invention receives visual field information of the drivers of vehicles surrounding the host vehicle, determines based on that information whether the host vehicle is outside their fields of view, and notifies notification information such as alarms and guidance.
- The periphery monitoring device of the present invention may be incorporated into an in-vehicle device mounted on a vehicle such as an automobile, for example a car navigation device or an instrument panel, or it may be applied to a server.
- It may also be applied to an application installed on a portable information terminal such as a smartphone, tablet PC, or mobile phone.
- FIG. 1 is a block diagram illustrating an example of a periphery monitoring device according to Embodiment 1 and peripheral devices connected thereto.
- The periphery monitoring device 1 mounted on a vehicle such as an automobile includes a line-of-sight acquisition unit 101, a vehicle type information acquisition unit 102, a visual field calculation unit 103, a host vehicle position acquisition unit 104, a communication unit 105, a determination unit 107, and a notification control unit 108.
- As peripheral devices of the periphery monitoring device 1, an in-vehicle camera 2, a GPS (Global Positioning System) 3, an inter-vehicle communication antenna 4, and an HMI (Human Machine Interface) 5 are connected. It is assumed that all vehicles are equipped with the periphery monitoring device 1 and the peripheral devices connected to it.
- The line-of-sight acquisition unit 101 receives the video output from the in-vehicle camera 2 and acquires the line-of-sight information of the driver of the vehicle (the direction of the driver's line of sight and the position of the driver's eyes) from the video.
- The in-vehicle camera 2 is installed facing the interior of the vehicle so as to photograph the face of the driver.
- For acquiring the line-of-sight information, a known technique such as that shown in Patent Document 2 may be used, and its description is omitted here.
- The vehicle type information acquisition unit 102 acquires information on the vehicle type of the host vehicle (vehicle type information) from outside the periphery monitoring device 1 using a communication interface such as CAN (Controller Area Network).
- The visual field calculation unit 103 calculates the dynamic visual field range of the driver of the host vehicle based on the vehicle type information acquired by the vehicle type information acquisition unit 102 and the driver's line-of-sight information (line-of-sight direction and eye position) acquired by the line-of-sight acquisition unit 101.
- The field of view can be calculated with reference to a visual field table in which vehicle types and line-of-sight directions are associated with each other, as shown in FIG. 2.
- FIG. 2 is a table showing an example of a visual field table in which vehicle types and line-of-sight directions are associated with each other.
- For each manufacturer and model of vehicle, the visual field table stores, in association with one another, the field of view when the line-of-sight direction is the rearview mirror, the field of view for the left side mirror, the field of view for the front, the field of view for the left side, the field of view for the right side mirror (not present in the table of FIG. 2), the field of view for the right side (also not present in the table of FIG. 2), and so on. This visual field table is stored in the visual field calculation unit 103.
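As a concrete illustration, the visual field table of FIG. 2 could be held as a simple mapping from a (vehicle model, line-of-sight direction) pair to a set of visible grid cells. This is a minimal sketch, not the patent's implementation: the key strings and the data structure are assumptions, while the cell sets for the rearview mirror (α1) and left side mirror (β1) of vehicle type A1 follow FIGS. 4 and 5 as described in the text.

```python
# Sketch of the visual field table of FIG. 2: each (vehicle model,
# line-of-sight direction) pair maps to the set of grid cells (X, Y)
# visible to the driver.  Only the A1 rearview-mirror and left-side-
# mirror entries are taken from the description; everything else is
# an illustrative assumption.
VISUAL_FIELD_TABLE = {
    ("A1", "rearview_mirror"): {          # field of view "alpha-1" (FIG. 4)
        (0, 1), (-1, 2), (0, 2), (1, 2),
        (-2, 3), (-1, 3), (0, 3), (1, 3), (2, 3),
    },
    ("A1", "left_side_mirror"): {         # field of view "beta-1" (FIG. 5)
        (-1, 0), (-1, 1), (-2, 2), (-1, 2),
        (-3, 3), (-2, 3), (-1, 3),
    },
}

def lookup_visual_field(vehicle_model: str, gaze_direction: str) -> set:
    """Return the field-of-view cells for a model/gaze pair (step ST3)."""
    return VISUAL_FIELD_TABLE.get((vehicle_model, gaze_direction), set())
```

For example, `lookup_visual_field("A1", "rearview_mirror")` returns the nine α1 cells; an unknown pair yields an empty set, which a real implementation would need to handle.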
- The visual field calculation unit 103 combines the calculated field of view with the fields of view from up to a predetermined time (T1) before, and calculates the driver's dynamic composite visual field range. Details of the processing in the visual field calculation unit 103 will be described later with reference to the flowchart shown in FIG. 3.
- The host vehicle position acquisition unit 104 acquires position (latitude/longitude) information of the host vehicle based on information from the GPS 3.
- The communication unit 105 transmits the position information of the host vehicle acquired by the host vehicle position acquisition unit 104 and the dynamic composite visual field range of the driver of the host vehicle calculated by the visual field calculation unit 103 to other vehicles in the vicinity of the host vehicle via the inter-vehicle communication antenna 4, and receives, via the inter-vehicle communication antenna 4, the position information (latitude/longitude) of other vehicles and the dynamic visual field ranges of their drivers transmitted by those vehicles.
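The patent specifies only the content of the exchanged message (own position and the driver's dynamic composite field of view), not its encoding. A hedged sketch of what such a payload might look like, with the field names and the JSON encoding as assumptions:

```python
import json
from dataclasses import dataclass, asdict

# Sketch of a message the communication unit 105 might broadcast over
# the inter-vehicle communication antenna 4.  The vehicle_id field and
# JSON wire format are illustrative assumptions; the patent fixes only
# the content: position (latitude/longitude) and the field of view.
@dataclass
class PeripheryMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    visual_field: list  # list of [X, Y] grid cells

    def encode(self) -> str:
        """Serialize for transmission."""
        return json.dumps(asdict(self))

    @staticmethod
    def decode(raw: str) -> "PeripheryMessage":
        """Reconstruct a message received from another vehicle."""
        return PeripheryMessage(**json.loads(raw))
```

A received message round-trips losslessly, so the determination unit can consume the other driver's field of view exactly as the sender computed it.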
- The determination unit 107 determines whether the host vehicle exists outside the dynamic visual field range of the driver of a surrounding vehicle (other vehicle), based on the position information (latitude/longitude) acquired by the host vehicle position acquisition unit 104 and on the position information (latitude/longitude) of the surrounding vehicle (other vehicle) and the dynamic visual field range of its driver received by the communication unit 105.
- When the determination unit 107 determines that the host vehicle is outside the dynamic visual field range of the driver of the surrounding vehicle (other vehicle), it outputs the determination result, the possibility of collision between the host vehicle and the other vehicle, to the notification control unit 108. Details of the processing in the determination unit 107 will be described later with reference to the flowchart shown in FIG. 9.
- When the notification control unit 108 receives an output from the determination unit 107, that is, when the determination unit 107 determines that the host vehicle is outside the dynamic visual field range of the driver of the surrounding vehicle (other vehicle), it instructs the HMI 5, a notification device for notifying the driver of the host vehicle such as a sound output device like a speaker or a display device like a display or an icon, to output notification information such as an alarm.
- Note that the notification control unit 108 instructs the HMI 5, the notification device, to output the notification information only when a predetermined time (T2) has elapsed after the determination unit 107 determined that the host vehicle is outside the dynamic visual field range of the driver of the surrounding vehicle (other vehicle).
- A specific notification method for notifying the driver of the host vehicle will be described later using the table shown in FIG. 11.
- Next, the operation of the visual field calculation unit 103 in the periphery monitoring device 1 according to Embodiment 1 will be described using the flowchart shown in FIG. 3.
- the driver's line-of-sight direction and eye position are acquired from the line-of-sight acquisition unit 101 (step ST1).
- the vehicle type information of the host vehicle is acquired from the vehicle type information acquisition unit 102 (step ST2).
- the driver's visual field is calculated with reference to a visual field table in which the vehicle type and the line-of-sight direction are associated as shown in FIG. 2 (step ST3).
- For example, if the vehicle type information of the host vehicle acquired in step ST2 is vehicle type A1 of manufacturer A and the line-of-sight direction acquired in step ST1 is the rearview mirror, the field of view can be calculated as “α1” by referring to the visual field table shown in FIG. 2.
- FIGS. 4 to 7 are diagrams showing the ranges of the visual fields α1, β1, γ1, and δ1 held in the visual field calculation unit 103.
- As shown in FIG. 4, the visual field “α1” calculated in step ST3 covers the range (0,1) (−1,2) (0,2) (1,2) (−2,3) (−1,3) (0,3) (1,3) (2,3).
- In a cell (X, Y), X is positive to the right and negative to the left as viewed from the driver relative to the front direction of the vehicle, and Y is positive toward the rear of the vehicle and negative toward the front.
- The field of view calculated in this way is recorded (step ST4). If there are fields of view recorded within the past predetermined time (T1) (YES in step ST5), those past fields of view are acquired (step ST6), combined with the field of view calculated in step ST3 (step ST7), and the result is output to the communication unit 105 (step ST8).
- If there is no field of view recorded within the past predetermined time (T1) (NO in step ST5), the field of view calculated in step ST3 is output to the communication unit 105 as it is (step ST8).
- For example, suppose that for vehicle type A1 the field of view β1 corresponding to the left side mirror covers the range (−1,0) (−1,1) (−2,2) (−1,2) (−3,3) (−2,3) (−1,3) as shown in FIG. 5, the field of view γ1 corresponding to the front covers the range (−2,−3) (−1,−3) (0,−3) (1,−3) (2,−3) (−1,−2) (0,−2) (1,−2) (0,−1) as shown in FIG. 6, and the field of view δ1 corresponding to the left side covers the range (−3,−2) (−3,−1) (−2,−1) (−1,0) (−2,1) (−3,2) as shown in FIG. 7, and that all of these are recorded.
- In this case, since fields of view recorded within the past predetermined time (T1) are found in step ST5, those past fields of view, namely the three fields of view shown in FIGS. 5 to 7, are acquired (step ST6) and combined with the current field of view shown in FIG. 4 (step ST7), yielding the wide visual field range shown in FIG. 8.
- FIG. 8 is a diagram showing the driver's current dynamic field of view (composite field of view) calculated and combined by the visual field calculation unit 103.
- That is, the composite field of view combining the fields recorded within the past predetermined time (T1) covers the range (−2,−3) (−1,−3) (0,−3) (1,−3) (2,−3) (−3,−2) (−1,−2) (0,−2) (1,−2) (−3,−1) (−2,−1) (0,−1) (−1,0) (−2,1) (−1,1) (0,1) (1,1) (−3,2) (−2,2) (−1,2) (0,2) (1,2) (−3,3) (−2,3) (−1,3) (0,3) (1,3) (2,3), and this is output to the communication unit 105 (step ST8).
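The recording and compositing of steps ST4 to ST7 can be sketched as a timestamped history whose union over the past T1 seconds gives the dynamic composite field of view. The value of T1 and the history structure are illustrative assumptions; the patent leaves both unspecified.

```python
import time

# Sketch of steps ST4-ST7: each calculated field of view is recorded
# with a timestamp (ST4), and the dynamic composite field is the union
# of all fields recorded within the past T1 seconds (ST5-ST7).
T1 = 3.0  # seconds; illustrative value, not fixed by the patent

class VisualFieldHistory:
    def __init__(self):
        self._records = []  # list of (timestamp, frozenset of cells)

    def record(self, cells, now=None):
        """Step ST4: record a newly calculated field of view."""
        now = time.time() if now is None else now
        self._records.append((now, frozenset(cells)))

    def composite(self, now=None):
        """Steps ST5-ST7: union of every field of view recorded within
        the past T1; empty if nothing recent was recorded."""
        now = time.time() if now is None else now
        recent = [c for t, c in self._records if now - t <= T1]
        return set().union(*recent) if recent else set()
```

With this structure, a glance at the rearview mirror followed by a glance at the left side mirror within T1 yields a composite covering both regions, as in FIG. 8.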
- When the visual field calculation unit 103 calculates the visual field range, the vehicle type information and the driver's eye position information are also used. Differences in the visual field range blocked by the pillars, which vary with the vehicle type and with the position of the driver's eyes, are therefore reflected, and a more accurate visual field range can be obtained.
- Next, the operation of the determination unit 107 will be described. First, it is checked whether information on another vehicle has been received by the communication unit 105 through inter-vehicle communication. When information on a new other vehicle has been received, that is, when there is new reception data (step ST11), the position of the host vehicle is acquired from the host vehicle position acquisition unit 104 (step ST12).
- Next, the position of the other vehicle and the visual field information of its driver are acquired from the received information (step ST13). Then, the relative position of the host vehicle and the other vehicle is calculated based on the position information of the host vehicle acquired in step ST12 and the position information of the other vehicle acquired in step ST13 (step ST14).
- The predetermined range (distance) used in the following step is set so as to change according to the speed of the host vehicle.
- If the position of the other vehicle is outside the predetermined range (distance) (NO in step ST15), it is determined that the other vehicle is too far away to pose a collision risk, and the process returns to step ST11 and repeats.
- On the other hand, when the position of the other vehicle is within the predetermined range (distance) (YES in step ST15), the other vehicle is nearby, that is, it is a surrounding vehicle of the host vehicle. The visual field information of the other vehicle acquired in step ST13 is then superimposed on the relative position calculated in step ST14, and it is determined whether the host vehicle exists outside the field of view of that surrounding vehicle (step ST16).
- When it is determined that the host vehicle exists outside the field of view of the surrounding vehicle (YES in step ST16), the collision possibility to be notified to the notification control unit 108 is incremented (step ST17). The collision possibility may be incremented by one each time, or the value added may be changed according to the relative distance.
- If the host vehicle is not outside the field of view of the surrounding vehicle, that is, if it is within the field of view (NO in step ST16), the driver of the surrounding vehicle can see the host vehicle and there is no need to notify the driver of the host vehicle, so the process returns to step ST11 and repeats.
- When the collision possibility is 1 or greater, it is output to the notification control unit 108 (step ST18), the collision possibility is then reset to 0 (zero), and this process ends.
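The core of steps ST14 to ST17 can be sketched as follows. For simplicity the relative position of the host vehicle is assumed to be already expressed as a cell (X, Y) in the other vehicle's grid; the conversion from latitude/longitude into that grid, and the range threshold, are assumptions not detailed in the patent.

```python
# Sketch of the determination in steps ST15-ST17.
MAX_CELLS = 5  # illustrative predetermined range (step ST15); the text
               # notes it would actually vary with the host vehicle speed

def outside_visual_field(rel_pos, other_visual_field):
    """Step ST16: is the host vehicle (at rel_pos in the other
    vehicle's grid) outside that driver's composite field of view?"""
    return rel_pos not in other_visual_field

def update_collision_possibility(possibility, rel_pos, other_visual_field):
    """Steps ST15-ST17: when the other vehicle is within the
    predetermined range and its driver cannot see the host vehicle,
    add one to the collision possibility (a distance-weighted value
    could be added instead, as the text notes)."""
    x, y = rel_pos
    in_range = abs(x) <= MAX_CELLS and abs(y) <= MAX_CELLS
    if in_range and outside_visual_field(rel_pos, other_visual_field):
        return possibility + 1
    return possibility
```

Running this per received message accumulates one unit of collision possibility for each nearby vehicle whose driver cannot see the host vehicle.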
- Next, the operation of the notification control unit 108 will be described. When collision possibility information is received from the determination unit 107 (YES in step ST21), it is checked whether the received collision possibility is equal to or greater than a predetermined value (step ST22).
- Here, it is assumed that “1” is set as the predetermined value.
- It is also checked whether the predetermined time (T2) has elapsed since the host vehicle was determined to be outside the field of view of the surrounding vehicle (step ST23). The predetermined time (T2) is, for example, 2 seconds, but this value may be set as appropriate by the user.
- When the collision possibility is equal to or greater than the predetermined value and the predetermined time (T2) has elapsed, notification information is output via the HMI 5 (step ST24).
- When the collision possibility is less than the predetermined value (NO in step ST22), or when the time during which the host vehicle has been outside the field of view of the surrounding vehicle is less than the predetermined time (T2) in step ST23 (NO in step ST23), the process ends as it is.
- By instructing the output of notification information only when the predetermined time (T2) has elapsed after the determination unit 107 determined that the host vehicle is outside the dynamic visual field range of the driver of the surrounding vehicle, the driver is not disturbed many times within a short period, and unnecessary notifications are suppressed.
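The gate applied in steps ST21 to ST24 reduces to two conditions: the collision possibility must reach the predetermined value, and the host vehicle must have been outside the surrounding drivers' fields of view for at least T2. A minimal sketch, with the threshold of 1 and T2 = 2 seconds taken from the text and everything else assumed:

```python
# Sketch of the notification gate in steps ST22-ST24.
THRESHOLD = 1  # predetermined value from the text
T2 = 2.0       # seconds; example value from the text, user-adjustable

def should_notify(collision_possibility, outside_since, now):
    """Return True when the notification device should be instructed
    to output notification information.

    outside_since -- time at which the host vehicle was first judged
    to be outside the field of view, or None if currently visible."""
    if collision_possibility < THRESHOLD:                   # step ST22
        return False
    if outside_since is None or now - outside_since < T2:   # step ST23
        return False
    return True                                             # step ST24
```

Only when both checks pass does the notification control unit instruct the HMI 5 to output, which is what suppresses short-lived, repetitive alerts.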
- FIG. 11 is a table showing an example of a notification pattern in which a collision possibility is associated with a notification method of notification information by HMI.
- Here, the HMI 5 is exemplified by the case of outputting notification information by voice from a speaker or the like and the case of outputting notification information by icon display.
- For example, when the possibility of collision is high, a notification method is set such that a large, red icon blinks at 10 Hz intervals.
- That is, when the possibility of collision is high, the output strength of the notification information is increased, and when it is low, the output strength is decreased according to the collision possibility. If the output strength of the notification information is changed in this way, the driver can immediately recognize from the strength of the output how likely a collision is.
- In other words, the notification control unit 108 instructs the notification device (the HMI 5 such as a speaker or icon display) to change the output strength of the notification information in accordance with the collision possibility and to output the notification information.
- Display notification has the advantage that it easily catches the driver's eye, so the driver can readily recognize the possibility of a collision.
- Notification methods other than voice and icon display may also be used.
- For example, when the collision possibility is high, the notification control unit 108 instructs the notification device (the HMI 5 such as a speaker or icon display) to output a beeping sound at volume 2 and to notify the driver with a large icon display blinking in red at 10 Hz intervals.
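A notification pattern table like FIG. 11 could be sketched as a mapping from collision-possibility bands to output parameters. Only the high-possibility row (volume 2, large red icon blinking at 10 Hz) is taken from the text; the band boundaries and the lower rows are illustrative assumptions.

```python
# Sketch of the notification pattern of FIG. 11: output strength grows
# with the collision possibility.  The high band follows the text; the
# moderate band and the thresholds are assumptions.
def notification_pattern(collision_possibility):
    if collision_possibility >= 3:        # high possibility (from text)
        return {"volume": 2, "icon_size": "large",
                "icon_color": "red", "blink_hz": 10}
    if collision_possibility >= 1:        # moderate possibility (assumed)
        return {"volume": 1, "icon_size": "medium",
                "icon_color": "yellow", "blink_hz": 2}
    return None                           # no notification needed
```

The notification control unit would pass the returned parameters to the HMI 5, so a rising collision possibility automatically produces a louder, more conspicuous alert.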
- As described above, according to Embodiment 1, the visual field information of the drivers of vehicles surrounding the host vehicle is received, and whether the host vehicle is outside their fields of view is determined based on that information. The current dynamic field of view of a driver actually traveling around the host vehicle can thus be known and a more accurate visual field range obtained, so necessary notification information is delivered appropriately and notification of unnecessary information is suppressed.
- The periphery monitoring device 1 is realized as a concrete means in which hardware and software cooperate, with a microcomputer of the device mounted on the vehicle (such as an automobile) to which the periphery monitoring device 1 is applied executing a program relating to the processing specific to the present invention. The same applies to the following embodiments.
- FIG. 12 is a block diagram illustrating an example of the periphery monitoring device according to the second embodiment and peripheral devices connected to it.
- In FIG. 12, the same or corresponding parts as in FIG. 1 are denoted by the same reference symbols, and their description is omitted.
- The periphery monitoring device 10 according to Embodiment 2 described below differs from the periphery monitoring device 1 according to Embodiment 1 in the content of the processing in the determination unit 117 and the notification control unit 118.
- In the periphery monitoring device 1 of Embodiment 1, it is only determined whether the host vehicle is outside the field of view of another vehicle (surrounding vehicle), and only the collision possibility is notified.
- The periphery monitoring device 10 of Embodiment 2 not only determines whether the host vehicle is outside the field of view of another vehicle (surrounding vehicle), but also calculates in which direction the host vehicle should move to come within the field of view of the surrounding vehicle, and guides the driver accordingly.
- The determination unit 117 determines whether the host vehicle exists outside the dynamic visual field range of the driver of a surrounding vehicle (other vehicle), based on the position information (latitude/longitude) acquired by the host vehicle position acquisition unit 104 and on the position information (latitude/longitude) of the surrounding vehicle (other vehicle) and the dynamic visual field range of its driver received by the communication unit 105. When it determines that the host vehicle exists outside the visual field range, it calculates the distance and direction to a position where the host vehicle would be within the field of view of the surrounding vehicle, and outputs them to the notification control unit 118.
- That is, the determination unit 117 of the periphery monitoring device 10 combines the position information and visual field information of the surrounding vehicles received by the communication unit 105 and, when the host vehicle is outside the visual field range of the other vehicles (surrounding vehicles), calculates the distance and direction to a position within the field of view of the surrounding vehicles and outputs them to the notification control unit 118. In this respect it differs from the determination unit 107 of the periphery monitoring device 1. Details of the processing in the determination unit 117 will be described later using a flowchart.
- When the determination unit 117 determines that the own vehicle is outside the dynamic visual field range of the driver of a surrounding vehicle (other vehicle), the notification control unit 118 receives the output from the determination unit 117 and instructs the HMI 5, which is a notification device for notifying the driver of the host vehicle, such as a sound output device (e.g., a speaker) or a display device (e.g., a display or icons), to output notification information that guides the driver into the visual field range of the driver of the surrounding vehicle (other vehicle).
- Specifically, when a predetermined time (T2) has elapsed after the determination unit 117 determines that the host vehicle is outside the dynamic visual field range of the driver of the surrounding vehicle (other vehicle), the HMI 5, which is a notification device, is instructed to output the notification information.
- A specific notification method for notifying the driver of the host vehicle will be described later using the display example shown in FIG.
- First, in step ST31, it is confirmed whether information on other vehicles has been received from the communication unit 105 through inter-vehicle communication. When information on a new other vehicle has been received, that is, when there is new reception data (YES in step ST31), the position of the own vehicle is acquired from the own vehicle position acquisition unit 104 (step ST32).
- In step ST33, the position of the other vehicle and the visual field information of its driver are acquired from the received information on the new other vehicle. Then, based on the position information of the host vehicle acquired in step ST32 and the position information of the other vehicle acquired in step ST33, the relative position of the host vehicle and the other vehicle is calculated (step ST34).
- The predetermined range (distance) is set so as to change according to the speed of the host vehicle.
- If the position of the other vehicle is outside the predetermined range (NO in step ST35), it is determined that the other vehicle is too far away to pose a collision risk, and the process returns to step ST31 and is repeated.
- When the position of the other vehicle is within the predetermined range (YES in step ST35), the other vehicle is nearby, that is, it is a surrounding vehicle of the own vehicle. The relative position calculated in step ST34 is therefore used as an offset to calculate the field of view of the other vehicles (surrounding vehicles) around the host vehicle (step ST36), after which the process returns to step ST31 and is repeated.
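- Steps ST34 and ST36 can be sketched as follows, assuming (as suggested by the figures) that a driver's field of view is exchanged as a set of grid cells relative to the sending vehicle, with one cell covering about 5 m; the function and variable names are illustrative, not from the patent:

```python
def relative_cells(own_pos_m, other_pos_m, cell_m=5.0):
    """Grid offset of the own vehicle as seen from the other vehicle (ST34)."""
    return (round((own_pos_m[0] - other_pos_m[0]) / cell_m),
            round((own_pos_m[1] - other_pos_m[1]) / cell_m))

def offset_field(field_cells, rel):
    """Shift the other vehicle's field-of-view cells so they are centred on
    the own vehicle instead of on the sender (ST36)."""
    dx, dy = rel
    return {(x - dx, y - dy) for (x, y) in field_cells}
```

For example, if the own vehicle is +2 cells in the X direction from vehicle A, a cell (2, 0) of A's field maps to (0, 0), meaning A's driver can see the own vehicle.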
- step ST33 visual field information as shown in FIG. 14A is received from another vehicle (peripheral vehicle) A in step ST33.
- the relative position of the host vehicle with respect to the other vehicle (peripheral vehicle) A is It is calculated that the position is +2 from the other vehicle (peripheral vehicle) A in the X direction (step ST34).
- step ST35 it is determined that the host vehicle and the other vehicle (peripheral vehicle) A exist within a predetermined distance (step ST35), and as shown in FIG.
- step ST36 To calculate visual field information centered on the own vehicle (step ST36). Further, assuming that the visual field information as shown in FIG. 15A is received from another vehicle (neighboring vehicle) B, the same processing is performed for this, and as shown in FIG. View information centered on the subject vehicle is calculated by applying an offset.
- When there is no new received data, that is, when the data of all other vehicles within the range in which data can be transmitted to and received from the host vehicle have been received, or when there are no surrounding vehicles (NO in step ST31), it is checked, based on the composite visual field information obtained so far, whether the host vehicle is outside the field of view of the surrounding vehicles (step ST37).
- If it is determined that the host vehicle is outside the field of view of the surrounding vehicles (YES in step ST37), the distance and direction to a position where the host vehicle would be within the field of view of the surrounding vehicles are calculated (step ST38), and the distance and direction are output to the notification control unit 118 (step ST39). On the other hand, if it is determined that the host vehicle is not outside the field of view of the surrounding vehicles (NO in step ST37), the host vehicle is already in the field of view of the surrounding vehicles, so the distance to a position within the field of view is set to 0 (zero), and the distance and direction are output (step ST39).
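- Under the same grid assumption, steps ST37 to ST39 amount to a nearest-cell search over the composite field, with the own vehicle at the grid origin; this sketch uses an illustrative data layout, not the patent's own:

```python
import math

def guidance(composite_field):
    """Return (distance_in_cells, direction_cell) to a visible position;
    distance 0 means the own vehicle is already in view (ST37/ST39)."""
    if not composite_field:
        return None                       # no surrounding vehicles reported
    if (0, 0) in composite_field:         # own vehicle is inside the field
        return 0.0, (0, 0)
    nearest = min(composite_field, key=lambda c: math.hypot(*c))
    return math.hypot(*nearest), nearest  # ST38
```

For instance, guidance({(0, -3), (0, -2), (1, -2), (0, 3)}) returns (2.0, (0, -2)), i.e., two cells in the -Y direction.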
- In this way, the determination unit 117 can obtain a more accurate dynamic visual field range by synthesizing the dynamic visual field ranges of the drivers of a plurality of vehicles. Further, even when there are a plurality of vehicles around the own vehicle, it can be determined whether the own vehicle is in the blind spots of all of them.
- When the distance and direction from the host vehicle to the field of view of the surrounding vehicles, which are the output information from the determination unit 117, are received (YES in step ST41), it is checked whether the received distance is greater than 0 (zero), that is, whether the host vehicle is outside the field of view of the surrounding vehicles (step ST42).
- Here, T2 is, for example, 2 seconds, but this value may be set appropriately by the user.
- Guidance indicating the received distance and direction is output via the HMI 5 (step ST44).
- On the other hand, if the distance is 0 (zero), that is, if the host vehicle is in the field of view of the surrounding vehicles (NO in step ST42), or if, in step ST43, the time during which the host vehicle has been outside the field of view of the surrounding vehicles is less than the predetermined time (T2) (NO in step ST43), the process ends as it is.
- In this way, an instruction to output the notification information is given only when the predetermined time (T2) has elapsed since the determination unit 117 determined that the host vehicle is outside the dynamic visual field range of the driver of a surrounding vehicle. This prevents the driver from being bothered many times within a short period and suppresses unnecessary notifications.
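- The T2 gating described above can be sketched as follows; the class shape and method names are an assumption for illustration only:

```python
import time

class NotificationGate:
    """Output guidance only after the own vehicle has remained outside the
    surrounding drivers' field of view for a continuous time T2."""

    def __init__(self, t2_seconds=2.0):
        self.t2 = t2_seconds
        self.outside_since = None        # None while the vehicle is in view

    def should_notify(self, distance_to_view, now=None):
        now = time.monotonic() if now is None else now
        if distance_to_view == 0:        # back in view: reset (NO in ST42)
            self.outside_since = None
            return False
        if self.outside_since is None:   # just left the field of view
            self.outside_since = now
        return (now - self.outside_since) >= self.t2   # ST43
```

Resetting the timer whenever the vehicle re-enters the field of view means a brief excursion out of view never triggers a notification, which matches the stated aim of suppressing nuisance alerts.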
- FIG. 18 is a diagram showing a display example of a notification pattern in the second embodiment.
- For example, assume that the own vehicle is not in the field of view of either surrounding vehicle A or B (YES in step ST37), and that the cells of the combined field-of-view range of surrounding vehicles A and B are at positions (0, -3), (0, -2), (1, -2), and (0, 3). In this case, the position closest to the own vehicle is (0, -2), so the distance and direction for coming within the field of view of surrounding vehicles A and B are calculated as -2 in the Y direction, that is, a position two cells forward (step ST38).
- Here, one graduation in the Y direction corresponds to about 5 meters.
- In this case, the notification control unit 118 instructs the notification device to provide guidance by blinking and displaying only the forward arrow of the icon display.
- Alternatively, the relative positional relationship between the host vehicle and the surrounding vehicles A and B may be displayed, and guidance may be given by displaying a forward arrow together with "10 m forward".
- In this way, the driver can understand the route guidance more intuitively and visually.
- Further, the blinking cycle or the color of the icon in the corresponding direction may be changed according to the distance output by the determination unit 117.
- Further, the position information of the own vehicle and the position information of the surrounding vehicles (other vehicles) output by the determination unit 117 may be displayed together with the guidance (that is, the relative positions of the own vehicle and the other vehicles may be displayed), so that the driver can visually confirm the guidance and intuitively grasp both the positions of the other vehicles and the host vehicle and the route guidance.
- When the notification information for guidance is output by voice, the driver can be notified of the distance and direction by voice, such as "Please move 10 meters forward" or "Change to the right lane". If guidance is provided by voice in this way, the driver can confirm the guidance while facing forward. Needless to say, the driver may be notified by a combination of voice, icon display, and screen display.
- As described above, according to the second embodiment, not only is the driver notified that the own vehicle is out of the field of view of the surrounding vehicles, but guidance is also provided to bring the own vehicle within the dynamic field of view. In addition to the effects of the first embodiment, this has the effect of supporting the driver in driving more safely.
- FIG. 19 is a block diagram illustrating an example of a periphery monitoring device according to the third embodiment and peripheral devices connected thereto.
- In FIG. 19, the same components as those of the above embodiments are denoted by the same reference symbols, and their description is omitted here.
- The periphery monitoring device 20 according to the third embodiment described below further includes a vehicle ID acquisition unit 201, and differs in the visual field calculation unit 123, the communication unit 125, and the contents of the processing in the determination unit 127.
- In the first and second embodiments, the visual field calculation unit 103 calculates the driver's visual field, synthesizes the results over several calculations, and outputs them to the communication unit 105. In the third embodiment, by contrast, the visual field calculation unit 123 outputs the driver's visual field to the communication unit 125 as it is after each calculation.
- the vehicle ID acquisition unit 201 acquires ID information unique to the vehicle and outputs the ID information to the communication unit 125.
- The visual field calculation unit 123 calculates the dynamic visual field range of the driver of the own vehicle based on the vehicle type information of the host vehicle acquired by the vehicle type information acquisition unit 102 and the driver's line-of-sight information acquired by the line-of-sight acquisition unit 101, and outputs it to the communication unit 125.
- The communication unit 125 transmits the driver's dynamic visual field range calculated by the visual field calculation unit 123, the position information of the own vehicle acquired by the own vehicle position acquisition unit 104, and the vehicle ID acquired by the vehicle ID acquisition unit 201 together to the surrounding vehicles via the inter-vehicle communication antenna 4.
- The position information consists of latitude and longitude.
- The field-of-view information and vehicle IDs transmitted by the surrounding vehicles of the host vehicle are received via the inter-vehicle communication antenna 4.
- the operation of the visual field calculation unit 123 in the periphery monitoring device 20 of the third embodiment will be described using the flowchart shown in FIG.
- the direction of the driver's line of sight and the eye position are acquired from the line-of-sight acquisition unit 101 (step ST51).
- the vehicle type information of the host vehicle is acquired from the vehicle type information acquisition unit 102 (step ST52).
- Then, the vehicle ID is acquired from the vehicle ID acquisition unit 201 (step ST53).
- Next, the driver's visual field is calculated with reference to a visual field table that associates vehicle types with line-of-sight directions, as shown in FIG. 2 (step ST54). For example, if the vehicle type information of the host vehicle acquired in step ST52 is vehicle type A1 of manufacturer A, and the line-of-sight direction acquired in step ST51 is the rearview mirror, the corresponding visual field is obtained from the table. The visual field calculated in this way is output to the communication unit 125 (step ST55).
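- The table lookup of step ST54 can be sketched as follows; the table keys, entries, and cell sets here are invented for illustration, not taken from FIG. 2:

```python
# Field-of-view table keyed by (vehicle type, gaze direction).
# Cells are grid positions relative to the vehicle; contents are illustrative.
VIEW_TABLE = {
    ("A1", "front"):    {(-1, 1), (0, 1), (1, 1), (0, 2)},
    ("A1", "rearview"): {(0, -1), (0, -2)},
}

def lookup_field(vehicle_type, gaze_direction):
    # an unknown combination yields an empty (unknown) field of view
    return VIEW_TABLE.get((vehicle_type, gaze_direction), set())
```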
- First, in step ST61, it is confirmed whether information on other vehicles has been received from the communication unit 125 through inter-vehicle communication. When information on a new other vehicle has been received, that is, when there is new reception data (YES in step ST61), the received visual field is recorded (step ST62).
- Next, it is confirmed whether the received vehicle ID of the other vehicle is a registered ID (step ST63).
- If it is a registered vehicle ID (YES in step ST63), the visual fields received from that ID within the past predetermined time (T1) are acquired (step ST64). The visual field received in step ST61 and the past visual fields acquired in step ST64 are then synthesized (step ST65).
- On the other hand, if the vehicle ID is not a registered one (NO in step ST63), the vehicle ID is registered (step ST66).
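- Steps ST62 to ST66 can be sketched as a per-ID history with a T1-second synthesis window; the data layout and the T1 value are assumptions for illustration:

```python
class FieldHistory:
    """Record visual fields per vehicle ID and merge the fields received
    from one ID within the past T1 seconds into its dynamic field of view."""

    def __init__(self, t1_seconds=3.0):    # T1 is illustrative here
        self.t1 = t1_seconds
        self.records = {}                  # vehicle_id -> [(timestamp, cells)]

    def record(self, vehicle_id, cells, now):
        # ST62/ST66: store the field; an unseen ID is registered implicitly
        self.records.setdefault(vehicle_id, []).append((now, cells))

    def dynamic_field(self, vehicle_id, now):
        """ST64/ST65: union of the fields received within the past T1 seconds."""
        merged = set()
        for t, cells in self.records.get(vehicle_id, []):
            if now - t <= self.t1:
                merged |= cells
        return merged
```

Widening or narrowing T1 trades responsiveness against stability of the synthesized field, which is presumably why the text leaves the time unit adjustable.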
- steps ST67 to ST75 are the same as the processes of steps ST32 to ST40 in the flowchart of FIG.
- In this way, the dynamic visual field range of each vehicle is synthesized in arbitrary time units, so that more appropriate guidance can be provided according to traffic congestion, road conditions, or vehicle conditions such as vehicle speed.
- As described above, according to the third embodiment, not only is the driver notified that the own vehicle is out of the field of view of the surrounding vehicles, but guidance is also provided to bring the own vehicle within the dynamic field of view. In addition to the effects of the first embodiment, this has the effect of supporting the driver in driving more safely.
- In the above description, the vehicle ID acquisition unit 201 is provided in addition to the configuration shown in the second embodiment, and the processing of the visual field calculation unit and the determination unit is changed. However, the vehicle ID acquisition unit 201 may instead be provided in addition to the configuration of the first embodiment, and the processing of the third embodiment described above may be performed.
- The periphery monitoring device of the present invention can be applied to any device that is mounted on a vehicle such as an automobile, for example an in-vehicle device such as a car navigation device or an instrument panel, and that is connected to a notification device such as an audio output device or a display device. The periphery monitoring device itself may also be incorporated in those devices. Furthermore, the present invention may be applied to an application installed in a portable information terminal such as a smartphone, a tablet PC, or a mobile phone.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Traffic Control Systems (AREA)
- Mechanical Engineering (AREA)
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE112014006626.5T DE112014006626T5 (de) | 2014-04-30 | 2014-04-30 | Umgebungsbereich-Überwachungsvorrichtung, Umgebungsbereich-Überwachungssystem undUmgebungsbereich-Überwachungsverfahren |
JP2016515802A JP6230699B2 (ja) | 2014-04-30 | 2014-04-30 | 周辺監視装置、周辺監視システムおよび周辺監視方法 |
CN201811351350.7A CN109243204A (zh) | 2014-04-30 | 2014-04-30 | 周边监视装置、周边监视系统以及周边监视方法 |
CN201480078331.5A CN106463057B (zh) | 2014-04-30 | 2014-04-30 | 周边监视装置、周边监视系统以及周边监视方法 |
US15/110,304 US10380895B2 (en) | 2014-04-30 | 2014-04-30 | Surrounding area monitoring apparatus, surrounding area monitoring system and surrounding area monitoring method |
PCT/JP2014/061996 WO2015166558A1 (ja) | 2014-04-30 | 2014-04-30 | 周辺監視装置、周辺監視システムおよび周辺監視方法 |
CN201811351308.5A CN109318806A (zh) | 2014-04-30 | 2014-04-30 | 周边监视装置、周边监视系统以及周边监视方法 |
CN201811351454.8A CN109229022B (zh) | 2014-04-30 | 2014-04-30 | 周边监视装置、周边监视系统以及周边监视方法 |
US16/056,654 US10867516B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring apparatus and surrounding area monitoring method |
US16/056,687 US10878700B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring apparatus and surrounding area monitoring method |
US16/056,782 US10621870B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring system |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/061996 WO2015166558A1 (ja) | 2014-04-30 | 2014-04-30 | 周辺監視装置、周辺監視システムおよび周辺監視方法 |
Related Child Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/110,304 A-371-Of-International US10380895B2 (en) | 2014-04-30 | 2014-04-30 | Surrounding area monitoring apparatus, surrounding area monitoring system and surrounding area monitoring method |
US16/056,687 Continuation US10878700B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring apparatus and surrounding area monitoring method |
US16/056,782 Continuation US10621870B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring system |
US16/056,654 Continuation US10867516B2 (en) | 2014-04-30 | 2018-08-07 | Surrounding area monitoring apparatus and surrounding area monitoring method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015166558A1 true WO2015166558A1 (ja) | 2015-11-05 |
Family
ID=54358310
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/061996 WO2015166558A1 (ja) | 2014-04-30 | 2014-04-30 | 周辺監視装置、周辺監視システムおよび周辺監視方法 |
Country Status (5)
Country | Link |
---|---|
US (4) | US10380895B2 (zh) |
JP (1) | JP6230699B2 (zh) |
CN (4) | CN106463057B (zh) |
DE (1) | DE112014006626T5 (zh) |
WO (1) | WO2015166558A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021181771A1 (ja) * | 2020-03-12 | 2021-09-16 | パナソニックIpマネジメント株式会社 | 推定装置および推定方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017056570A1 (ja) * | 2015-09-30 | 2017-04-06 | アイシン精機株式会社 | 運転支援装置 |
JP6822325B2 (ja) * | 2017-06-21 | 2021-01-27 | 日本電気株式会社 | 操縦支援装置、操縦支援方法、プログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009048564A (ja) * | 2007-08-22 | 2009-03-05 | Toyota Motor Corp | 車両位置予測装置 |
WO2011148455A1 (ja) * | 2010-05-25 | 2011-12-01 | 富士通株式会社 | 映像処理装置、映像処理方法及び映像処理プログラム |
JP2013206183A (ja) * | 2012-03-28 | 2013-10-07 | Fujitsu Ltd | 事故予防装置、事故予防方法およびプログラム |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02224637A (ja) | 1988-11-16 | 1990-09-06 | A T R Tsushin Syst Kenkyusho:Kk | 視線検出方法 |
US5556281A (en) * | 1994-02-17 | 1996-09-17 | Motorola, Inc. | Simulated area weapons effects display arrangement |
US7209221B2 (en) * | 1994-05-23 | 2007-04-24 | Automotive Technologies International, Inc. | Method for obtaining and displaying information about objects in a vehicular blind spot |
US6885968B2 (en) * | 2000-05-08 | 2005-04-26 | Automotive Technologies International, Inc. | Vehicular exterior identification and monitoring system-agricultural product distribution |
DE4425419C1 (de) * | 1994-07-19 | 1995-12-14 | Daimler Benz Ag | Kurzreichweitige Ultraschall-Abstandswarnanlage in einem Kraftfahrzeug, insbesondere als Einparkhilfe |
JP3743582B2 (ja) * | 1996-02-21 | 2006-02-08 | 株式会社小松製作所 | 無人車両と有人車両混走時のフリート制御装置及び制御方法 |
JP3358709B2 (ja) * | 1997-08-11 | 2002-12-24 | 富士重工業株式会社 | 車両用運転支援装置 |
WO2001085491A1 (en) * | 2000-05-08 | 2001-11-15 | Automotive Technologies International, Inc. | Vehicular blind spot identification and monitoring system |
US6882287B2 (en) * | 2001-07-31 | 2005-04-19 | Donnelly Corporation | Automotive lane change aid |
DE10247290B4 (de) * | 2002-10-10 | 2013-04-18 | Volkswagen Ag | Verfahren und Vorrichtung zur Überwachung toter Winkel eines Kraftfahrzeugs |
US6880941B2 (en) * | 2003-06-11 | 2005-04-19 | Tony R. Suggs | Vehicle blind spot monitoring system |
JP3985748B2 (ja) * | 2003-07-08 | 2007-10-03 | 日産自動車株式会社 | 車載用障害物検出装置 |
JP2007200052A (ja) * | 2006-01-27 | 2007-08-09 | Nissan Motor Co Ltd | 交差点における運転支援装置および交差点における運転支援方法 |
JP5088669B2 (ja) * | 2007-03-23 | 2012-12-05 | 株式会社デンソー | 車両周辺監視装置 |
US8190355B2 (en) * | 2007-10-10 | 2012-05-29 | International Business Machines Corporation | Driving assistance and monitoring |
CN101842262A (zh) * | 2007-11-05 | 2010-09-22 | 沃尔沃拉斯特瓦格纳公司 | 基于车辆的夜视装置及其操作方法 |
US20090150768A1 (en) * | 2007-12-10 | 2009-06-11 | International Business Machines Corporation | Composition-based application user interface framework |
JP2009187424A (ja) | 2008-02-08 | 2009-08-20 | Alpine Electronics Inc | 周辺監視装置および周辺監視方法 |
US20100082179A1 (en) * | 2008-09-29 | 2010-04-01 | David Kronenberg | Methods for Linking Motor Vehicles to Reduce Aerodynamic Drag and Improve Fuel Economy |
US8100426B2 (en) * | 2008-09-29 | 2012-01-24 | David Kronenberg | Systems for positioning and linking motor vehicles to reduce aerodynamic drag |
EP2280241A3 (en) * | 2009-07-30 | 2017-08-23 | QinetiQ Limited | Vehicle control |
SE535786C2 (sv) * | 2010-01-19 | 2012-12-18 | Volvo Technology Corp | System för döda vinkeln-varning |
US20120277957A1 (en) * | 2010-04-15 | 2012-11-01 | Satoru Inoue | Driving assist device |
CN102328619A (zh) * | 2010-07-12 | 2012-01-25 | 叶春林 | 车辆扩展前视野的方法及构成 |
CN102632839B (zh) * | 2011-02-15 | 2015-04-01 | 香港生产力促进局 | 一种基于后视图像认知的车载盲区预警系统及方法 |
US8681016B2 (en) * | 2011-02-25 | 2014-03-25 | Volkswagen Ag | Driver assistance system |
US8947219B2 (en) * | 2011-04-22 | 2015-02-03 | Honda Motors Co., Ltd. | Warning system with heads up display |
US8564425B2 (en) * | 2011-08-19 | 2013-10-22 | Ahmad I. S. I. Al-Jafar | Blind spot monitoring system |
US20130060456A1 (en) * | 2011-09-02 | 2013-03-07 | Peyman Pourparhizkar | Synchronizing car movements in road to reduce traffic |
US9328526B2 (en) * | 2011-09-22 | 2016-05-03 | Unitronics Automated Solutions Ltd | Vehicle positioning system |
JP5849040B2 (ja) * | 2012-11-28 | 2016-01-27 | 富士重工業株式会社 | 車両の運転支援制御装置 |
CN103158620B (zh) * | 2013-03-25 | 2015-09-16 | 中国电子科技集团公司第三十八研究所 | 一种车辆行人检测跟踪预警系统 |
CN203402060U (zh) * | 2013-07-16 | 2014-01-22 | 北京汽车股份有限公司 | 行车安全辅助系统以及汽车 |
-
2014
- 2014-04-30 US US15/110,304 patent/US10380895B2/en active Active
- 2014-04-30 DE DE112014006626.5T patent/DE112014006626T5/de not_active Ceased
- 2014-04-30 WO PCT/JP2014/061996 patent/WO2015166558A1/ja active Application Filing
- 2014-04-30 CN CN201480078331.5A patent/CN106463057B/zh active Active
- 2014-04-30 CN CN201811351308.5A patent/CN109318806A/zh active Pending
- 2014-04-30 CN CN201811351454.8A patent/CN109229022B/zh active Active
- 2014-04-30 CN CN201811351350.7A patent/CN109243204A/zh active Pending
- 2014-04-30 JP JP2016515802A patent/JP6230699B2/ja active Active
-
2018
- 2018-08-07 US US16/056,782 patent/US10621870B2/en active Active
- 2018-08-07 US US16/056,687 patent/US10878700B2/en active Active
- 2018-08-07 US US16/056,654 patent/US10867516B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009048564A (ja) * | 2007-08-22 | 2009-03-05 | Toyota Motor Corp | 車両位置予測装置 |
WO2011148455A1 (ja) * | 2010-05-25 | 2011-12-01 | 富士通株式会社 | 映像処理装置、映像処理方法及び映像処理プログラム |
JP2013206183A (ja) * | 2012-03-28 | 2013-10-07 | Fujitsu Ltd | 事故予防装置、事故予防方法およびプログラム |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021181771A1 (ja) * | 2020-03-12 | 2021-09-16 | パナソニックIpマネジメント株式会社 | 推定装置および推定方法 |
JP2021144505A (ja) * | 2020-03-12 | 2021-09-24 | パナソニックIpマネジメント株式会社 | 推定装置および推定方法 |
Also Published As
Publication number | Publication date |
---|---|
CN106463057A (zh) | 2017-02-22 |
US10380895B2 (en) | 2019-08-13 |
US20160328973A1 (en) | 2016-11-10 |
US20180342162A1 (en) | 2018-11-29 |
US20180342161A1 (en) | 2018-11-29 |
US10878700B2 (en) | 2020-12-29 |
US10621870B2 (en) | 2020-04-14 |
JPWO2015166558A1 (ja) | 2017-04-20 |
US20180374362A1 (en) | 2018-12-27 |
CN109229022B (zh) | 2022-05-03 |
JP6230699B2 (ja) | 2017-11-15 |
CN106463057B (zh) | 2020-03-17 |
DE112014006626T5 (de) | 2017-02-09 |
CN109243204A (zh) | 2019-01-18 |
US10867516B2 (en) | 2020-12-15 |
CN109229022A (zh) | 2019-01-18 |
CN109318806A (zh) | 2019-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4763537B2 (ja) | 運転支援情報報知装置 | |
JP4645891B2 (ja) | 車両用運転支援装置及び車両用運転支援方法 | |
US10210405B2 (en) | Sign information display system and method | |
JP6028766B2 (ja) | 運転支援表示装置 | |
WO2011058822A1 (ja) | 車両周囲表示装置、車両周囲表示方法 | |
JP6265965B2 (ja) | 車両用標識表示装置及び方法 | |
JP2016095739A (ja) | 標識情報表示システム及び標識情報表示方法 | |
EP2927642A1 (en) | System and method for distribution of 3d sound in a vehicle | |
WO2018198926A1 (ja) | 電子機器、路側機、電子機器の動作方法および交通システム | |
JP2015169472A (ja) | 車載機器制御装置、システムおよび方法 | |
US10621870B2 (en) | Surrounding area monitoring system | |
JP2015011457A (ja) | 車両用情報提供装置 | |
KR20150133534A (ko) | 차량간 의사 소통 서비스 제공 방법 및 장치 | |
US20210279477A1 (en) | Image processing apparatus, image processing method, and image processing system | |
JP2008257582A (ja) | 運転支援装置 | |
CN112590793B (zh) | 汽车的变道控制方法、装置及计算机存储介质 | |
JP2006113781A (ja) | 危険警報システム | |
JP2016095789A (ja) | 表示装置 | |
KR20230136021A (ko) | 차량의 운전 지원 시스템 | |
KR101874123B1 (ko) | 차량 및 이와 통신하는 휴대 단말기 | |
JP2019105941A (ja) | 走行支援方法及び走行支援装置 | |
JP6237571B2 (ja) | 運転支援装置 | |
JP2018024393A (ja) | 運転支援方法及び運転支援装置 | |
US9858812B2 (en) | Traffic signal state detection apparatus | |
JP2013149105A (ja) | 周辺物体表示装置および周辺物体表示方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14891003 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016515802 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15110304 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112014006626 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14891003 Country of ref document: EP Kind code of ref document: A1 |