WO2016016922A1 - Driving support system and driving support method - Google Patents
Driving support system and driving support method
- Publication number
- WO2016016922A1 (PCT/JP2014/069788; JP2014069788W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- vehicle
- information
- detection
- blind spot
- travel
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3697—Output of additional, non-guidance related information, e.g. low fuel level
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/161—Decentralised systems, e.g. inter-vehicle communication
- G08G1/163—Decentralised systems, e.g. inter-vehicle communication involving continuous checking
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/167—Driving aids for lane monitoring, lane changing, e.g. blind spot detection
Definitions
- The present invention relates to a driving support system and a driving support method that support driving of a vehicle using a notification unit.
- In a related technique, the course of the host vehicle is estimated based on the position of the host vehicle, its speed vector, the operation of the turn signal, and the like, and the collision risk between the host vehicle and a blind spot detection object is determined based on the course of the host vehicle and information on the blind spot detection object.
- When it is determined that the collision risk is high, the driver of the host vehicle is alerted or notified of a collision avoidance operation of the host vehicle.
- An object of the present invention is to provide a technique capable of increasing the possibility of notifying the user of only significant blind spot detection objects.
- A travel support system according to the present invention is a travel support system that supports travel of a vehicle using a notification unit, and includes: an information acquisition unit that acquires route information of a first vehicle, current position information of the first vehicle, and detection object information about detection objects around each of the first vehicle and a second vehicle detected by that vehicle; and a control unit that, based on the route information of the first vehicle, the current position information of the first vehicle, and the detection object information of the first and second vehicles acquired by the information acquisition unit, causes the notification unit to notify, in the first vehicle, as a travel-affecting object, a detection object that is detected by the second vehicle without being detected by the first vehicle and that is determined to affect the traveling of the first vehicle at a blind spot where the first vehicle travels.
- A travel support method according to the present invention is a travel support method for supporting travel of a vehicle using a notification unit, and includes: acquiring route information of a first vehicle, current position information of the first vehicle, and detection object information about detection objects around each of the first vehicle and a second vehicle detected by that vehicle; and, based on the acquired route information of the first vehicle, current position information of the first vehicle, and detection object information of the first and second vehicles, causing the notification unit to notify, in the first vehicle, as a travel-affecting object, a detection object that is detected by the second vehicle without being detected by the first vehicle and that is determined to affect the traveling of the first vehicle at a blind spot where the first vehicle travels.
- According to the present invention, a detection object that affects the traveling of the first vehicle at a blind spot is notified by the first vehicle, so the possibility of notifying the user of only significant blind spot detection objects can be increased.
- FIG. 1 is a block diagram showing the main configuration of a navigation device according to Embodiment 1.
- FIG. 2 is a block diagram showing the configuration of the navigation device according to Embodiment 1.
- FIG. 3 is a flowchart showing the operation of the navigation device according to Embodiment 1.
- FIG. 4 is a diagram for explaining operation example 1 of the navigation device according to Embodiment 1.
- FIG. 5 is a diagram for explaining operation example 2 of the navigation device according to Embodiment 1.
- FIG. 6 is a diagram for explaining operation example 3 of the navigation device according to Embodiment 1.
- FIG. 7 is a flowchart showing the operation of the navigation device according to Embodiment 2.
- FIG. 8 is a diagram for explaining an operation example of the navigation device according to Embodiment 2.
- FIG. 9 is a flowchart showing the operation of the navigation device according to Embodiment 3.
- FIG. 10 is a diagram for explaining an operation example of the navigation device according to Embodiment 3.
- FIG. 11 is a flowchart showing the operation of the navigation device according to Embodiment 4.
- FIG. 12 is a diagram for explaining an operation example of the navigation device according to Embodiment 4.
- FIG. 13 is a block diagram showing the main configuration of a server according to another modification.
- FIG. 14 is a block diagram showing the main configuration of a communication terminal according to another modification.
- FIG. 1 is a block diagram showing a main configuration of navigation device 1 according to the first embodiment.
- the navigation apparatus 1 of FIG. 1 is assumed to be mounted on a plurality of vehicles, and description will be given focusing on one of these vehicles.
- In the following, the one vehicle of interest is referred to as the "own vehicle" (first vehicle), and each vehicle other than the own vehicle is referred to as an "other vehicle" (second vehicle).
- The notification unit 4 can be used to support the traveling of the host vehicle.
- The notification unit 4 includes, for example, at least one of a display device and a speaker, and notifies a user such as the driver of various information that supports the traveling of the vehicle, using at least one of display and voice output.
- In the first embodiment, the notification unit 4 is provided in the navigation device 1; however, the notification unit 4 is not limited to this and may instead be provided in another device outside the navigation device 1 (for example, a communication terminal described later).
- The information acquisition unit 2 acquires route information indicating the route on which the host vehicle is planned to travel (the route of the host vehicle), current position information of the host vehicle, detection object information related to detection objects around the host vehicle detected by the host vehicle, and detection object information related to detection objects around the other vehicles detected by the other vehicles.
- In the following description, the detection object information related to detection objects around the host vehicle is referred to as "own vehicle detection object information", and the detection object information related to detection objects around the other vehicles is referred to as "other vehicle detection object information".
- For example, a route guided by the navigation device 1 is applied as the route of the host vehicle indicated by the route information.
- The periphery of the own vehicle is, for example, a circular range centered on the own vehicle with a radius equal to the maximum distance detectable by its detection function, or a fan-shaped part of such a range.
- The same applies to the periphery of the other vehicles.
- Detection objects include, for example, moving objects such as automobiles, bicycles, and pedestrians, and non-moving objects such as construction signs.
- The own vehicle detection object information includes, for example, current position information of the detection objects around the own vehicle, and the other vehicle detection object information includes, for example, current position information of the detection objects around the other vehicles.
- The own vehicle detection object information may further include speed information (for example, a speed vector) of the detection objects around the own vehicle, and the other vehicle detection object information may likewise further include speed information of the detection objects around the other vehicles.
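- For illustration only, the detection object information described above could be represented by a simple data structure like the following sketch; the class and field names are hypothetical and not part of the embodiment.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class DetectionObject:
    """One detection object contained in own/other vehicle detection object information."""
    object_id: str                                   # e.g. "other_vehicle_121", "pedestrian_124"
    position: Tuple[float, float]                    # current position (latitude, longitude)
    velocity: Optional[Tuple[float, float]] = None   # optional speed vector (e.g. m/s east, m/s north)

@dataclass
class DetectionObjectInfo:
    """Detection object information detected by one vehicle (own vehicle or other vehicle)."""
    sender_id: str                                   # identifier of the detecting vehicle
    objects: List[DetectionObject]
```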
- The control unit 3 is realized as a function of a CPU (Central Processing Unit), not shown, of the navigation device 1 executing a program stored in a storage device, not shown, such as a semiconductor memory of the navigation device 1.
- Based on the route information of the own vehicle, the current position information of the own vehicle, the own vehicle detection object information, and the other vehicle detection object information acquired by the information acquisition unit 2, the control unit 3 causes the notification unit 4 to notify, in the own vehicle, as a travel-affecting object, a detection object that is detected by the other vehicles without being detected by the own vehicle and that is determined to affect the traveling of the own vehicle at the blind spot where the own vehicle travels.
- Note that the own vehicle itself is excluded from the detection objects that are detected by the other vehicles without being detected by the own vehicle.
- Such a detection object corresponds to a blind spot detection object that may be located in a blind spot as viewed from the host vehicle. Therefore, in the following description, a detection object detected by another vehicle without being detected by the own vehicle is treated as a blind spot detection object.
- A spot where a blind spot detection object is assumed to exist (a blind spot) may be set in advance by a user or a manufacturer.
- As such a spot, for example, at least one of an intersection, a parking lot entrance/exit, a destination entrance/exit, a road junction, a T-junction, a road curve, a point where vehicles frequently decelerate, and the like, or a combination of such a point and its vicinity, is assumed.
- As the determination of whether a blind spot detection object affects the traveling of the own vehicle, for example, it is determined whether the route of the own vehicle indicated in the route information passes through the position of the detection object at the blind spot, whether the route of the own vehicle intersects the path along which the detection object moves at the blind spot, whether the route of the own vehicle overlaps the path along which the detection object moves at the blind spot, or the like.
- When the route of the own vehicle indicated in the route information passes through the position of the detection object at the blind spot, or intersects or overlaps the path along which the detection object moves at the blind spot, that detection object is determined to affect the traveling of the own vehicle.
- The path along which a detection object moves is calculated based on, for example, the current position and speed information of the detection object included in the detection object information.
- The determination described above may be performed by the navigation device 1, or the route information of the own vehicle, the current position information of the own vehicle, and the detection object information of the own vehicle and the other vehicles may be transmitted to a device outside the navigation device 1 and the determination may be performed in that device.
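- As a minimal sketch of the determination described above (assuming positions have already been projected into a local planar frame in meters, that routes and object paths are polylines of such points, and that all function names are hypothetical), the crossing check could look like this; collinear overlap is ignored for brevity.

```python
from typing import List, Tuple

Point = Tuple[float, float]  # (x, y) in meters in a local planar frame

def predicted_path(position: Point, velocity: Tuple[float, float],
                   horizon_s: float = 10.0, step_s: float = 1.0) -> List[Point]:
    """Linear extrapolation of a detection object's path from its current position and speed vector."""
    steps = int(horizon_s / step_s) + 1
    return [(position[0] + velocity[0] * k * step_s,
             position[1] + velocity[1] * k * step_s) for k in range(steps)]

def _segments_cross(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """Strict 2D segment crossing test using orientation signs (collinear overlap not handled)."""
    def cross(o: Point, a: Point, b: Point) -> float:
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return d1 * d2 < 0 and d3 * d4 < 0

def affects_travel(own_route: List[Point], object_path: List[Point]) -> bool:
    """True if the own vehicle's route crosses the path of a blind spot detection object."""
    for i in range(len(own_route) - 1):
        for j in range(len(object_path) - 1):
            if _segments_cross(own_route[i], own_route[i + 1],
                               object_path[j], object_path[j + 1]):
                return True
    return False
```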
- According to the above configuration, a blind spot detection object that affects the traveling of the own vehicle at a spot requiring special attention, such as a blind spot, is notified by the own vehicle. Therefore, the possibility of notifying the user of only significant blind spot detection objects can be increased.
- FIG. 2 is a block diagram showing a main configuration and an additional configuration of the navigation device 1 according to the first embodiment.
- the navigation device 1 includes an input unit 5, a map data storage unit 6, and a route guide unit 7 in addition to the information acquisition unit 2, the control unit 3, and the notification unit 4 described above. These components of the navigation device 1 are controlled in an integrated manner by the control unit 3.
- The information acquisition unit 2 includes a current position detection unit 21 that detects (acquires) the current position of the host vehicle, a route calculation unit 23 that calculates (acquires) a route, a wireless communication unit 25 that receives (acquires) other vehicle detection object information by performing wireless communication with other vehicles, a surrounding information detection unit 27 that detects (acquires) own vehicle detection object information, and an external sensor 28. Details of each component of the information acquisition unit 2 will be described later as appropriate.
- the notification unit 4 includes a display unit 4a, an image control unit 4b, an audio output unit 4c, and an audio control unit 4d.
- The image control unit 4b causes the display unit 4a, which includes a display, to display images such as a map image and a guide image, based on image data including the map image and the guide image output from the control unit 3.
- The voice control unit 4d causes the voice output unit 4c, which includes a speaker or the like, to output voices such as guidance voice and warning sounds, based on voice data including the guidance voice and warning sounds output from the control unit 3.
- The navigation device 1 can control the notification (display and voice output) of the notification unit 4 configured as described above, and can thereby support the traveling of the vehicle using the notification.
- The input unit 5 includes, for example, a push-button device, a touch panel, or the like, and receives from the user destination information that specifies a destination at which the host vehicle should arrive.
- When the input unit 5 is configured with a touch panel, it may be configured integrally with the display unit 4a.
- For example, when the input unit 5 receives from the user a point on the map scroll-displayed on the display unit 4a according to the user's operation, the input unit 5 accepts the point as destination information; when it receives an address or a telephone number from the user, it accepts the address or telephone number as destination information.
- the map data storage unit 6 is composed of a storage device such as a hard disk drive (HDD) or RAM (Random Access Memory), and stores (accumulates) map data.
- the map data storage unit 6 may be configured to acquire and store map data from outside the navigation device 1.
- For example, the map data storage unit 6 may store map data downloaded from an external device via a network, or may store map data read from a storage medium such as a DVD (Digital Versatile Disk)-ROM (Read Only Memory) or a Blu-ray disc.
- The current position detection unit 21 is connected to a GPS (Global Positioning System) reception unit 21a, a direction detection unit 21b, and a pulse detection unit 21c.
- the GPS receiving unit 21a receives a GPS signal from a GPS satellite, and detects the current position (for example, latitude / longitude coordinates) of the vehicle based on the GPS signal.
- The direction detection unit 21b includes, for example, a gyro sensor and a direction sensor, and detects the traveling direction (for example, the azimuth) of the host vehicle.
- the pulse detection unit 21c detects a pulse signal corresponding to the number of revolutions per unit time of the axle of the own vehicle, and detects the traveling speed and the traveling distance of the own vehicle based on the pulse signal.
- The current position detection unit 21 configured as described above corrects the current position received by the GPS reception unit 21a using the vehicle movement data (the traveling direction and traveling speed of the host vehicle) detected by the direction detection unit 21b and the pulse detection unit 21c, and can thereby detect an accurate current position of the host vehicle.
- the route calculation unit 23 includes a departure place such as the current position of the vehicle detected by the current position detection unit 21, a destination received from the user by the input unit 5, and a map stored in the map data storage unit 6. Based on the data, a route from the starting point to the destination on the map is calculated.
- The calculated routes include, for example, a route with a short travel time (time priority route), a route with a short travel distance (distance priority route), a route with low fuel consumption (fuel priority route), a route that travels on toll roads as much as possible (toll priority route), a route that travels on general roads as much as possible (general road priority route), and a route with a good balance of time, distance, and cost (standard route).
- The route guide unit 7 stores the route selected by the user via the input unit 5 from among the routes calculated by the route calculation unit 23 (hereinafter referred to as the "planned travel route").
- the route guidance unit 7 guides the user from the current position to the destination along the planned travel route by controlling the notification of the notification unit 4 based on the current position of the host vehicle on the planned travel route.
- the above-described route information indicates the planned travel route (the route calculated by the route calculation unit 23).
- The wireless communication unit 25 includes a reception antenna 25a, a reception unit 25b that receives various types of information transmitted from other vehicles via the reception antenna 25a, a transmission antenna 25c, and a transmission unit 25d that transmits various types of information to other vehicles via the transmission antenna 25c.
- The wireless communication unit 25 is configured to perform inter-vehicle communication, directly transmitting and receiving various types of information to and from other vehicles located within communication range (other vehicles in the vicinity of the host vehicle).
- Since the wireless communication unit 25 is configured to perform inter-vehicle communication, no new communication infrastructure is required.
- However, the wireless communication unit 25 may instead be configured to perform communication via a mobile communication network or road-to-vehicle communication.
- the wireless communication unit 25 configured as described above receives the other vehicle detection target information from the other vehicle and transmits the own vehicle detection target information to the other vehicle by inter-vehicle communication.
- The surrounding information detection unit 27 is connected to the external sensor 28, and extracts (detects) detection object information from information around the host vehicle detected by the external sensor 28.
- The external sensor 28 includes, for example, a camera 28a capable of imaging in the visible light region and the infrared region and an image processing unit 28b, as well as a radar 28c using laser light, millimeter waves, or the like and a radar control unit 28d.
- the camera 28a is disposed, for example, in the vicinity of the room mirror on the vehicle interior side of the front window of the host vehicle, and images the outside of a predetermined detection range in front of the host vehicle through the front window.
- a CCD (Charge-Coupled Device) camera, a CMOS (Complementary Metal-Oxide Semiconductor) camera, or the like is applied to the camera 28a.
- The image processing unit 28b performs predetermined image processing such as filtering and binarization on the image captured by the camera 28a, generates image data composed of a two-dimensional array of pixels, and outputs the image data to the surrounding information detection unit 27.
- the radar 28c is disposed, for example, near the nose of the body of the host vehicle or in the vicinity of the front window in the passenger compartment.
- Under the control of the radar control unit 28d, the radar 28c transmits a transmission signal such as laser light or a millimeter wave in an appropriate detection direction (for example, toward the front of the host vehicle in the traveling direction). Further, the radar 28c receives a reflection signal generated when the transmission signal is reflected by an object outside the host vehicle, generates a beat signal by mixing the reflection signal and the transmission signal, and outputs the beat signal to the surrounding information detection unit 27.
- the radar control unit 28d controls the radar 28c in accordance with a control command input from the surrounding information detection unit 27 to the radar control unit 28d.
- The surrounding information detection unit 27 determines whether the image data includes an image of a detection object by determining whether the image data includes an image of a predetermined moving object or non-moving object.
- When it is determined that the image data includes an image of a detection object, the surrounding information detection unit 27 calculates a first distance between a reference position of the image data (for example, the horizontal center position of the whole image) and the detection object, and calculates a second distance between the detection object and the host vehicle based on the beat signal generated by the radar 28c.
- The surrounding information detection unit 27 calculates the relative position (for example, latitude/longitude coordinates) of the detection object with respect to the position of the host vehicle in the horizontal direction based on the first distance and the second distance. Then, the surrounding information detection unit 27 calculates (detects) the current position of the detection object based on the calculated relative position and the current position of the host vehicle detected by the current position detection unit 21.
- Further, the surrounding information detection unit 27 calculates (detects) speed information such as a speed vector of the detection object by obtaining the temporal change of the current position of the detection object.
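- The following is a simplified, illustrative sketch of how such a detection unit could combine a camera-derived lateral offset (first distance, assumed already converted from pixels to meters) with a radar-derived range (second distance) and track a speed vector; the geometry and names are assumptions, not the formulas of the embodiment.

```python
import math
from typing import Optional, Tuple

def relative_position(first_distance_m: float, second_distance_m: float,
                      target_left_of_center: bool) -> Tuple[float, float]:
    """Relative (lateral, longitudinal) position of the detection object in the vehicle frame.
    first_distance_m: lateral offset estimated from the camera image (meters),
    second_distance_m: range measured by the radar (meters, assumed >= lateral offset)."""
    lateral = first_distance_m if target_left_of_center else -first_distance_m
    longitudinal = math.sqrt(max(second_distance_m ** 2 - first_distance_m ** 2, 0.0))
    return lateral, longitudinal

def speed_vector(prev_pos: Optional[Tuple[float, float]],
                 curr_pos: Tuple[float, float], dt_s: float) -> Tuple[float, float]:
    """Speed vector obtained from the temporal change of the detection object's current position."""
    if prev_pos is None or dt_s <= 0.0:
        return 0.0, 0.0
    return ((curr_pos[0] - prev_pos[0]) / dt_s,
            (curr_pos[1] - prev_pos[1]) / dt_s)
```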
- FIG. 3 is a flowchart showing the operation of the navigation device 1 of the host vehicle according to the first embodiment. This operation is realized by the CPU constituting the control unit 3 of the navigation device 1 executing a program stored in the storage device of the navigation device 1. Next, the operation of the navigation device 1 of the host vehicle will be described with reference to the flowchart of FIG. 3.
- In step S1, the control unit 3 causes the surrounding information detection unit 27 to detect the own vehicle detection object information.
- In step S2, the control unit 3 determines whether the wireless communication unit 25 (reception unit 25b) has received other vehicle detection object information from another vehicle within the communication range. If it is determined that it has been received, the process proceeds to step S3; if it is determined that it has not been received, the process proceeds to step S8.
- In step S3, the control unit 3 compares the own vehicle detection object information with the other vehicle detection object information, and starts a process of detecting (extracting), as blind spot information, information on detection objects (blind spot detection objects) that are included in the other vehicle detection object information but not included in the own vehicle detection object information.
- In step S4, the control unit 3 determines whether blind spot information has been detected in step S3. If it is determined that blind spot information has been detected, the process proceeds to step S5; if it is determined that blind spot information has not been detected, the process proceeds to step S8.
- In step S5, the control unit 3 determines whether the route calculation unit 23 has already calculated a route. For example, the control unit 3 determines that a route has been calculated when route guidance along the planned travel route is being performed by the route guide unit 7, and determines that a route has not been calculated when the route guidance is not being performed. If it is determined in step S5 that a route has been calculated, the process proceeds to step S6; if it is determined that a route has not been calculated, the process proceeds to step S8.
- In step S6, the control unit 3 determines whether there is a blind spot detection object that affects the traveling of the own vehicle at the blind spot where the own vehicle travels, based on the planned travel route (the route calculated by the route calculation unit 23), the blind spot information determined to have been detected in step S4, and the map data stored in the map data storage unit 6. If it is determined that such a blind spot detection object exists, the process proceeds to step S7; if it is determined that it does not exist, the process proceeds to step S8.
- In step S7, the control unit 3 causes the notification unit 4 to notify, as a travel-affecting object, the blind spot detection object determined in step S6 to affect the traveling of the host vehicle.
- For example, the control unit 3 causes the notification unit 4 to highlight the current position and moving direction (speed vector direction) of the travel-affecting object on the display, or causes the notification unit 4 to output a corresponding voice.
- As a result, the position and moving direction of the travel-affecting object at the blind spot are notified (displayed and output by voice) by the notification unit 4 of the host vehicle.
- After that, the process proceeds to step S8.
- In step S8, the control unit 3 determines whether the input unit 5 has accepted an operation to stop the notification of travel-affecting objects by the notification unit 4. If it is determined that the operation has been accepted by the input unit 5, the notification of travel-affecting objects by the notification unit 4 is stopped and the operation shown in FIG. 3 ends. On the other hand, if it is determined that the operation has not been accepted, the process returns to step S1.
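- The flow of steps S1 to S8 can be summarized by the following sketch. The `nav` object and its methods are hypothetical stand-ins for the units of FIG. 2, and matching detections from different vehicles by an identifier is a simplification (in practice they would be associated by position).

```python
def driving_support_cycle(nav) -> bool:
    """One pass of the loop in FIG. 3; returns False when the user stops the notification."""
    own_objects = nav.detect_own_objects()               # S1: surrounding information detection unit 27
    other_objects = nav.receive_other_vehicle_objects()  # S2: wireless communication unit 25
    if other_objects:
        own_ids = {o.object_id for o in own_objects}
        # S3/S4: blind spot information = objects seen by other vehicles but not by the own vehicle
        blind_spot_objects = [o for o in other_objects if o.object_id not in own_ids]
        if blind_spot_objects and nav.route_is_calculated():            # S5
            affecting = [o for o in blind_spot_objects
                         if nav.affects_travel_at_blind_spot(o)]        # S6
            if affecting:
                nav.notify(affecting)                                   # S7: notification unit 4
    return not nav.stop_requested()                                     # S8
```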
- FIG. 4 is a diagram for explaining an example of the operation shown in the flowchart of FIG.
- FIG. 4 shows the host vehicle 101, the planned travel route 101a of the host vehicle 101, and the detection range 101b of the external sensor 28 of the host vehicle 101.
- FIG. 4 also shows other vehicles 121, 122, 123, 221, 222, 223, their moving directions 121a, 122a, 123a, 221a, 222a, 223a, and the detection ranges 121b, 122b, 221b, 222b of the external sensors 28 of the other vehicles 121, 122, 221, 222.
- FIG. 4 shows a pedestrian 124, a pedestrian movement direction 124a, intersections 100 and 200, and a map around the own vehicle 101.
- a planned travel route 101a of the host vehicle 101 is indicated by a thick arrow in FIG. 4, and is a route that turns left at the intersection 100.
- the blind spot where the host vehicle 101 travels is the intersection 100 and its sidewalk.
- Step S1 The navigation device 1 of the own vehicle 101 detects the detection target object within the detection range 101b of the external sensor 28 and detects the own vehicle detection target information.
- The detection objects of the own vehicle 101 are as follows:
- Detection objects of own vehicle 101: other vehicles 122, 123
- Step S2 The navigation device 1 of the host vehicle 101 receives other vehicle detection object information from the other vehicles 121, 122, 221, and 222.
- The detection objects of the other vehicles 121, 122, 221, and 222 indicated in the other vehicle detection object information are as follows:
- Detection objects of other vehicle 122: other vehicles 121, 123
- Detection objects of other vehicle 221: other vehicle 222
- Detection objects of other vehicle 222: other vehicles 221, 223
- Step S3 The navigation device 1 of the host vehicle 101 compares the detection target of the host vehicle 101 detected in step S1 with the detection target of the other vehicles 121, 122, 221, 222 detected in step S2.
- The navigation device 1 detects, as blind spot information in the own vehicle 101, the information on the detection objects (other vehicles 121, 221, 222, 223 and the pedestrian 124) that are included in the other vehicle detection object information but not included in the own vehicle detection object information.
- Step S4 In the example of FIG. 4, since blind spot information is detected in step S3, the process proceeds to step S5.
- Step S5 The navigation device 1 of the host vehicle 101 determines whether or not the user of the host vehicle 101 is providing guidance to the destination along the planned travel route.
- In the example of FIG. 4, the planned travel route 101a that turns left at the intersection 100 is being guided, so the process proceeds to step S6.
- Step S6 The navigation device 1 of the host vehicle 101 detects the detection target (other vehicles 121, 221, 222, 223, pedestrian 124) detected in step S3 at the blind spot (the intersection 100 and its sidewalk) where the host vehicle 101 travels. Among these, it is determined whether or not there is a detection object that affects the traveling of the vehicle 101.
- In the example of FIG. 4, since the planned travel route 101a of the own vehicle 101 overlaps at the intersection 100 with the path along which the other vehicle 121 moves (the path calculated based on the current position of the other vehicle 121 and its moving direction 121a), it is determined that the other vehicle 121 affects the traveling of the own vehicle 101. In addition, since the planned travel route 101a of the host vehicle 101 intersects the path along which the pedestrian 124 moves on the sidewalk of the intersection 100, it is determined that the pedestrian 124 affects the traveling of the host vehicle 101.
- On the other hand, it is determined that the other vehicles 221, 222, and 223 do not affect the traveling of the own vehicle 101.
- Step S7 The navigation device 1 of the host vehicle 101 notifies, from the notification unit 4 as travel-affecting objects, the other vehicle 121 and the pedestrian 124 that were determined in step S6 to affect the traveling of the host vehicle 101 among the detection objects detected in step S3 (other vehicles 121, 221, 222, 223, and the pedestrian 124), thereby alerting the user to their presence.
- Step S8 Unless the user performs a notification stop operation on the navigation device 1 of the host vehicle 101 according to the necessity of alerting, the process returns to step S1 and repeats the above operation.
- FIG. 5 is a diagram for explaining another example of the operation shown in the flowchart of FIG.
- The positional relationship in FIG. 5 is the same as the positional relationship in FIG. 4.
- In FIG. 5, the planned travel route 101a of the host vehicle 101 is a route that turns right at the intersection 100, and the blind spot where the host vehicle 101 travels is the intersection 100 and its sidewalk.
- From step S1 to step S5, the same operations as in operation example 1 are performed.
- In step S6, since the planned travel route 101a of the own vehicle 101 intersects the paths along which the other vehicles 121 and 223 move at the intersection 100, it is determined that the other vehicles 121 and 223 affect the traveling of the own vehicle 101. On the other hand, it is determined that the other detection objects detected in step S3 do not affect the traveling of the own vehicle 101.
- In step S7, the navigation device 1 of the own vehicle 101 notifies the other vehicles 121 and 223 from the notification unit 4 as travel-affecting objects.
- FIG. 6 is a diagram for explaining another example of the operation shown in the flowchart of FIG.
- the positional relationship in FIG. 6 is the same as the positional relationship in FIG.
- In FIG. 6, the planned travel route 101a of the host vehicle 101 is a route that goes straight through the intersection 100, and the blind spots where the host vehicle 101 travels are the intersections 100 and 200 and their sidewalks.
- From step S1 to step S5, the same operations as in operation example 1 are performed.
- In step S6, since the planned travel route 101a of the host vehicle 101 intersects the path along which the other vehicle 121 moves at the intersection 100, and intersects the paths along which the other vehicles 221 and 222 move at the intersection 200, it is determined that the other vehicles 121, 221, and 222 affect the traveling of the host vehicle 101. On the other hand, it is determined that the other detection objects detected in step S3 do not affect the traveling of the own vehicle 101.
- In step S7, the navigation device 1 of the own vehicle 101 notifies the other vehicles 121, 221, and 222 from the notification unit 4 as travel-affecting objects.
- As described above, in the navigation device 1 according to the first embodiment, a blind spot detection object that affects the traveling of the host vehicle 101 at the blind spot where the host vehicle 101 travels is notified from the notification unit 4 as a travel-affecting object. Therefore, the possibility of notifying only blind spot detection objects that are significant for the traveling of the host vehicle 101 can be increased, without alerting the user to blind spot detection objects that are unlikely to affect the traveling of the host vehicle 101. Accordingly, the burden on the user of the host vehicle 101 can be reduced.
- Since the above operation can be performed without using the planned travel routes of other vehicles, the above effect can be obtained even when the other vehicles do not calculate or transmit their planned travel routes. Moreover, detection objects that may become obstacles, such as vehicles, pedestrians, and bicycles not equipped with the navigation device 1, can be notified as travel-affecting objects. Furthermore, the above effect can be obtained when a directional antenna or the like is used as the reception antenna 25a or the transmission antenna 25c, and also when a general-purpose antenna is used.
- In the first embodiment, the navigation device 1 of the own vehicle 101 does not transmit information on the planned travel route to the navigation devices 1 of other vehicles, and does not receive information on planned travel routes from the navigation devices 1 of other vehicles.
- However, the present invention is not limited to this.
- For example, assume that the navigation device 1 of the other vehicle 121 is guiding the driver of the other vehicle 121 to a destination and guides the other vehicle 121 to go straight at the intersection 100.
- In this case, the navigation device 1 of the other vehicle 121 may transmit not only the other vehicle detection object information detected by the external sensor 28 of the other vehicle 121 but also traveling schedule information including a traveling schedule that indicates the moving direction at the intersection 100 that the other vehicle 121 will travel most recently (straight ahead in FIG. 4).
- When such traveling schedule information is received, an improvement in the accuracy of the determination in step S6 can be expected.
- Note that the above traveling schedule need not indicate the moving direction at the most recently traveled intersection 100; it may indicate the direction of the next right or left turn, or the point of the next right or left turn (a point that is not traveled most recently).
- In the operation example 3 described above (FIG. 6), the other vehicles 121, 221, and 222 are notified from the notification unit 4 as travel-affecting objects.
- However, the present invention is not limited to this; as in the second embodiment described later, the other vehicles 221 and 222, which are far from the intersection 100 where the host vehicle 101 will travel most recently, may be configured not to be notified from the notification unit 4 as travel-affecting objects.
- FIG. 6 shows a display example in which figures indicating the positions of the other vehicles 121, 221, 222 (here, substantially square figures indicating the shape of a car) and arrows indicating the moving directions 121a, 221a, 222a are superimposed on the display.
- However, the display form is not limited to this. The same applies to FIGS. 4 and 5.
- In the above description, the surrounding information detection unit 27 obtains two unknown values corresponding to the relative position of the detection object with respect to the position of the host vehicle 101 in the horizontal direction, that is, two-dimensional coordinates, from the two detection values of the first distance and the second distance.
- However, the present invention is not limited to this; the above relative position may be obtained from any two detection values related to the positions of the host vehicle 101 and the detection object.
- For example, the surrounding information detection unit 27 may obtain the above relative position from the detection value of the detection direction of the radar 28c and the second distance.
- The current position detection unit 21, the route calculation unit 23, the surrounding information detection unit 27, and the route guide unit 7 may each be realized by a dedicated CPU executing a program, or may be realized by a single CPU executing programs.
- In the second embodiment, the control unit 3 does not cause the notification unit 4 to notify the travel-affecting object when, based on the current position information of the own vehicle 101, the own vehicle 101 is farther than a predetermined first distance from the blind spot where the own vehicle 101 will travel most recently. Further, the control unit 3 does not cause the notification unit 4 to notify the travel-affecting object when, based on the current position information of the detection object included in the detection object information, the travel-affecting object is farther than a predetermined second distance from the blind spot where the own vehicle 101 will travel most recently.
- Other configurations and operations are the same as those in the first embodiment, and thus description thereof is omitted here.
- FIG. 7 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the second embodiment
- FIG. 8 is a diagram for explaining an example of the operation.
- The flowchart shown in FIG. 7 is obtained by adding steps S11 and S12 to the flowchart shown in FIG. 3; in the following, steps S11 and S12 are mainly described.
- In step S11, the control unit 3 calculates the distance between the own vehicle 101 and the blind spot where the own vehicle 101 will travel most recently, based on the route information (planned travel route), the current position detected by the current position detection unit 21, and the map data stored in the map data storage unit 6.
- Then, the control unit 3 determines whether the calculated distance is greater than the predetermined first distance. For the first distance, for example, any value from 30 m to 110 m is applied.
- If the calculated distance is equal to or less than the first distance, the control unit 3 determines that the host vehicle 101 is not farther than the first distance from the nearest blind spot, and proceeds to step S12.
- If the calculated distance exceeds the first distance, the control unit 3 determines that the host vehicle 101 is farther than the first distance from the nearest blind spot, and proceeds to step S8.
- FIG. 8 shows the same positional relationship as FIG. 4 of the first embodiment, and part of the boundary line of the circular range 81 centered on the intersection 100 with the first distance R81 as its radius is shown by a one-dot chain line.
- In FIG. 8, the intersection 100 is the blind spot where the host vehicle 101 will travel most recently, and the intersection 200 is not. For this reason, when the own vehicle 101 is located within the range 81 related to the intersection 100, the control unit 3 determines that the own vehicle 101 is not farther than the first distance R81 from the intersection 100, which is the nearest blind spot; otherwise, it determines that the own vehicle 101 is farther than the first distance R81 from the intersection 100.
- In the example of FIG. 8, since the own vehicle 101 is located within the range 81, the process proceeds to step S12.
- In step S12, the control unit 3 calculates the distance between the blind spot detection object and the blind spot where the own vehicle 101 will travel most recently, based on the route information (planned travel route), the current position of the detection object included in the other vehicle detection object information received from the other vehicle, the map data stored in the map data storage unit 6, and the detection result of step S3. When the own vehicle 101 can receive the traveling schedule of the other vehicle, the distance may be calculated using the current position of the other vehicle included in the traveling schedule.
- the control unit 3 determines whether or not the calculated distance is longer than a predetermined second distance.
- For the second distance, for example, any value from 30 m to 110 m is applied.
- the second distance may be the same as or different from the first distance.
- the second distance may be different for each road extending radially from the blind spot.
- If the calculated distance is equal to or less than the second distance, the control unit 3 determines that the blind spot detection object is not farther than the second distance from the nearest blind spot, and proceeds to step S6. If the calculated distance exceeds the second distance, the control unit 3 determines that the blind spot detection object is farther than the second distance from the nearest blind spot, and proceeds to step S8.
- FIG. 8 shows a part of the boundary line of the circular range 82 with the center of the intersection 100 as a reference and the radius of the second distance R82 as a two-dot chain line.
- As described above, the intersection 100 is the blind spot where the host vehicle 101 will travel most recently, and the intersection 200 is not. For this reason, when a blind spot detection object is located within the range 82 related to the intersection 100, the control unit 3 determines that the blind spot detection object is not farther than the second distance R82 from the intersection 100, which is the nearest blind spot; otherwise, it determines that the blind spot detection object is farther than the second distance R82 from the intersection 100.
- In the example of FIG. 8, the blind spot detection objects are the other vehicles 121, 221, 222, 223 and the pedestrian 124, and since the other vehicle 121 is located within the range 82, the process proceeds to step S6 for the other vehicle 121.
- As a result, while the other vehicle 121 and the pedestrian 124 are notified as travel-affecting objects in the case of the positional relationship of FIG. 4 in the first embodiment, in the second embodiment, in the case of the positional relationship of FIG. 8, the pedestrian 124 is not notified as a travel-affecting object and the other vehicle 121 is notified as a travel-affecting object.
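- A sketch of the additional checks in steps S11 and S12 is shown below (distances in meters; the function and parameter names are hypothetical, and the default thresholds are simply examples within the 30 m to 110 m range mentioned above).

```python
def passes_distance_gates(dist_vehicle_to_blind_spot_m: float,
                          dist_object_to_blind_spot_m: float,
                          first_distance_m: float = 70.0,
                          second_distance_m: float = 70.0) -> bool:
    """Notify only when both the own vehicle (S11) and the blind spot detection object (S12)
    are within their respective threshold distances of the nearest blind spot."""
    if dist_vehicle_to_blind_spot_m > first_distance_m:   # S11: own vehicle too far -> no notification
        return False
    if dist_object_to_blind_spot_m > second_distance_m:   # S12: object too far -> no notification
        return False
    return True
```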
- In the first embodiment, a blind spot detection object that affects the traveling of the host vehicle 101 at a blind spot is notified as a travel-affecting object.
- This notification is performed regardless of whether the blind spot concerned is the blind spot where the host vehicle 101 will travel most recently, and regardless of the distance between the host vehicle 101 and that blind spot.
- However, when the host vehicle 101 is still far from the blind spot where it will travel most recently, even a blind spot detection object that may affect the traveling of the host vehicle 101 in the future is not considered highly likely to affect the traveling of the host vehicle 101 at present.
- Therefore, in the navigation device 1 according to the second embodiment, when the host vehicle 101 is farther than the first distance R81 from the blind spot where it will travel most recently, the blind spot detection object that is the travel-affecting object is not notified from the notification unit 4. As a result, the possibility of notifying only blind spot detection objects that are significant for the traveling of the host vehicle 101 can be further increased, and the burden on the user of the host vehicle 101 can be further reduced.
- Similarly, the notification according to the first embodiment is performed regardless of the distance between the blind spot detection object and the blind spot where the host vehicle 101 will travel most recently. However, when the current position of a blind spot detection object is far from that blind spot, the blind spot detection object is not considered highly likely to affect the traveling of the host vehicle 101 at present.
- Therefore, in the navigation device 1 according to the second embodiment, when the blind spot detection object that is the travel-affecting object is farther than the second distance R82 from the blind spot where the host vehicle 101 will travel most recently, the blind spot detection object is not notified from the notification unit 4.
- In the third embodiment, the first distance described in the second embodiment is determined based on the speed information of the host vehicle 101, and the second distance described in the second embodiment is determined based on the speed information of the travel-affecting object, which is a moving object.
- Other configurations and operations are the same as those in the second embodiment, and thus description thereof is omitted here.
- FIG. 9 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the third embodiment
- FIG. 10 is a diagram for explaining an example of the operation.
- The flowchart shown in FIG. 9 is obtained by adding step S21 between steps S5 and S11 of the flowchart shown in FIG. 7. Therefore, step S21 is mainly described below.
- In step S21, the control unit 3 changes the first distance used as the threshold in step S11 based on the traveling speed (speed information) of the host vehicle 101 detected by the pulse detection unit 21c (FIG. 2).
- Specifically, the control unit 3 decreases the first distance R81 as the traveling speed of the host vehicle 101 decreases, and increases the first distance R81 as the traveling speed of the host vehicle 101 increases.
- The control unit 3 also changes the second distance used as the threshold in step S12 based on the moving speed indicated by the speed vector included in the other vehicle detection object information (the speed information of the moving object that is the travel-affecting object).
- The second distance R82 is defined individually as second distances R821, R822, and R823 for the other vehicles 121, 122, and 123, respectively.
- Specifically, the control unit 3 decreases the second distances R821, R822, and R823 as the moving speeds of the other vehicles 121, 122, and 123 decrease, and increases the second distances R821, R822, and R823 as the moving speeds of the other vehicles 121, 122, and 123 increase.
- For the first and second distances, for example, a distance obtained by adding a margin distance (for example, 10 m) to the known stopping distance corresponding to the speed is applied.
- After step S21 is completed, the process proceeds to step S11, and the same operations as in the second embodiment are performed.
- When the moving speed of the travel-affecting object (blind spot detection object) is low, the time required for the travel-affecting object to enter the blind spot becomes long; in view of this, the timing at which notification of the travel-affecting object starts can be delayed by shortening the second distance R82.
- Conversely, when the moving speed of the travel-affecting object is high, the timing at which notification of the travel-affecting object starts is advanced by increasing the second distance R82, in consideration of the shortened entry time.
- In this way, the travel-affecting objects to be notified can be narrowed down according to their moving speeds.
- In the above description, a distance obtained by adding a margin distance to a known stopping distance is applied to the first and second distances.
- However, the present invention is not limited to this; a distance calculated from a proportional expression of the speed of the host vehicle or of the blind spot detection object may be applied to the first and second distances, or a distance associated with these speeds in a table may be applied.
- Upper limit values may be provided for the first and second distances.
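- One way to realize step S21, following the embodiment's suggestion of a stopping distance plus a margin, is sketched below; the reaction-time-plus-braking-distance formula and the default parameter values are common textbook assumptions, not values taken from the embodiment.

```python
def speed_dependent_threshold(speed_mps: float,
                              margin_m: float = 10.0,
                              reaction_time_s: float = 1.0,
                              deceleration_mps2: float = 4.0,
                              upper_limit_m: float = 110.0) -> float:
    """First/second distance for step S21: stopping distance for the given speed plus a margin,
    capped by an optional upper limit. A higher speed gives a larger threshold (earlier notification);
    a lower speed gives a smaller threshold (later notification)."""
    stopping_distance_m = speed_mps * reaction_time_s + speed_mps ** 2 / (2.0 * deceleration_mps2)
    return min(stopping_distance_m + margin_m, upper_limit_m)
```
- With the assumed parameters, a moving speed of about 50 km/h (roughly 13.9 m/s) yields a threshold of approximately 48 m, which lies within the 30 m to 110 m range mentioned in the second embodiment.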
- In the fourth embodiment, when a traffic light is installed at the blind spot, the control unit 3 is configured not to cause the notification unit 4 to notify a travel-affecting object (moving object) that does not travel through the blind spot at the same time as the own vehicle 101.
- It is assumed that the map data stored in the map data storage unit 6 includes traffic signal presence/absence information indicating whether a traffic light is installed at the blind spot. Other configurations and operations are the same as those in the first embodiment, and thus description thereof is omitted here.
- FIG. 11 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the fourth embodiment
- FIG. 12 is a diagram for explaining an example of the operation.
- The flowchart shown in FIG. 11 is obtained by adding steps S31, S32, and S33 between steps S5 and S6 of the flowchart shown in FIG. 3. Therefore, steps S31 to S33 are mainly described below.
- In the following, the blind spot is described as an intersection, but the present invention is not limited to this.
- In step S31, the control unit 3 determines whether a traffic light is installed at the intersection where the own vehicle 101 travels, based on the planned travel route (the route calculated by the route calculation unit 23), the blind spot information determined to have been detected in step S4, and the map data stored in the map data storage unit 6. If it is determined that a traffic light is installed, the process proceeds to step S32; if it is determined that no traffic light is installed, the process proceeds to step S6.
- In step S32, the control unit 3 determines the direction in which the blind spot detection object enters the intersection.
- For example, the control unit 3 determines the direction of entry into the intersection based on the speed information (speed vector) included in the other vehicle detection object information.
- In step S33, the control unit 3 determines, based on the planned travel route and the determination result in step S32, whether the blind spot detection object enters the intersection from the left-right direction with respect to the traveling direction of the host vehicle 101.
- When the blind spot detection object enters the intersection from the left-right direction, the process proceeds to step S8; when it does not, the process proceeds to step S6.
- Here, a blind spot detection object that enters an intersection with a traffic light from the left-right direction with respect to the traveling direction of the host vehicle 101 corresponds to a blind spot detection object that does not travel through the intersection at the same time as the host vehicle 101.
- FIG. 12 shows the host vehicle 101, the planned travel route 101a of the host vehicle 101, other vehicles 120, 121, 122, 130, and 131, and their moving directions 120a, 121a, 122a, 130a, and 131a.
- In FIG. 12, a traffic light 161 is installed at the intersection 100 (the blind spot where the host vehicle 101 travels), and the travel-affecting objects (blind spot detection objects) are the other vehicles 120 and 122.
- In this case, the control unit 3 does not cause the notification unit 4 to notify the other vehicle 122, which does not travel through the intersection 100 at the same time as the own vehicle 101.
- In the first embodiment, a blind spot detection object that affects the traveling of the host vehicle 101 at a blind spot is notified as a travel-affecting object, and this notification is performed regardless of whether the blind spot detection object travels through the blind spot at the same time as the host vehicle 101 when a traffic light 161 is installed there.
- However, a blind spot detection object that does not travel through the blind spot at the same time as the host vehicle 101 when the traffic light 161 is installed is unlikely to affect the traveling of the host vehicle 101.
- Therefore, in the navigation device 1 according to the fourth embodiment, a travel-affecting object (blind spot detection object) that does not travel through the blind spot where the traffic light 161 is installed at the same time as the host vehicle 101 is not notified from the notification unit 4.
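- A sketch of the filtering in steps S31 to S33 is given below; representing entry directions as compass headings in degrees and the 45-degree tolerance are assumptions introduced for illustration.

```python
def should_skip_notification(intersection_has_signal: bool,
                             own_heading_deg: float,
                             object_entry_heading_deg: float,
                             tolerance_deg: float = 45.0) -> bool:
    """S31-S33: at a signalized intersection, skip notification of a blind spot detection object
    that enters from the left-right direction relative to the own vehicle's traveling direction,
    i.e. an object that does not travel through the intersection at the same time as the own vehicle."""
    if not intersection_has_signal:                       # S31: no traffic light -> keep (go to S6)
        return False
    # S32/S33: relative angle between the object's entry direction and the own vehicle's heading
    rel = (object_entry_heading_deg - own_heading_deg) % 360.0
    rel = min(rel, 360.0 - rel)                           # fold into [0, 180] degrees
    enters_from_side = abs(rel - 90.0) <= tolerance_deg   # roughly perpendicular entry
    return enters_from_side                               # True -> skip notification (go to S8)
```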
- FIG. 13 is a block diagram showing the main configuration of the server 91 according to this modification.
- the server 91 of FIG. 13 includes a communication unit 91a and a control unit 91b corresponding to the information acquisition unit 2 and the control unit 3 described so far. Further, a navigation device 92a and a navigation device 93a are mounted on the first vehicle 92 and the second vehicle 93 of FIG. 13, respectively.
- The navigation device 92a includes a notification unit 92b corresponding to the notification unit 4 described so far.
- The communication unit 91a communicates with the navigation devices 92a and 93a to receive (acquire) the route information of the first vehicle 92, the current position information of the first vehicle 92, and the detection object information detected by each of the first and second vehicles 92 and 93.
- the control unit 91b is realized as a function of the CPU, for example, when a CPU (not illustrated) of the server 91 executes a program stored in a storage device such as a semiconductor memory (not illustrated) of the server 91.
- the control unit 91b of the server 91 controls the notification of the notification unit 92b of the navigation device 92a via the communication unit 91a.
- Based on the route information of the first vehicle 92, the current position information of the first vehicle 92, and the detection object information of the first and second vehicles 92 and 93 received by the communication unit 91a, the control unit 91b causes the notification unit 92b to notify, in the first vehicle 92, as a travel-affecting object, a detection object that is detected by the second vehicle 93 without being detected by the first vehicle 92 and that is determined to affect the traveling of the first vehicle 92 at the blind spot where the first vehicle 92 travels.
- With the server 91 configured in this way, the same effects as in the first embodiment can be obtained.
- FIG. 14 is a block diagram showing the main configuration of the communication terminal 96 according to this modification.
- the communication terminal 96 in FIG. 14 includes a communication unit 96a and a control unit 96b corresponding to the information acquisition unit 2 and the control unit 3 described so far.
- Examples of the communication terminal 96 include mobile terminals such as mobile phones, smartphones, and tablet personal computers.
- A navigation device 97a and a navigation device 98a are mounted on the first vehicle 97 and the second vehicle 98 of FIG. 14, respectively, and the navigation device 97a includes a notification unit 97b corresponding to the notification unit 4 described so far.
- The communication unit 96a communicates with the navigation devices 97a and 98a to receive (acquire) the route information of the first vehicle 97, the current position information of the first vehicle 97, and the detection object information detected by each of the first and second vehicles 97 and 98.
- the control unit 96b is realized as a function of the CPU by, for example, a CPU (not shown) of the communication terminal 96 executing a program stored in a storage device such as a semiconductor memory (not shown) of the communication terminal 96.
- the control unit 96b of the communication terminal 96 controls the notification of the notification unit 97b of the navigation device 97a via the communication unit 96a.
- Based on the route information of the first vehicle 97, the current position information of the first vehicle 97, and the detection object information of the first and second vehicles 97 and 98 received by the communication unit 96a, the control unit 96b causes the notification unit 97b to notify, in the first vehicle 97, as a travel-affecting object, a detection object that is detected by the second vehicle 98 without being detected by the first vehicle 97 and that is determined to affect the traveling of the first vehicle 97 at the blind spot where the first vehicle 97 travels.
- with the communication terminal 96 configured in this way, the same effects as in the first embodiment can be obtained.
- the navigation device 1 described so far is applicable not only as a navigation device that can be mounted on a vehicle; the present invention can also be applied to a driving support system constructed, as a system, by appropriately combining a Portable Navigation Device, a communication terminal (for example, a mobile terminal such as a mobile phone, a smartphone, or a tablet), the functions of applications installed on them, and a server.
- each function or each component of the navigation device 1 described above may be distributed among the devices constituting such a system, or may be concentrated in any one of those devices.
- within the scope of the invention, the embodiments and modifications described above may be freely combined, and each embodiment and each modification may be appropriately modified or partly omitted.
- 1 navigation device, 2 information acquisition unit, 3, 91b, 96b control unit, 4, 92b, 97b notification unit, 81, 82 range, 91 server, 91a, 96a communication unit, 92, 97 first vehicle, 93, 98 second vehicle, 96 communication terminal, 100, 200 intersection, 101 own vehicle, 101a planned travel route, 120, 121, 122, 123, 130, 131, 221, 222, 223 other vehicle, 120a, 121a, 122a, 123a, 124a, 130a, 131a, 221a, 222a, 223a direction of movement, 124 pedestrian, R81 first distance, R82, R821, R822, R823 second distance.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
Description
In the following, a case where the driving support system according to the present invention is applied to a single navigation device that can be mounted on a vehicle is described as an example. FIG. 1 is a block diagram showing the main configuration of the navigation device 1 according to the first embodiment.
FIG. 3 is a flowchart showing the operation of the navigation device 1 of the host vehicle according to the first embodiment. This operation is realized by the CPU constituting the control unit 3 of the navigation device 1 executing a program stored in the storage device of the navigation device 1. The operation of the navigation device 1 of the host vehicle is described below with reference to the flowchart of FIG. 3.
FIG. 4 is a diagram for explaining an example of the operation shown in the flowchart of FIG. 3.
The navigation device 1 of the host vehicle 101 detects the detection objects within the detection range 101b of the external sensor 28 and thereby obtains the host-vehicle detection object information. In the positional relationship shown in FIG. 4, the detection objects of the host vehicle 101 are as follows.
(Step S2)
The navigation device 1 of the host vehicle 101 receives other-vehicle detection object information from the other vehicles 121, 122, 221, and 222. In the positional relationship shown in FIG. 4, the detection objects of the other vehicles 121, 122, 221, and 222 indicated in the received other-vehicle detection object information are as follows.
Detection objects of the other vehicle 122: other vehicles 121 and 123
Detection objects of the other vehicle 221: other vehicle 222
Detection objects of the other vehicle 222: other vehicles 221 and 223
(Step S3)
The navigation device 1 of the host vehicle 101 compares the detection objects of the host vehicle 101 detected in step S1 with the detection objects of the other vehicles 121, 122, 221, and 222 obtained in step S2, and detects, as blind spot information for the host vehicle 101, information on the detection objects (the other vehicles 121, 221, 222, and 223 and the pedestrian 124) that are included in the other-vehicle detection object information but not in the host-vehicle detection object information.
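The comparison in step S3 amounts to a set difference between what the other vehicles report and what the host vehicle itself has detected. The following is a minimal Python sketch of that step, not part of the patent text: the function name and object labels are hypothetical, and the host-vehicle detection list is an assumption because the corresponding list is not reproduced above.

```python
# Sketch of step S3: blind spot information as a set difference.

def detect_blind_spot_objects(own_detections, other_detections_by_vehicle):
    """Return detection objects reported by other vehicles but not detected
    by the host vehicle (i.e. the host vehicle's blind spot information)."""
    reported_by_others = set()
    for detections in other_detections_by_vehicle.values():
        reported_by_others.update(detections)
    return reported_by_others - set(own_detections)

# Detection lists from the FIG. 4 example above; the host-vehicle list is assumed.
own_detections = {"vehicle_120", "vehicle_122", "vehicle_123"}   # assumption
other_detections_by_vehicle = {
    "vehicle_122": {"vehicle_121", "vehicle_123"},
    "vehicle_221": {"vehicle_222"},
    "vehicle_222": {"vehicle_221", "vehicle_223"},
}

blind_spot = detect_blind_spot_objects(own_detections, other_detections_by_vehicle)
print(sorted(blind_spot))
# ['vehicle_121', 'vehicle_221', 'vehicle_222', 'vehicle_223']
# (the pedestrian 124 would also appear once the detection list that reports it,
#  which is not reproduced in the text above, is included)
```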
In the example of FIG. 4, blind spot information is detected in step S3, so the process proceeds to step S5.
The navigation device 1 of the host vehicle 101 determines whether it is currently guiding the user of the host vehicle 101 to the destination along a planned travel route. In the example of FIG. 4, the planned travel route 101a turning left at the intersection 100 is being guided, so the process proceeds to step S6.
The navigation device 1 of the host vehicle 101 determines whether, among the detection objects detected in step S3 (the other vehicles 121, 221, 222, and 223 and the pedestrian 124), there is a detection object that affects the travel of the host vehicle 101 at the blind spot points where the host vehicle 101 travels (the intersection 100 and its sidewalks).
The navigation device 1 of the host vehicle 101 notifies, from the notification unit 4, the other vehicle 121 and the pedestrian 124, which were determined in step S6 to affect the travel of the host vehicle 101 among the detection objects detected in step S3 (the other vehicles 121, 221, 222, and 223 and the pedestrian 124), as travel influence objects, thereby alerting the user to their presence.
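The patent does not spell out the exact criterion used in step S6 to decide whether a blind spot detection object affects the travel of the host vehicle. As one possible reading only, the sketch below treats an object as a travel influence object when it is located at a blind spot point on the planned travel route and its movement direction crosses that route; the class and function names and the example values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BlindSpotObject:
    object_id: str
    location: str                # e.g. "intersection_100", "sidewalk_100"
    crosses_planned_route: bool  # whether its movement direction crosses the route

def notify_travel_influence_objects(blind_spot_objects, blind_spot_points_on_route):
    """Sketch of steps S6/S7: keep objects at blind spot points on the planned
    route whose movement crosses the host vehicle's path, then notify them."""
    selected = [obj for obj in blind_spot_objects
                if obj.location in blind_spot_points_on_route
                and obj.crosses_planned_route]
    for obj in selected:
        # Stands in for the notification unit 4 (display and/or voice output).
        print(f"ALERT: {obj.object_id} at {obj.location} may affect travel")
    return selected

# Left-turn example of FIG. 4 (locations and flags are illustrative assumptions).
objects = [
    BlindSpotObject("vehicle_121", "intersection_100", True),
    BlindSpotObject("vehicle_221", "intersection_200", True),
    BlindSpotObject("pedestrian_124", "sidewalk_100", True),
]
notify_travel_influence_objects(objects, {"intersection_100", "sidewalk_100"})
# -> alerts for vehicle_121 and pedestrian_124 only
```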
Unless the user, according to the necessity of the alert, performs an operation on the navigation device 1 of the host vehicle 101 to stop the notification, the process returns to step S1 and the above operation is repeated.
FIG. 5 is a diagram for explaining another example of the operation shown in the flowchart of FIG. 3. The positional relationship in FIG. 5 is the same as that in FIG. 4. In FIG. 5, the planned travel route 101a of the host vehicle 101 is a route turning right at the intersection 100, and the blind spot points where the host vehicle 101 travels are the intersection 100 and its sidewalks.
FIG. 6 is a diagram for explaining yet another example of the operation shown in the flowchart of FIG. 3. The positional relationship in FIG. 6 is the same as that in FIG. 4. In FIG. 6, the planned travel route 101a of the host vehicle 101 is a route going straight through the intersection 100, and the blind spot points where the host vehicle 101 travels are the intersections 100 and 200 and their sidewalks.
According to the navigation device 1 of the first embodiment described above, at the blind spot points where the host vehicle 101 travels, the blind spot detection objects that affect the travel of the host vehicle 101 are notified by the notification unit 4 as travel influence objects. Consequently, the alert notification can be given for blind spot detection objects that are likely to affect the travel of the host vehicle 101 while omitting it for those that are unlikely to do so, which reduces the burden on the user of the host vehicle 101.
In the above description, the navigation device 1 of the host vehicle 101 neither transmits the planned travel route information to the navigation devices 1 of the other vehicles nor receives planned travel route information from them; however, the present invention is not limited to this.
In the navigation device 1 according to the second embodiment of the present invention, the control unit 3 is configured not to cause the notification unit 4 to notify the travel influence objects when, based on the current position information of the host vehicle 101, the current position of the host vehicle 101 is farther than a predetermined first distance from the blind spot point through which the host vehicle 101 will next travel. The control unit 3 is also configured not to cause the notification unit 4 to notify a travel influence object that, based on the current position information of the detection objects included in the detection object information, is farther than a predetermined second distance from the blind spot point through which the host vehicle 101 will next travel. The other configurations and operations are the same as in the first embodiment, and their description is therefore omitted here.
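A minimal sketch of the two distance checks introduced in this embodiment is given below. The helper `straight_line_distance`, the coordinate values and the threshold values are assumptions; an actual system would more likely measure distance along the road network.

```python
import math

def straight_line_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def should_notify(host_position, object_position, blind_spot_point,
                  first_distance, second_distance):
    """Sketch of the second embodiment: suppress the notification while either
    the host vehicle or the travel influence object is still far from the
    blind spot point through which the host vehicle will next travel."""
    if straight_line_distance(host_position, blind_spot_point) > first_distance:
        return False   # host vehicle farther than the first distance
    if straight_line_distance(object_position, blind_spot_point) > second_distance:
        return False   # travel influence object farther than the second distance
    return True

# Example with assumed coordinates in metres.
print(should_notify(host_position=(0.0, 0.0), object_position=(40.0, 60.0),
                    blind_spot_point=(0.0, 80.0),
                    first_distance=100.0, second_distance=60.0))   # True
```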
FIG. 7 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the second embodiment, and FIG. 8 is a diagram for explaining an example of that operation.
In the navigation device 1 according to the first embodiment, the blind spot detection objects that affect the travel of the host vehicle 101 at a blind spot point are notified as travel influence objects. That notification is performed regardless of whether the blind spot point in question is the one through which the host vehicle 101 will next travel, and regardless of the distance between the host vehicle 101 and the blind spot point. However, when the current position of the host vehicle 101 is far from the blind spot point through which it will next travel, a blind spot detection object at that point, even if it may affect the travel of the host vehicle 101 in the future, is unlikely to affect it at the present moment.
In the navigation device 1 according to the third embodiment of the present invention, the first distance described in the second embodiment is determined based on the speed information of the host vehicle 101, and the second distance described in the second embodiment is determined based on the speed information of the travel influence object that is a moving body. The other configurations and operations are the same as in the second embodiment, and their description is therefore omitted here.
FIG. 9 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the third embodiment, and FIG. 10 is a diagram for explaining an example of that operation.
In the navigation device 1 according to the third embodiment described above, when the travel speed of the host vehicle 101 is low, the first distance R81 is shortened in consideration of the longer time the host vehicle 101 needs to reach the blind spot point, so that the start of the notification of the travel influence objects can be delayed. Conversely, when the travel speed of the host vehicle 101 is high, the first distance R81 is lengthened in consideration of the shorter entry time, so that the start of the notification can be advanced. In other words, the travel influence objects to be notified can be narrowed down according to the travel speed of the host vehicle 101. This further increases the possibility that the alert notification is given only for blind spot detection objects that are significant to the travel of the host vehicle 101, and thus further reduces the burden on the user of the host vehicle 101.
In the above description, a distance obtained by adding a margin to a known stopping distance is applied to the first and second distances. However, the invention is not limited to this: a distance calculated from a proportional expression of the speeds of the host vehicle and of the blind spot detection object may be applied, or distances associated with those speeds in a table may be applied. In addition, if the first and second distances become longer than necessary, the number of travel influence objects to be notified increases, so upper limits may be set for the first and second distances.
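As a numeric illustration of the stopping-distance-plus-margin rule and the optional upper limit described above, here is a hedged sketch; the reaction time, deceleration, margin and cap are assumed values, not values taken from the patent.

```python
def threshold_distance(speed_mps, reaction_time_s=1.0, deceleration_mps2=3.0,
                       margin_m=10.0, upper_limit_m=150.0):
    """Sketch of the third embodiment: first/second distance as stopping
    distance (reaction travel + braking) plus a margin, capped so that the
    number of notified travel influence objects does not grow without bound."""
    stopping = speed_mps * reaction_time_s + speed_mps ** 2 / (2.0 * deceleration_mps2)
    return min(stopping + margin_m, upper_limit_m)

# First distance R81 grows with the host vehicle's speed, so notification
# starts earlier at high speed and later at low speed.
print(round(threshold_distance(8.3), 1))    # about 30 m at roughly 30 km/h
print(round(threshold_distance(16.7), 1))   # about 73 m at roughly 60 km/h
```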
In the navigation device 1 according to the fourth embodiment of the present invention, the control unit 3 is configured, when a traffic light is installed at the blind spot point, not to cause the notification unit 4 to notify a moving body that is a travel influence object but does not travel through the blind spot point at the same time as the host vehicle 101. The map data stored in the map data storage unit 6 is assumed to include traffic light presence information indicating whether a traffic light is installed at the blind spot point. The other configurations and operations are the same as in the first embodiment, and their description is therefore omitted here.
FIG. 11 is a flowchart showing the operation of the navigation device 1 of the host vehicle 101 according to the fourth embodiment, and FIG. 12 is a diagram for explaining an example of that operation.
In the navigation device 1 according to the first embodiment, the blind spot detection objects that affect the travel of the host vehicle 101 at a blind spot point are notified as travel influence objects, regardless of whether, at a blind spot point where the traffic light 161 is installed, the blind spot detection object travels at the same time as the host vehicle 101. However, a blind spot detection object that does not travel at the same time as the host vehicle 101 through a blind spot point where the traffic light 161 is installed is unlikely to affect the travel of the host vehicle 101.
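The fourth embodiment only states that, at a signalised blind spot point, moving bodies that do not travel there at the same time as the host vehicle are not notified. The sketch below models this with a simple signal-phase comparison; the phase names and the assumption that the phases are known to the system are illustrative only.

```python
def notify_at_signalised_blind_spot(host_phase, object_phase):
    """Sketch of the fourth embodiment: at a blind spot point with a traffic
    light, notify only moving bodies that can move during the same signal
    phase as the host vehicle; cross traffic held at red is suppressed."""
    return host_phase == object_phase

# The host vehicle proceeds on green for its road while the crossing road has red.
print(notify_at_signalised_blind_spot("north_south_green", "east_west_green"))    # False
print(notify_at_signalised_blind_spot("north_south_green", "north_south_green"))  # True
```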
FIG. 13 is a block diagram showing the main configuration of the server 91 according to this modification. The server 91 of FIG. 13 includes a communication unit 91a and a control unit 91b corresponding respectively to the information acquisition unit 2 and the control unit 3 described so far. Further, a navigation device 92a and a navigation device 93a are mounted on the first vehicle 92 and the second vehicle 93 of FIG. 13, respectively, and the navigation device 92a includes a notification unit 92b corresponding to the notification unit 4 described so far.
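To make the server-based modification concrete, the following hedged sketch reuses the set-difference idea from step S3 on the server side; the class names, message fields, and the printed push message stand in for the communication unit 91a, the control unit 91b, and the notification unit 92b and are not defined in the patent.

```python
from dataclasses import dataclass, field

@dataclass
class VehicleReport:
    vehicle_id: str
    route: list                       # blind spot point ids on the planned route
    position: tuple                   # current position (x, y)
    detections: set = field(default_factory=set)

class DrivingSupportServer:
    """Sketch of the server 91: reports are collected over the communication
    unit and the control unit decides which blind spot objects to push back."""

    def __init__(self):
        self.reports = {}

    def receive(self, report: VehicleReport):
        # Communication unit 91a: store the latest report per vehicle.
        self.reports[report.vehicle_id] = report

    def push_travel_influence_objects(self, first_id, objects_at_point):
        first = self.reports[first_id]
        seen_by_others = set()
        for vehicle_id, report in self.reports.items():
            if vehicle_id != first_id:
                seen_by_others |= report.detections
        blind = seen_by_others - first.detections      # unseen by the first vehicle
        for point in first.route:                      # blind spot points it will pass
            for obj in sorted(blind & objects_at_point.get(point, set())):
                # Stands in for the notification pushed to notification unit 92b.
                print(f"push to {first_id}: {obj} at {point}")

server = DrivingSupportServer()
server.receive(VehicleReport("vehicle_92", ["intersection_100"], (0, 0), {"vehicle_122"}))
server.receive(VehicleReport("vehicle_93", ["intersection_100"], (50, 80), {"vehicle_121"}))
server.push_travel_influence_objects("vehicle_92", {"intersection_100": {"vehicle_121"}})
# -> push to vehicle_92: vehicle_121 at intersection_100
```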
Claims (7)
- A driving support system that supports travel of a vehicle using a notification unit, comprising: an information acquisition unit that acquires route information of a first vehicle, current position information of the first vehicle, and detection object information on the surrounding detection objects detected respectively by the first vehicle and a second vehicle; and a control unit that, based on the route information of the first vehicle, the current position information of the first vehicle, and the detection object information of the first and second vehicles acquired by the information acquisition unit, causes the notification unit to notify, in the first vehicle, as a travel influence object, a detection object that was detected by the second vehicle without being detected by the first vehicle and that was determined to affect the travel of the first vehicle at a blind spot point where the first vehicle travels.
- The driving support system according to claim 1, wherein the control unit does not cause the notification unit to notify the travel influence object when, based on the current position information of the first vehicle, the current position of the first vehicle is farther than a predetermined first distance from the blind spot point through which the first vehicle will next travel.
- The driving support system according to claim 2, wherein the information acquisition unit further acquires speed information of the first vehicle, and the first distance is determined based on the speed information of the first vehicle.
- The driving support system according to claim 1, wherein the detection object information includes current position information of the detection objects, and the control unit does not cause the notification unit to notify a travel influence object that, based on the current position information of the detection objects, is farther than a predetermined second distance from the blind spot point through which the first vehicle will next travel.
- The driving support system according to claim 4, wherein the detection objects include a moving body, the detection object information further includes speed information of the moving body, and the second distance is determined based on the speed information of the moving body that is the travel influence object.
- The driving support system according to claim 1, wherein the detection objects include a moving body, the detection object information includes current position information of the moving body, and the control unit, when a traffic light is installed at the blind spot point, does not cause the notification unit to notify the moving body that is the travel influence object and that does not travel through the blind spot point at the same time as the first vehicle.
- A driving support method that supports travel of a vehicle using a notification unit, the method comprising: acquiring route information of a first vehicle, current position information of the first vehicle, and detection object information on the surrounding detection objects detected respectively by the first vehicle and a second vehicle; and, based on the acquired route information of the first vehicle, the current position information of the first vehicle, and the detection object information of the first and second vehicles, causing the notification unit to notify, in the first vehicle, as a travel influence object, a detection object that was detected by the second vehicle without being detected by the first vehicle and that was determined to affect the travel of the first vehicle at a blind spot point where the first vehicle travels.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/313,436 US9842503B2 (en) | 2014-07-28 | 2014-07-28 | Driving support apparatus and driving support method |
CN201480081001.1A CN106575474A (zh) | 2014-07-28 | 2014-07-28 | 行驶辅助系统及行驶辅助方法 |
JP2016537615A JP6312831B2 (ja) | 2014-07-28 | 2014-07-28 | 走行支援システム及び走行支援方法 |
PCT/JP2014/069788 WO2016016922A1 (ja) | 2014-07-28 | 2014-07-28 | 走行支援システム及び走行支援方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2014/069788 WO2016016922A1 (ja) | 2014-07-28 | 2014-07-28 | 走行支援システム及び走行支援方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016016922A1 true WO2016016922A1 (ja) | 2016-02-04 |
Family
ID=55216868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/069788 WO2016016922A1 (ja) | 2014-07-28 | 2014-07-28 | 走行支援システム及び走行支援方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9842503B2 (ja) |
JP (1) | JP6312831B2 (ja) |
CN (1) | CN106575474A (ja) |
WO (1) | WO2016016922A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108447302A (zh) * | 2017-02-16 | 2018-08-24 | 松下电器(美国)知识产权公司 | 信息处理装置以及程序 |
WO2019150460A1 (ja) * | 2018-01-31 | 2019-08-08 | 住友電気工業株式会社 | 車載装置、車車間通信方法、及びコンピュータプログラム |
CN110520916A (zh) * | 2017-04-19 | 2019-11-29 | 日产自动车株式会社 | 行驶辅助方法及行驶辅助装置 |
WO2021015071A1 (ja) * | 2019-07-22 | 2021-01-28 | 株式会社デンソー | 車両用運転支援システム |
KR20210120085A (ko) * | 2019-03-14 | 2021-10-06 | 히다치 겡키 가부시키 가이샤 | 건설 기계 |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102015220640A1 (de) * | 2015-10-22 | 2017-04-27 | Robert Bosch Gmbh | Verfahren und Vorrichtung zum Verringern eines Kollisionsrisikos einer Kollision eines Kraftfahrzeugs mit einem Objekt |
WO2018163817A1 (ja) * | 2017-03-06 | 2018-09-13 | パナソニックIpマネジメント株式会社 | 駐車場の車両走行制御システム、および駐車場の車両走行制御システムの制御方法 |
CN110892233B (zh) * | 2017-05-22 | 2024-06-21 | Drnc控股公司 | 用于传感器范围和视场的车载增强可视化的方法和装置 |
JP6901331B2 (ja) * | 2017-06-20 | 2021-07-14 | 株式会社東芝 | 情報処理装置、移動体、情報処理方法、およびプログラム |
US10229590B2 (en) * | 2017-08-14 | 2019-03-12 | GM Global Technology Operations LLC | System and method for improved obstable awareness in using a V2X communications system |
JP6676025B2 (ja) * | 2017-10-23 | 2020-04-08 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
FR3076045A1 (fr) * | 2017-12-22 | 2019-06-28 | Orange | Procede de surveillance d'un environnement d'un premier element positionne au niveau d'une voie de circulation, et systeme associe |
WO2020039517A1 (ja) * | 2018-08-22 | 2020-02-27 | 三菱電機株式会社 | 進路予測装置、進路予測プログラムおよび進路予測方法 |
KR102628282B1 (ko) * | 2018-10-10 | 2024-01-24 | 현대자동차주식회사 | 차량 및 그 제어 방법 |
CN113168767B (zh) * | 2018-11-30 | 2023-08-15 | 索尼集团公司 | 信息处理设备、信息处理系统和信息处理方法 |
US11630202B2 (en) * | 2018-12-20 | 2023-04-18 | Omron Corporation | Sensing device, moving body system, and sensing method |
FR3097400B1 (fr) * | 2019-06-12 | 2021-05-28 | Continental Automotive Gmbh | Système et procédé de notification de dysfonctionnement pour véhicule |
JP7078587B2 (ja) * | 2019-09-30 | 2022-05-31 | 本田技研工業株式会社 | 走行支援システム、走行支援方法およびプログラム |
CN111599200B (zh) * | 2020-04-09 | 2021-11-16 | 宁波吉利汽车研究开发有限公司 | 自主代客泊车感知决策方法、系统以及车载终端 |
CN115116267B (zh) * | 2021-03-18 | 2024-06-25 | 上海汽车集团股份有限公司 | 一种车辆换道处理系统和车辆 |
TWI843339B (zh) * | 2022-12-15 | 2024-05-21 | 搏盟科技股份有限公司 | 自行車用之光電反射式速度感測器及其測速方法 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6720920B2 (en) * | 1997-10-22 | 2004-04-13 | Intelligent Technologies International Inc. | Method and arrangement for communicating between vehicles |
JP4517393B2 (ja) * | 2005-02-16 | 2010-08-04 | 株式会社デンソー | 運転支援装置 |
JP5082349B2 (ja) | 2006-09-05 | 2012-11-28 | マツダ株式会社 | 車両用運転支援システム |
JP4311426B2 (ja) | 2006-09-12 | 2009-08-12 | 住友電気工業株式会社 | 移動体を表示するための表示システム、車載装置及び表示方法 |
CN101681557A (zh) * | 2007-04-27 | 2010-03-24 | 爱信艾达株式会社 | 驾驶支援装置 |
JP5529058B2 (ja) * | 2011-02-24 | 2014-06-25 | アルパイン株式会社 | 立体物検知装置および立体物検知方法 |
JP5408240B2 (ja) * | 2011-12-12 | 2014-02-05 | 株式会社デンソー | 警告システム、車両装置、及びサーバ |
KR102092625B1 (ko) * | 2013-10-15 | 2020-04-14 | 현대모비스 주식회사 | 차량 상태 경보 방법 및 이를 위한 장치 |
US9997077B2 (en) * | 2014-09-04 | 2018-06-12 | Honda Motor Co., Ltd. | Vehicle operation assistance |
-
2014
- 2014-07-28 CN CN201480081001.1A patent/CN106575474A/zh active Pending
- 2014-07-28 JP JP2016537615A patent/JP6312831B2/ja active Active
- 2014-07-28 US US15/313,436 patent/US9842503B2/en not_active Expired - Fee Related
- 2014-07-28 WO PCT/JP2014/069788 patent/WO2016016922A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008225786A (ja) * | 2007-03-12 | 2008-09-25 | Toyota Motor Corp | 道路状況検出システム |
JP2008293099A (ja) * | 2007-05-22 | 2008-12-04 | Mazda Motor Corp | 車両の運転支援装置 |
JP2010237063A (ja) * | 2009-03-31 | 2010-10-21 | Zenrin Co Ltd | 注意喚起情報提示装置 |
JP2013131145A (ja) * | 2011-12-22 | 2013-07-04 | Sanyo Electric Co Ltd | 移動体通信装置及び走行支援方法 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108447302A (zh) * | 2017-02-16 | 2018-08-24 | 松下电器(美国)知识产权公司 | 信息处理装置以及程序 |
CN110520916A (zh) * | 2017-04-19 | 2019-11-29 | 日产自动车株式会社 | 行驶辅助方法及行驶辅助装置 |
WO2019150460A1 (ja) * | 2018-01-31 | 2019-08-08 | 住友電気工業株式会社 | 車載装置、車車間通信方法、及びコンピュータプログラム |
KR20210120085A (ko) * | 2019-03-14 | 2021-10-06 | 히다치 겡키 가부시키 가이샤 | 건설 기계 |
KR102579791B1 (ko) * | 2019-03-14 | 2023-09-19 | 히다치 겡키 가부시키 가이샤 | 건설 기계 |
US12054913B2 (en) | 2019-03-14 | 2024-08-06 | Hitachi Construction Machinery Co., Ltd. | Construction machine |
WO2021015071A1 (ja) * | 2019-07-22 | 2021-01-28 | 株式会社デンソー | 車両用運転支援システム |
JP2021018648A (ja) * | 2019-07-22 | 2021-02-15 | 株式会社デンソー | 車両用運転支援システム |
JP7306132B2 (ja) | 2019-07-22 | 2023-07-11 | 株式会社デンソー | 車両用運転支援システム |
Also Published As
Publication number | Publication date |
---|---|
JPWO2016016922A1 (ja) | 2017-04-27 |
US20170116862A1 (en) | 2017-04-27 |
JP6312831B2 (ja) | 2018-04-18 |
CN106575474A (zh) | 2017-04-19 |
US9842503B2 (en) | 2017-12-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6312831B2 (ja) | 走行支援システム及び走行支援方法 | |
US10198000B2 (en) | Vehicle control system, vehicle control method, and vehicle control program | |
JP6304223B2 (ja) | 運転支援装置 | |
EP3324556B1 (en) | Visual communication system for autonomous driving vehicles (adv) | |
US10676101B2 (en) | Vehicle control system, vehicle control method, and vehicle control program | |
JP4434224B2 (ja) | 走行支援用車載装置 | |
WO2016181618A1 (ja) | 監視対象領域設定装置および監視対象領域設定方法 | |
US10818177B2 (en) | Driving assistance device and driving assistance method | |
JP2016203745A (ja) | 車両走行制御装置 | |
JP2016095697A (ja) | 注意喚起装置 | |
WO2019064350A1 (ja) | 運転支援方法及び運転支援装置 | |
JP2018156462A (ja) | 移動体及びそれを含む運転支援システム | |
JP2015075889A (ja) | 運転支援装置 | |
WO2011058822A1 (ja) | 車両周囲表示装置、車両周囲表示方法 | |
JP2016007954A (ja) | 車線合流支援装置 | |
JP2018133072A (ja) | 情報処理装置およびプログラム | |
JP2019197303A (ja) | 車外報知装置 | |
US11181386B2 (en) | Navigation device, destination guiding system, and non-transitory recording medium | |
WO2017104209A1 (ja) | 運転支援装置 | |
JP2009286274A (ja) | 車両用運転支援装置 | |
JP6253349B2 (ja) | 走行支援装置および走行支援方法 | |
JP2019079454A (ja) | 車両制御システム、機能通知装置、機能通知方法およびコンピュータプログラム | |
JP2022019244A (ja) | 運転支援装置およびプログラム | |
JP6221568B2 (ja) | 運転支援装置 | |
JP2019079453A (ja) | 情報生成システム、情報生成装置、情報生成方法およびコンピュータプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14898748 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016537615 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15313436 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14898748 Country of ref document: EP Kind code of ref document: A1 |