JP4929997B2 - Driving assistance device - Google Patents

Driving assistance device

Publication number: JP4929997B2 (granted from application JP2006309392A; also published as JP2008123443A)
Authority: JP (Japan)
Prior art keywords: warning, driver, vehicle, feature, flag
Legal status: Expired - Fee Related
Inventors: 孝幸 宮島, 和司 紺野
Original assignee: アイシン・エィ・ダブリュ株式会社 (Aisin AW Co., Ltd.)
Language: Japanese (ja)
Description

  The present invention relates to a driving support device that detects a warning object such as a pedestrian or a bicycle and warns the driver.

Conventionally, various driving assistance apparatuses that detect warning objects such as pedestrians and bicycles and warn the driver have been proposed.
For example, there is a driving support apparatus comprising: imaging means for capturing an image around the vehicle; object detection means for detecting the position of a detection target object based on the image captured by the imaging means; gaze detection means for detecting the gaze direction of the driver driving the vehicle; determination means for determining, based on the position of the detection target object detected by the object detection means and the driver's gaze direction detected by the gaze detection means, whether or not the driver recognizes the detection target object; and warning means for warning the driver when the position of a detection target object that the determination means has determined the driver does not recognize is within a predetermined warning area (see, for example, Patent Document 1).
JP 2006-27481 A (paragraphs [0014] to [0083], FIGS. 1 to 12)

  However, in the driving assistance apparatus described in Patent Document 1, whether or not to issue a warning via the warning means is determined only by whether or not the driver recognizes a pedestrian near the intersection. Therefore, when the driver does not recognize a pedestrian, a warning is issued via the warning means even if the driver has already recognized a traffic light installed at the intersection or a stop line ahead and will therefore stop. For this reason, there is a problem that it is difficult to give a warning at an appropriate timing to assist the driver's driving.

  Therefore, the present invention has been made to solve the above problem, and an object thereof is to provide a driving assistance device that can assist the driver's driving by issuing a warning only when a warning for confirming a pedestrian or the like is actually necessary.

In order to achieve the above object, the driving support device according to claim 1 is a driving support device (1) that detects a warning object and warns the driver, comprising: map information storage means (25) for storing map information including feature information about features; vehicle position detection means (11) for detecting the vehicle position of the own vehicle; line-of-sight detection means (13, 51, 52) for detecting the line-of-sight direction of the driver driving the vehicle; feature information acquisition means (13) for acquiring, based on the map information, the feature information about a feature at an intersection ahead of the own vehicle position in the traveling direction; feature recognition determination means (13) for determining, based on the feature information acquired by the feature information acquisition means and the driver's line-of-sight direction detected by the line-of-sight detection means, whether or not the driver recognizes the feature at the intersection ahead in the traveling direction; warning object information acquisition means (13, 51, 53) for acquiring warning object information including position information of a warning object at the intersection ahead of the own vehicle position in the traveling direction; warning object recognition determination means (13) for determining, based on the warning object information acquired by the warning object information acquisition means and the driver's line-of-sight direction detected by the line-of-sight detection means, whether or not the driver recognizes the warning object at the intersection ahead in the traveling direction; vehicle speed detection means (21) for detecting the vehicle speed of the own vehicle; distance calculation means (13) for calculating the distance from the own vehicle position to the intersection ahead in the traveling direction; collision remaining time calculation means (13) for calculating, based on the vehicle speed detected by the vehicle speed detection means and the distance calculated by the distance calculation means, a predicted remaining time until the own vehicle reaches the intersection; and warning control means (13) which controls so as to warn the driver: when the predicted remaining time calculated by the collision remaining time calculation means is longer than a predetermined first time and it is determined both that the driver does not recognize the feature ahead in the traveling direction and that the driver does not recognize the warning object; when the predicted remaining time is equal to or shorter than the predetermined first time and longer than a predetermined second time shorter than the predetermined first time and it is determined that the driver does not recognize the warning object, regardless of whether the driver recognizes the feature ahead in the traveling direction; and when the predicted remaining time is equal to or shorter than the predetermined second time and it is determined either that the driver does not recognize the feature ahead in the traveling direction or that the driver does not recognize the warning object. The driving support device is characterized in that the warning control means (13) selects the warning content separately for each of the case where the predicted remaining time is longer than the predetermined first time, the case where it is equal to or shorter than the predetermined first time and longer than the predetermined second time, and the case where it is equal to or shorter than the predetermined second time.

  Further, the driving support device according to claim 2 is the driving support device (1) according to claim 1, further comprising: other vehicle information acquisition means (13, 61, 62) for acquiring other vehicle information including position information of another vehicle traveling ahead of the own vehicle; and other vehicle recognition determination means (13) for determining, based on the other vehicle information acquired by the other vehicle information acquisition means and the driver's line-of-sight direction detected by the line-of-sight detection means, whether or not the driver recognizes the other vehicle; wherein the warning control means (13) controls so as to suppress the warning given to the driver when it is determined that the driver recognizes the other vehicle.

Furthermore, in the driving support device according to claim 3, which is the driving support device (1) according to claim 1 or 2, the feature includes a paint display (72, 73) provided on the road surface of a road and a three-dimensional object (71) provided along the road.

In the driving support device according to claim 1 configured as described above, whether or not the driver recognizes the feature at the intersection ahead in the traveling direction is determined based on the feature information about the feature at the intersection ahead of the own vehicle position (for example, a traffic light, a stop sign, or a stop line, together with the latitude/longitude coordinates at which it is installed and its height above the ground) and on the driver's line-of-sight direction detected by the line-of-sight detection means. Also, whether or not the driver recognizes the warning object is determined based on warning object information, including position information of warning objects such as pedestrians and other vehicles approaching from the left and right at the intersection ahead of the own vehicle position, and on the driver's line-of-sight direction detected by the line-of-sight detection means. Further, the predicted remaining time from the own vehicle position to the intersection ahead in the traveling direction is calculated based on the vehicle speed detected by the vehicle speed detection means and the distance to the intersection calculated by the distance calculation means.
When the predicted remaining time is longer than the predetermined first time, a warning is given to the driver only when it is determined both that the driver does not recognize the feature ahead in the traveling direction and that the driver does not recognize the warning object. When the predicted remaining time is equal to or shorter than the predetermined first time and longer than the predetermined second time shorter than the predetermined first time, a warning is given to the driver whenever it is determined that the driver does not recognize the warning object, regardless of whether the driver recognizes the feature ahead in the traveling direction. When the predicted remaining time is equal to or shorter than the predetermined second time, a warning is given to the driver when it is determined either that the driver does not recognize the feature ahead in the traveling direction or that the driver does not recognize the warning object. Furthermore, the warning content is selected separately for each of these three cases, and the warning is issued accordingly.
As a result, when the predicted remaining time is longer than the predetermined first time (for example, about 5 seconds), the vehicle is not yet near the intersection, so if the driver recognizes either a feature at the intersection (for example, a traffic light, a stop sign, or a stop line) or a warning object (a pedestrian, a bicycle, another vehicle approaching from the left or right, etc.), there is no need to warn the driver. It is therefore possible to accurately determine that no warning needs to be issued, and to give a warning only when a warning for confirmation near the intersection is actually necessary, thereby assisting the driver's driving.
When the predicted remaining time is equal to or shorter than the predetermined first time (for example, about 5 seconds) and longer than the predetermined second time (for example, about 2 seconds) shorter than the predetermined first time, a warning is given to the driver whenever it is determined that the driver does not recognize the warning object (a pedestrian, a bicycle, another vehicle approaching from the left or right, etc.). It is thus possible to accurately determine that the driver needs to stop once before the warning object and to give a warning, assisting the driver in driving more safely.
Further, when the predicted remaining time is equal to or shorter than the predetermined second time (for example, about 2 seconds), a warning is given to the driver when it is determined either that the driver does not recognize the feature ahead in the traveling direction (for example, a traffic light, a stop sign, or a stop line) or that the driver does not recognize the warning object. This makes it possible to accurately determine whether the driver should stop temporarily before the warning object and to give a warning, so that the driver can reliably stop once before the warning object, assisting in safer driving.
Furthermore, because the warning content differs between the three cases, the driver can easily recognize from the warning content the urgency of stopping before the warning object, which assists the driver in driving more safely.
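The three-band decision described above can be summarized in a short sketch. The following Python code is illustrative only (the patent specifies no implementation); the thresholds follow the example values in the text (first time ≈ 5 s, second time ≈ 2 s), and all function names are invented.

```python
T1 = 5.0  # predetermined first time in seconds (example value from the text)
T2 = 2.0  # predetermined second time in seconds (example value from the text)

def predicted_remaining_time(distance_m, speed_mps):
    """Predicted remaining time until the intersection is reached,
    from the distance to it and the current vehicle speed."""
    return float("inf") if speed_mps <= 0 else distance_m / speed_mps

def should_warn(remaining_s, sees_feature, sees_target):
    """Decide whether to warn, per the three time bands described above."""
    if remaining_s > T1:
        # Far from the intersection: warn only when neither the feature
        # (traffic light, stop line, ...) nor the warning object is recognized.
        return not sees_feature and not sees_target
    if remaining_s > T2:
        # Middle band: warn whenever the warning object is unrecognized,
        # regardless of the feature.
        return not sees_target
    # Close to the intersection: warn when either one is unrecognized.
    return not sees_feature or not sees_target
```

A different warning content would additionally be selected for each band, as claim 1 requires.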

In the driving support device according to claim 2, whether or not the driver recognizes another vehicle traveling ahead at the intersection is determined based on the other vehicle information, including the position information of the other vehicle traveling ahead of the own vehicle, and on the driver's line-of-sight direction detected by the line-of-sight detection means. When it is determined that the driver recognizes the other vehicle traveling ahead at the intersection, the warning given to the driver upon detection of a warning object, such as a pedestrian or another vehicle approaching from the left or right, is suppressed.
As a result, if the driver recognizes another vehicle traveling ahead, then even when the driver does not recognize pedestrians, other vehicles approaching from the left and right, or features such as traffic lights, the driver will stop behind the other vehicle when that vehicle stops, so there is no need to warn the driver. A warning can thus be given only when a warning for confirming pedestrians or the like is actually necessary, making it possible to assist the driver's driving.

  Furthermore, in the driving support device according to claim 3, both a planar paint display, such as a stop line provided on the road surface, and a three-dimensional object, such as a road sign or a traffic light provided along the road, can be used in determining whether the driver recognizes a feature, which further increases the reliability of the determination of whether the driver recognizes the feature.

  Hereinafter, a driving support device according to the present invention will be described in detail with reference to the drawings, based on a first embodiment and a second embodiment in which the driving support device is embodied as a navigation device.

  First, a schematic configuration of a vehicle on which the navigation device according to the first embodiment is mounted will be described with reference to FIG. FIG. 1 is a schematic configuration diagram of a vehicle 2 on which a navigation device 1 according to the first embodiment is mounted.

As shown in FIG. 1, a camera ECU (Electronic Control Unit) 51 that drives and controls CCD cameras and the like is electrically connected to the navigation control unit 13 of the navigation device 1 installed on the vehicle 2. Exterior cameras 53, each constituted by a CCD camera or the like, are installed at the left and right ends of the front end of the vehicle 2 and at the left and right ends of its rear end. In addition, a driver camera 52 constituted by a CCD camera or the like is installed near the speedometer.
The navigation control unit 13 is electrically connected to a radar ECU (Electronic Control Unit) 61 that drives and controls a millimeter wave radar or the like. Further, millimeter wave radars 62 are installed at the front end center position and the rear end center position of the vehicle 2.

  The navigation device 1 is provided on the center console or panel surface of the vehicle 2, displays a map and the searched route to the destination on a liquid crystal display (LCD) 15, and outputs voice guidance for route guidance from a speaker 16. When a predetermined condition is satisfied, the navigation device 1 can detect a pedestrian or the like ahead of the vehicle 2 by transmitting a control signal to the camera ECU 51. In addition, when a predetermined condition is satisfied, the navigation device 1 can transmit a control signal to the radar ECU 61 to detect the distance to other vehicles approaching ahead of and behind the vehicle 2 and their relative speed.

The camera ECU 51 is an electronic control unit to which the driver camera 52 and the exterior cameras 53 are connected. Based on control signals received from the navigation device 1, it detects pedestrians, bicycles, and the like ahead of the vehicle 2 by performing image processing on the video signals of the exterior cameras 53, and it detects the movement of the driver's face and eyes during left-right confirmation by performing image processing on the video signal of the driver camera 52, outputting detection signals for the face direction and the line-of-sight direction.
The radar ECU 61 is an electronic control unit to which the millimeter wave radars 62 are connected; based on control signals received from the navigation device 1, it measures the distance and relative speed between the own vehicle and vehicles ahead and behind from the detection signals of the millimeter wave radars 62.

  Next, the configuration related to the control system of the vehicle 2 according to the first embodiment will be described with reference to FIG. FIG. 2 is a block diagram schematically showing a control system centered on the navigation device 1 mounted on the vehicle 2.

  As shown in FIG. 2, the control system of the vehicle 2 is configured around the navigation device 1, together with the camera ECU 51 and the radar ECU 61 that are electrically connected to it, and predetermined peripheral devices are connected to the navigation device 1.

The navigation device 1 comprises: a current position detection processing unit 11 that detects the current position of the own vehicle (hereinafter referred to as the "own vehicle position"); a data recording unit 12 that records various data; a navigation control unit 13 that performs various arithmetic processing based on input information; an operation unit 14 that receives operations from the operator; a liquid crystal display (LCD) 15 that displays information, such as a map, to the operator; a speaker 16 that outputs voice guidance for route guidance; and a communication device 17 that communicates with information centers such as a road traffic information center (VICS: registered trademark). A vehicle speed sensor 21 that detects the traveling speed of the own vehicle is connected to the navigation control unit 13.
The communication device 17 can be used by connecting to communication systems such as a LAN (Local Area Network), a WAN (Wide Area Network), an intranet, a mobile phone network, a telephone network, a public communication network, a dedicated communication network, and the Internet.

Each component of the navigation device 1 will now be described.
As shown in FIG. 2, the current position detection processing unit 11 includes a GPS 31, a direction sensor 32, a distance sensor 33, an altimeter (not shown), and the like, and can detect the current position and direction of the own vehicle and the distance to a target (for example, an intersection).

  Specifically, the GPS 31 detects the current position of the own vehicle on the earth and the current time by receiving radio waves from artificial satellites. The direction sensor 32 is constituted by a geomagnetic sensor, a gyro sensor, an optical rotation sensor or rotation resistance sensor attached to a rotating portion of the steering wheel (not shown), an angle sensor attached to a wheel, or the like, and detects the direction of the own vehicle. The distance sensor 33 is constituted by, for example, a sensor that measures the rotational speed of a wheel (not shown) of the own vehicle and detects the distance based on the measured rotational speed, or a sensor that measures acceleration and detects the distance by integrating the measured acceleration twice, and detects the moving distance of the own vehicle.
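The two measurement principles of the distance sensor 33 can be illustrated with a short sketch (not code from the patent; the wheel circumference is an assumed example value, and the double integration uses simple Euler steps):

```python
WHEEL_CIRCUMFERENCE_M = 1.9  # assumed wheel circumference in metres

def distance_from_rotations(revolutions):
    """Distance from the measured number of wheel revolutions."""
    return revolutions * WHEEL_CIRCUMFERENCE_M

def distance_from_acceleration(samples_mps2, dt_s, v0_mps=0.0):
    """Distance by integrating acceleration twice:
    acceleration -> velocity -> distance."""
    v, d = v0_mps, 0.0
    for a in samples_mps2:
        v += a * dt_s   # first integration: velocity
        d += v * dt_s   # second integration: distance
    return d
```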

  The data recording unit 12 includes a hard disk (not shown) as an external storage device and recording medium, a map information database (map information DB) 25 and a warning determination table 65 described later (see FIG. 3) stored on the hard disk, and a recording head (not shown) serving as a driver for reading predetermined programs and the like and writing predetermined data to the hard disk. In the first embodiment, a hard disk is used as the external storage device and storage medium of the data recording unit 12, but besides the hard disk, a magnetic disk such as a flexible disk, a memory card, a CD, a DVD, an optical disk, an IC card, or the like can also be used as the external storage device.

  The map information DB 25 stores various information necessary for route guidance and map display: for example, new road information for specifying each new road; map display data for displaying a map; intersection data on each intersection; node data on node points; link data on roads (links); search data for searching for routes; store data on POIs (Points of Interest), such as shops, which are a kind of facility; search data for searching for points; and feature information on various features such as traffic lights at intersections. The contents of the map information DB 25 are updated by downloading update information distributed from an information distribution center (not shown) via the communication device 17.

  The feature information includes feature information of paint displays provided on the road surface of each road and feature information of three-dimensional objects provided along each road. The paint displays include, for example, lane markings that divide lanes (including information on the types of lane markings, such as solid lines, broken lines, and double lines), zebra zones, stop lines, pedestrian crossings, traffic classification displays that specify the traveling direction of each lane, and speed displays. The three-dimensional objects include, in addition to various road signs and traffic lights, various three-dimensional objects provided on or around the road, such as guardrails, buildings, utility poles, and signboards.

  The feature information includes position information, shape information, color information, and related information as specific information contents. Each feature information has a feature information identification ID unique to each feature information. The position information is position information on a map represented by latitude and longitude. Further, the position information includes height information indicating the height from the road surface in addition to the information indicating the planar position. Regarding the position information of the feature information of the three-dimensional object, height information is particularly important. The shape information and the color information are information expressed by modeling the specific shapes and colors of the various features as described above.

  The related information is information for associating features existing at nearby positions. Specifically, the related information of each piece of feature information stores information such as the feature information identification IDs of other features present at nearby positions. Specific examples of combinations of features related by this related information include a pedestrian crossing and a stop line; a stop line and a stop sign; a traffic light, a pedestrian crossing, and a stop line; a pair of facing traffic lights; a zebra zone and a traffic light indicating a road branch; and a traffic classification display for one lane and that for the adjacent lane. Therefore, since a traffic light at an intersection, a pedestrian crossing, and a stop line are associated with each other by feature information identification IDs, by specifying the intersection ahead of the own vehicle position in the traveling direction from the map stored in the map information DB 25 and identifying the traffic light at this intersection, feature information such as the pedestrian crossing and the stop line at this intersection can be identified based on the related information of this traffic light.
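As a purely illustrative sketch of this association (the IDs, coordinates, and record layout below are invented, not taken from the patent), features at an intersection could be cross-linked through their identification IDs:

```python
# Minimal feature records keyed by an invented feature information
# identification ID; positions are placeholder latitude/longitude pairs.
features = {
    "F001": {"type": "traffic_light", "position": (35.1702, 136.9066), "related": ["F002", "F003"]},
    "F002": {"type": "crosswalk",     "position": (35.1701, 136.9066), "related": ["F001", "F003"]},
    "F003": {"type": "stop_line",     "position": (35.1700, 136.9066), "related": ["F001", "F002"]},
}

def related_feature_types(feature_id):
    """Resolve a feature's related-information IDs to feature types,
    e.g. find the crosswalk and stop line linked to a traffic light."""
    return [features[fid]["type"] for fid in features[feature_id]["related"]]
```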

As shown in FIG. 2, the navigation control unit 13 constituting the navigation device 1 includes: a CPU 41 serving as an arithmetic and control device that performs overall control of the navigation device 1; a RAM 42 used as working memory when the CPU 41 performs various arithmetic processing and for storing route data when a route has been searched; a ROM 43 storing a control program and a warning processing program (see FIG. 4) for warning of confirmation of pedestrians and the like at an intersection, described later; an internal storage device such as a flash memory 44 for storing programs read from the ROM 43; and a timer 45 for measuring time.
Here, a semiconductor memory, a magnetic core, or the like is used as the RAM 42, the ROM 43, the flash memory 44, or the like. As the arithmetic device and the control device, an MPU or the like can be used instead of the CPU 41.

  In the first embodiment, various programs are stored in the ROM 43 and various data are stored in the data recording unit 12. However, programs, data, and the like may also be read from the same external storage device, a memory card, or the like, and the programs, data, and the like can be updated by exchanging the memory card or the like.

  Furthermore, the navigation control unit 13 is electrically connected to peripheral devices (actuators) of the operation unit 14, the liquid crystal display 15, the speaker 16, and the communication device 17.

The operation unit 14 is operated when correcting the current position at the start of travel, inputting a departure point as the guidance start point and a destination as the guidance end point, searching for information about facilities, and so on, and is constituted by a plurality of operation switches. The navigation control unit 13 performs control to execute the various corresponding operations based on switch signals output by pressing the switches. Note that the operation unit 14 may also be constituted by a keyboard, a mouse, or the like, or by a touch panel provided on the front surface of the liquid crystal display 15.
The liquid crystal display 15 displays operation guidance, operation menus, key guidance, the guidance route from the current position to the destination, guidance information along the guidance route, traffic information, news, weather forecasts, the time, e-mail, TV programs, and the like.

  In addition, the speaker 16 outputs, based on instructions from the navigation control unit 13, travel guidance along the guidance route, voice guidance warning of a stop or safety confirmation at an intersection or pedestrian crossing, and the like. Examples of the voice guidance include "Turn right at the XX intersection 200 m ahead" and an intermittent warning sound ("beep, beep, ...") warning the driver to confirm pedestrians and the like at the intersection.

  The communication device 17 is a communication means, such as a mobile phone network, that communicates with the information distribution center, and exchanges the latest version of the updated map information and the like with the information distribution center. In addition, the communication device 17 receives traffic information transmitted from the road traffic information center (VICS (registered trademark)), including information such as traffic jam information and the congestion status of service areas.

The camera ECU 51 includes a data receiving unit 51A that receives control information transmitted from the navigation control unit 13, and an image recognition unit 51B that controls the driver camera 52 and the exterior cameras 53 based on the received control information and performs image recognition.
The radar ECU 61 includes a data receiving unit 61A that receives control information transmitted from the navigation control unit 13, and a control unit 61B that controls the millimeter wave radars 62 based on the received control information and detects the distance to and relative speed of other vehicles ahead and behind.

Here, the warning determination table 65 stored in the warning determination DB 26 will be described with reference to FIG. 3. FIG. 3 is a diagram illustrating an example of the warning determination table 65 stored in the warning determination DB 26.
As shown in FIG. 3, the warning determination table 65 is composed of a collision remaining time 65A representing the predicted remaining time until a collision with a pedestrian or the like (warning object) crossing the intersection, a recognition flag 65B indicating whether the driver recognizes the pedestrian or the like (a circle mark indicates recognized; a cross mark indicates not recognized), and a warning content 65C indicating the content of the warning given via the speaker 16 and the like.
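FIG. 3 is not reproduced here, but a table of this shape can be represented as a simple lookup. The sketch below is an assumption for illustration: the band boundaries follow the ~5 s / ~2 s examples given earlier, the warning contents are invented, and the two recognition determinations are collapsed into a single flag for brevity.

```python
# (lower bound s, upper bound s or None, warning content 65C) rows,
# applied when the driver does not recognize the warning object.
WARNING_TABLE = [
    (5.0, None, "short chime"),
    (2.0, 5.0,  "intermittent beep"),
    (0.0, 2.0,  "continuous alarm"),
]

def warning_content(remaining_s, driver_recognizes):
    """Return the warning content for the collision remaining time 65A and
    recognition flag 65B, or None when no warning is needed."""
    if driver_recognizes:
        return None
    for lo, hi, content in WARNING_TABLE:
        if remaining_s >= lo and (hi is None or remaining_s < hi):
            return content
    return None
```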

Next, warning processing for warning of confirmation of a pedestrian or the like at an intersection performed by the CPU 41 of the navigation device 1 configured as described above will be described with reference to FIGS. 4 and 5.
FIG. 4 is a flowchart illustrating a warning process for performing a warning of confirmation of a pedestrian or the like at an intersection executed by the CPU 41 of the navigation device 1 according to the first embodiment. FIG. 5 is a diagram illustrating an example of a state in which the driver recognizes a feature such as a traffic light at an intersection.
Note that the program shown in the flowchart of FIG. 4 is stored in the ROM 43 provided in the navigation control unit 13 of the navigation device 1 and is executed by the CPU 41 at regular intervals (for example, every 10 msec to 100 msec).

As shown in FIG. 4, first, in step (hereinafter abbreviated as "S") 11, the CPU 41 detects the own vehicle position and the direction of the own vehicle using the current position detection processing unit 11, and stores coordinate data representing the own vehicle position (for example, latitude and longitude data) and the own vehicle direction in the RAM 42.
Subsequently, in S12, the CPU 41 executes a determination process for determining, based on the map information stored in the map information DB 25 and the own vehicle position data, whether or not an intersection exists within a predetermined distance (for example, about 200 m to 300 m) ahead of the host vehicle 2 in the traveling direction on the road on which it is traveling.

If no intersection exists within the predetermined distance (for example, about 200 m to 300 m) ahead of the host vehicle 2 on the road on which it is traveling (S12: NO), the CPU 41 ends the process.
On the other hand, if such an intersection exists (S12: YES), the CPU 41 obtains the coordinate data of the intersection (for example, latitude and longitude data), stores it in the RAM 42, and proceeds to S13.

In S13, the CPU 41 acquires feature information regarding various features such as the traffic lights at the intersection from the map information DB 25 and stores it in the RAM 42. For example, the CPU 41 identifies the traffic lights and stop signs at the intersection from the map information stored in the map information DB 25, acquires their position information and related information, including height information, from the map information DB 25, and stores them in the RAM 42. Further, based on the related information regarding the traffic lights and stop signs, the CPU 41 identifies features such as the pedestrian crossing and temporary stop line at the intersection, acquires their position information from the map information DB 25, and stores it in the RAM 42.
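The feature information handled here, a position plus a height, can be sketched as a record. The `Feature` class and its field names below are assumptions for illustration, not the actual schema of the map information DB 25:

```python
from dataclasses import dataclass

# Hypothetical shape of one feature record retrieved in S13.
@dataclass
class Feature:
    kind: str        # e.g. "traffic_light", "stop_sign", "stop_line", "crosswalk"
    latitude: float
    longitude: float
    height_m: float  # height above the ground; 0.0 for paint markings

features = [
    Feature("traffic_light", 35.1702, 136.9066, 5.5),
    Feature("stop_line", 35.1701, 136.9067, 0.0),
]
# Paint markings lie on the road surface; three-dimensional objects do not.
paint_markings = [f for f in features if f.height_m == 0.0]
```

Keeping the height in the record is what later allows the gaze-direction check to distinguish a planar stop line from an elevated traffic light.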

Subsequently, in S14, the CPU 41 executes a determination process for determining, via the vehicle exterior cameras 53 provided at the front left end and front right end of the host vehicle 2, whether there is a pedestrian (warning object) or bicycle (warning object) crossing the intersection ahead in the traveling direction, or another vehicle (warning object) approaching from the left or right of the intersection.
If there is no pedestrian (warning object) or bicycle (warning object) crossing the intersection and no other vehicle (warning object) approaching from the left or right of the intersection (S14: NO), the CPU 41 ends the process.

On the other hand, if there is a pedestrian (warning object) or bicycle (warning object) crossing the intersection, or another vehicle (warning object) approaching from the left or right of the intersection (S14: YES), the CPU 41 performs image processing on the video signals of the vehicle exterior cameras 53 via the camera ECU 51, acquires the position information and the like of the pedestrians (warning objects), bicycles (warning objects), and other vehicles (warning objects) approaching from the left and right of the intersection, stores them in the RAM 42, and proceeds to S15.
In S15, the CPU 41 executes a determination process for determining, by means of the millimeter wave radar 62 attached to the center of the front end of the host vehicle 2, whether another vehicle exists ahead of the host vehicle 2.
If no other vehicle exists ahead of the host vehicle 2 (S15: NO), the CPU 41 proceeds to S16.

In S16, the CPU 41 executes a determination process for determining whether feature information regarding various features, such as the traffic lights at the intersection, was stored in S13.
If no such feature information was stored in S13 (S16: NO), the CPU 41 determines that there are no features such as traffic lights or temporary stop lines at the intersection, and proceeds to S17.
In S17, the CPU 41 sets the pedestrians (warning objects) and bicycles (warning objects) crossing the intersection and the other vehicles (warning objects) approaching from the left and right of the intersection, detected via the vehicle exterior cameras 53 provided at the front left end and front right end of the host vehicle 2, as recognition objects for determining whether the driver recognizes them, and stores each in the RAM 42. Further, the CPU 41 reads the pedestrian flag from the RAM 42, sets the pedestrian flag to ON, stores it in the RAM 42 again, and proceeds to S21.
When the navigation apparatus 1 is activated, the pedestrian flag is set to OFF and stored in the RAM 42.

On the other hand, if feature information regarding various features such as the traffic lights at the intersection was stored in S13 (S16: YES), the CPU 41 determines that features such as traffic lights and temporary stop lines exist at the intersection, and proceeds to S18.
In S18, the CPU 41 sets the pedestrians (warning objects) and bicycles (warning objects) crossing the intersection and the other vehicles (warning objects) approaching from the left and right of the intersection, detected via the vehicle exterior cameras 53 provided at the front left end and front right end of the host vehicle 2, as recognition objects for determining whether the driver recognizes them, and also sets the various features, such as the traffic lights ahead in the traveling direction at the intersection, as recognition objects for determining whether the driver recognizes them, storing each in the RAM 42. Further, the CPU 41 reads the feature flag from the RAM 42, sets the feature flag to ON, stores it in the RAM 42 again, and proceeds to S21.
Note that the feature flag is set to OFF and stored in the RAM 42 when the navigation device 1 is activated.
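The branching of S16 through S18, which decides which recognition objects are registered and which flag is raised, can be summarized in a short sketch; the function and flag names below are illustrative, not the device's actual code:

```python
def set_recognition_objects(warning_objects, features):
    """Sketch of S16-S18: choose what the driver must be checked against.

    warning_objects: pedestrians/bicycles/other vehicles detected in S14.
    features: feature information stored in S13 (may be empty).
    Returns (recognition_objects, flags) mirroring the flags kept in RAM 42.
    """
    flags = {"pedestrian": False, "feature": False, "forward_vehicle": False}
    if not features:                          # S16: NO -> S17
        flags["pedestrian"] = True            # pedestrian flag ON
        return list(warning_objects), flags
    flags["feature"] = True                   # S16: YES -> S18, feature flag ON
    return list(warning_objects) + list(features), flags
```

The forward vehicle branch (S15: YES, S19, S20) sets the forward vehicle flag instead and is handled separately.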

If there is another vehicle in front of the host vehicle 2 in S15 (S15: YES), the CPU 41 proceeds to the process of S19.
In S19, the CPU 41 measures the distance to the other vehicle traveling ahead and the relative traveling speed by means of the millimeter wave radar 62 attached to the center of the front end of the host vehicle 2. At the same time, the CPU 41 performs image processing on the video signals of the vehicle exterior cameras 53 via the camera ECU 51 to acquire the position information and traveling direction of the other vehicle traveling ahead. Then, based on the position information, stored in the RAM 42 in S14, of the pedestrians (warning objects), bicycles (warning objects), and other vehicles (warning objects) approaching from the left and right of the intersection, the CPU 41 executes a determination process for determining whether the other vehicle traveling ahead may contact a pedestrian (warning object) or bicycle (warning object) crossing the intersection or another vehicle (warning object) approaching from the left or right of the intersection.

If there is no possibility that the other vehicle traveling ahead will contact a pedestrian (warning object) or bicycle (warning object) crossing the intersection or another vehicle (warning object) approaching from the left or right of the intersection, for example, because it passes through the intersection without contacting a pedestrian or the like (S19: NO), the CPU 41 proceeds to S17.
On the other hand, if there is a possibility of such contact (S19: YES), the CPU 41 proceeds to S20.
In S20, the CPU 41 sets the other vehicle traveling in front of the host vehicle 2 as a recognition object for determining whether the driver recognizes it, and stores it in the RAM 42. Further, the CPU 41 reads the forward vehicle flag from the RAM 42, sets the forward vehicle flag to ON, stores it in the RAM 42 again, and proceeds to S21.
When the navigation device 1 is activated, the front vehicle flag is set to OFF and stored in the RAM 42.

Subsequently, in S21, the CPU 41 again detects the host vehicle position and heading via the current position detection processing unit 11, and stores the coordinate data representing the vehicle position and the vehicle heading in the RAM 42. Further, the CPU 41 detects the vehicle speed via the vehicle speed sensor 21 and stores it in the RAM 42. Further, the CPU 41 again measures the distance and relative traveling speed with respect to the other vehicle traveling ahead by means of the millimeter wave radar 62 attached to the center of the front end of the host vehicle 2, and stores them in the RAM 42.

In S22, the CPU 41 reads the pedestrian flag from the RAM 42. If the pedestrian flag is ON, the CPU 41 reads the recognition objects, such as the pedestrians (warning objects) set as recognition objects in S17, and executes a determination process for determining whether the driver recognizes these recognition objects.
If the pedestrian flag is OFF, the CPU 41 reads the feature flag from the RAM 42. If the feature flag is ON, the CPU 41 reads the recognition objects set in S18, namely the pedestrians and the like (warning objects) and the various features such as the traffic lights, and executes a determination process for determining whether the driver recognizes these recognition objects.
Further, if the feature flag is OFF, the CPU 41 reads the other vehicle traveling in front of the host vehicle 2, set as a recognition object in S20, and executes a determination process for determining whether the driver recognizes this recognition object.
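The dispatch performed in S22, selecting which set of recognition objects to test against the driver's gaze from the pedestrian, feature, and forward vehicle flags, can be sketched as follows (names are illustrative):

```python
def objects_to_check(flags, pedestrian_objs, feature_objs, forward_vehicle):
    """Sketch of the S22 dispatch over the flags set in S17/S18/S20."""
    if flags["pedestrian"]:          # S17: no features at the intersection
        return pedestrian_objs
    if flags["feature"]:             # S18: pedestrians plus features
        return pedestrian_objs + feature_objs
    if flags["forward_vehicle"]:     # S20: vehicle ahead may contact someone
        return [forward_vehicle]
    return []
```

Exactly one of the three flags is expected to be ON on any pass that reaches S22, so the order of the checks mirrors the order of the description.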

Specifically, when the pedestrian flag was set to ON in S17 and the pedestrians (warning objects) and bicycles (warning objects) crossing the intersection and the other vehicles (warning objects) approaching from the left and right of the intersection were set as recognition objects, the CPU 41 determines, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51, whether the driver is looking at these pedestrians and the like (warning objects). If the driver is looking at these pedestrians and the like (warning objects), the CPU 41 determines that the driver recognizes the pedestrians and the like (warning objects) at the intersection, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to ON, and stores it in the RAM 42 again.

When the navigation device is activated, the pedestrian recognition flag is set to OFF and stored in the RAM 42.
On the other hand, if the driver is not looking at these pedestrians and the like (warning objects), the CPU 41 determines that the driver does not recognize the pedestrians and the like (warning objects) at the intersection, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to OFF, and stores it in the RAM 42 again.

When the feature flag was set to ON in S18 and the pedestrians (warning objects) and bicycles (warning objects) crossing the intersection, the other vehicles (warning objects) approaching from the left and right of the intersection, and the various features such as the traffic lights ahead in the traveling direction at the intersection were set as recognition objects, the CPU 41 determines, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51, whether the driver is looking at any of these pedestrians and the like (warning objects) or any of the various features ahead in the traveling direction at the intersection. If the driver is looking at these pedestrians and the like (warning objects), the CPU 41 determines that the driver recognizes the pedestrians and the like (warning objects) at the intersection, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to ON, and stores it in the RAM 42 again. Further, if the driver is looking at the various features ahead in the traveling direction at the intersection, the CPU 41 determines that the driver recognizes those features, reads the feature recognition flag from the RAM 42, sets the feature recognition flag to ON, and stores it in the RAM 42 again.

Note that the feature recognition flag is set to OFF and stored in the RAM 42 when the navigation device is activated.
On the other hand, if the driver is not looking at these pedestrians and the like (warning objects), the CPU 41 determines that the driver does not recognize the pedestrians and the like (warning objects) at the intersection, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to OFF, and stores it in the RAM 42 again. If the driver is not looking at the various features ahead in the traveling direction at the intersection, the CPU 41 determines that the driver does not recognize those features, reads the feature recognition flag from the RAM 42, sets the feature recognition flag to OFF, and stores it in the RAM 42 again.

For example, as shown in FIG. 5, when the host vehicle 2 approaches the intersection 70 on the road 68 from the right side in FIG. 5, the CPU 41 detects, via the vehicle exterior cameras 53 provided at the front left end and front right end of the host vehicle 2, the other vehicle 75 (warning object) traveling on the road 69 orthogonal to the road 68 and entering the intersection 70, and the pedestrians 81 and 82 (warning objects) crossing the intersection 70, and sets them as recognition objects. Further, the CPU 41 sets the various features provided at the intersection 70, such as the traffic light 71A, the pedestrian crossing 73C, and the temporary stop line 72C, as recognition objects. Further, the CPU 41 reads the feature flag from the RAM 42, sets the feature flag to ON, and stores it in the RAM 42.

Then, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51, and on the feature information of the traffic light 71A, the pedestrian crossing 73C, and the temporary stop line 72C (for example, the installed coordinate position in latitude and longitude, the height from the ground, and so on), the CPU 41 detects the line-of-sight direction 91 in which the driver looks at the traffic light 71A ahead and the line-of-sight direction 92 in which the driver looks at the temporary stop line 72C and the pedestrian crossing 73C. At the same time, based on the detection signals of the driver's face direction and line-of-sight direction obtained in the same manner, the stored position information of the pedestrians 81 and 82 (warning objects) and the other vehicle 75 (warning object), and the host vehicle position, the CPU 41 detects the line-of-sight direction 101 in which the driver looks at the other vehicle 75 and the line-of-sight directions 102 and 103 in which the driver looks at the pedestrians 81 and 82.

Then, when the CPU 41 detects the line-of-sight directions 91 and 92 based on the detection signals of the driver's face direction and line-of-sight direction and the feature information of the traffic light 71A, the pedestrian crossing 73C, and the temporary stop line 72C (for example, the installed coordinate position in latitude and longitude, the height from the ground, and so on), it determines that the driver has recognized the traffic light 71A, the temporary stop line 72C, and the pedestrian crossing 73C, reads the feature recognition flag from the RAM 42, sets the feature recognition flag to ON, and stores it in the RAM 42 again. Likewise, when the CPU 41 detects the line-of-sight directions 101 to 103 based on the detection signals of the driver's face direction and line-of-sight direction and the position information of the pedestrians 81 and 82 (warning objects) and the other vehicle 75 (warning object), it determines that the driver has recognized the other vehicle 75 and the pedestrians 81 and 82, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to ON, and stores it in the RAM 42 again.

On the other hand, when the CPU 41 does not detect the line-of-sight directions 91 and 92 based on the detection signals of the driver's face direction and line-of-sight direction and the feature information of the traffic light 71A, the pedestrian crossing 73C, and the temporary stop line 72C (for example, the installed coordinate position in latitude and longitude, the height from the ground, and so on), it determines that the driver does not recognize the traffic light 71A, the temporary stop line 72C, and the pedestrian crossing 73C, reads the feature recognition flag from the RAM 42, sets the feature recognition flag to OFF, and stores it in the RAM 42 again. Likewise, when the CPU 41 does not detect the line-of-sight directions 101 to 103 based on the detection signals of the driver's face direction and line-of-sight direction and the position information of the pedestrians 81 and 82 (warning objects) and the other vehicle 75 (warning object), it determines that the driver does not recognize the other vehicle 75 and the pedestrians 81 and 82, reads the pedestrian recognition flag from the RAM 42, sets the pedestrian recognition flag to OFF, and stores it in the RAM 42 again.
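One simple way to decide whether a detected line-of-sight direction "hits" a feature or warning object, assuming positions are reduced to a 2-D plane, is to compare the gaze bearing with the bearing from the driver to the target. This is an illustrative sketch under that assumption, not the patented detection method:

```python
import math

def gaze_hits(driver_xy, gaze_deg, target_xy, tol_deg=5.0):
    """Return True if the gaze bearing points at the target within tol_deg.

    driver_xy, target_xy: (x, y) positions in a common plane (assumption).
    gaze_deg: driver's line-of-sight direction in degrees, counterclockwise
              from the +x axis (assumption; tolerance is also illustrative).
    """
    dx = target_xy[0] - driver_xy[0]
    dy = target_xy[1] - driver_xy[1]
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    # Smallest signed angular difference, handling the 360-degree wrap-around.
    diff = abs((gaze_deg - bearing + 180.0) % 360.0 - 180.0)
    return diff <= tol_deg
```

A real implementation would also use the feature's height from the ground to check the vertical gaze angle; the sketch keeps only the horizontal component for brevity.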

In S22, when another vehicle traveling in front of the host vehicle 2 was set as the recognition object in S20, the CPU 41 determines, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51 and the position information of the other vehicle traveling ahead obtained by image processing of the video signals of the vehicle exterior cameras 53, whether the driver is looking at the other vehicle traveling in front of the host vehicle 2. If the driver is looking at the other vehicle traveling in front of the host vehicle 2, the CPU 41 determines that the driver recognizes the other vehicle, reads the forward vehicle recognition flag from the RAM 42, sets the forward vehicle recognition flag to ON, and stores it in the RAM 42 again.

When the navigation device 1 is activated, the forward vehicle recognition flag is set to OFF and stored in the RAM 42.
On the other hand, if the driver is not looking at the other vehicle traveling in front of the host vehicle 2, the CPU 41 determines that the driver does not recognize the other vehicle, reads the forward vehicle recognition flag from the RAM 42, sets the forward vehicle recognition flag to OFF, and stores it in the RAM 42 again.

For example, as shown in FIG. 5, when the host vehicle 2 approaches the intersection 70 on the road 69 from the lower side in FIG. 5, the CPU 41 detects the other vehicle 76 traveling ahead by means of the millimeter wave radar 62 attached to the center of the front end of the host vehicle 2, measures the distance to the other vehicle 76 and the relative traveling speed, and stores them in the RAM 42. Further, the CPU 41 acquires the position information and traveling direction of the other vehicle 76 by performing image processing on the video signals of the vehicle exterior cameras 53 via the camera ECU 51, and stores them in the RAM 42. At the same time, the CPU 41 performs image processing on the video signals of the vehicle exterior cameras 53 via the camera ECU 51 to acquire the position information, traveling direction, and the like of the pedestrian 83 (warning object), and stores them in the RAM 42. When the CPU 41 determines, based on the position information of the other vehicle 76 traveling ahead and the position information of the pedestrian 83, that the other vehicle 76 may contact the pedestrian 83, it sets the other vehicle 76 as a recognition object. Further, the CPU 41 reads the forward vehicle flag from the RAM 42, sets the forward vehicle flag to ON, and stores it in the RAM 42.
Then, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51 and the position information of the other vehicle 76 traveling ahead, the CPU 41 detects the line-of-sight direction 93 in which the driver looks at the other vehicle 76 ahead. When the CPU 41 detects the line-of-sight direction 93, it determines that the driver has recognized the other vehicle 76 corresponding to the line-of-sight direction 93, reads the forward vehicle recognition flag from the RAM 42, sets the forward vehicle recognition flag to ON, and stores it in the RAM 42 again.
On the other hand, when the CPU 41 does not detect the line-of-sight direction 93, it determines that the driver does not recognize the other vehicle 76 corresponding to the line-of-sight direction 93, reads the forward vehicle recognition flag from the RAM 42, sets the forward vehicle recognition flag to OFF, and stores it in the RAM 42 again.

  Subsequently, in S23, the CPU 41 performs a determination process for determining whether or not to warn the driver of confirmation of a pedestrian or the like at the intersection based on the warning determination table 65 stored in the warning determination DB 26. Execute.

Specifically, in S23, the CPU 41 calculates the distance from the vehicle position acquired in S21 to the intersection ahead in the traveling direction, that is, the predicted remaining collision distance. Then, the CPU 41 divides this predicted remaining collision distance by the vehicle speed acquired in S21 to calculate the time to reach the intersection ahead, that is, the remaining collision time, and stores it in the RAM 42.
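The remaining collision time computation described above is a simple division; a minimal sketch follows (the zero-speed guard is an added assumption, not stated in the description):

```python
def remaining_collision_time(remaining_distance_m, vehicle_speed_mps):
    """Sketch of the S23 computation: predicted remaining collision distance
    divided by the current vehicle speed gives the remaining collision time."""
    if vehicle_speed_mps <= 0.0:
        return float("inf")  # stationary vehicle: no predicted collision (assumption)
    return remaining_distance_m / vehicle_speed_mps
```

For example, 100 m to the intersection at 20 m/s (72 km/h) yields a remaining collision time of 5 seconds, the boundary between the intermittent-sound and single-sound bands of table 65.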
Subsequently, the CPU 41 reads the warning determination table 65 from the warning determination DB 26.
Then, the CPU 41 reads the remaining collision time from the RAM 42. If the remaining collision time is longer than the "7 seconds" of the remaining collision time 65A in the warning determination table 65, the CPU 41 determines that there is no need to warn the driver to confirm pedestrians and the like at the intersection (S23: NO), reads the forward vehicle flag, pedestrian flag, feature flag, pedestrian recognition flag, feature recognition flag, and forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them in the RAM 42 again, and ends the process.

Further, the CPU 41 reads the remaining collision time from the RAM 42. If the remaining collision time is 7 seconds or less, the CPU 41 reads the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42 and determines whether any of these flags is ON.
If any of these flags is ON, the CPU 41 reads from the warning determination table 65 the warning content 65C "None" whose recognition 65B is "○": the entry corresponding to "7 seconds" of the remaining collision time 65A when the remaining collision time is longer than 5 seconds and 7 seconds or less, the entry corresponding to "5 seconds" when it is longer than 2 seconds and 5 seconds or less, and the entry corresponding to "2 seconds" when it is 2 seconds or less. The CPU 41 then determines that the driver need not be warned to confirm pedestrians and the like at the intersection (S23: NO), and sets the output of the speaker 16 to OFF. The CPU 41 reads the forward vehicle flag, pedestrian flag, feature flag, pedestrian recognition flag, feature recognition flag, and forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them in the RAM 42 again, and ends the process. Accordingly, if a warning sound is being output via the speaker 16, the CPU 41 stops the warning sound and ends the process.

On the other hand, if all of these flags are OFF, the CPU 41 determines that it is necessary to warn the driver to confirm pedestrians and the like at the intersection (S23: YES), and proceeds to S24.
In S24, the CPU 41 reads the remaining collision time from the RAM 42. If the remaining collision time is longer than 5 seconds and 7 seconds or less, the CPU 41 reads from the warning determination table 65 the warning content 65C "Pong" whose recognition 65B is "×", corresponding to "7 seconds" of the remaining collision time 65A, and outputs the warning sound "Pong" via the speaker 16. The CPU 41 then reads the forward vehicle flag, pedestrian flag, feature flag, pedestrian recognition flag, feature recognition flag, and forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them in the RAM 42 again, and ends the process.

Further, if the remaining collision time read from the RAM 42 is longer than 2 seconds and 5 seconds or less, the CPU 41 reads from the warning determination table 65 the warning content 65C "beep, beep, ... (intermittent sound)" whose recognition 65B is "×", corresponding to "5 seconds" of the remaining collision time 65A, and outputs the intermittent sound "beep, beep, ..." as a warning sound via the speaker 16. The CPU 41 then reads the forward vehicle flag, pedestrian flag, feature flag, pedestrian recognition flag, feature recognition flag, and forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them in the RAM 42 again, and ends the process.
Further, if the remaining collision time read from the RAM 42 is 2 seconds or less, the CPU 41 reads from the warning determination table 65 the warning content 65C "Pii (continuous sound)" whose recognition 65B is "×", corresponding to "2 seconds" of the remaining collision time 65A, and outputs the continuous sound "Pii" as a warning sound via the speaker 16. The CPU 41 then reads the forward vehicle flag, pedestrian flag, feature flag, pedestrian recognition flag, feature recognition flag, and forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them in the RAM 42 again, and ends the process.
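Putting S23 and S24 together, the warning decision can be sketched as follows. The function is illustrative and assumes, per the description, that any one recognition flag being ON suppresses the warning:

```python
def decide_warning(remaining_time_s, ped_seen, feature_seen, forward_seen):
    """Sketch of S23-S24: pick the warning sound, or None for no warning.

    remaining_time_s: remaining collision time from S23.
    *_seen: the pedestrian / feature / forward vehicle recognition flags.
    """
    if remaining_time_s > 7.0:
        return None                 # too far away: no warning needed yet
    if ped_seen or feature_seen or forward_seen:
        return None                 # driver is aware (S23: NO), speaker OFF
    if remaining_time_s > 5.0:
        return "Pong"               # single attention sound
    if remaining_time_s > 2.0:
        return "beep, beep, ... (intermittent sound)"
    return "Pii"                    # continuous sound, highest urgency
```

The escalating sound bands mirror the 7/5/2-second rows of the warning determination table 65.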

As described above in detail, in the navigation device 1 according to the first embodiment, when various features exist at the intersection ahead in the traveling direction of the host vehicle 2, the CPU 41 determines whether the driver recognizes each feature at the intersection based on the feature information of these features (for example, traffic lights and stop signs), such as the installed coordinate position in latitude and longitude and the height from the ground, and on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51 (S16: YES, S18, S21 to S22). When the CPU 41 determines that the driver recognizes each feature at the intersection, it turns ON the feature recognition flag and does not issue a warning to the driver via the speaker 16 (S22 to S23: NO).

As a result, even if the driver does not recognize pedestrians or other vehicles approaching from the left and right near the intersection, if the driver recognizes features such as the traffic light installed at the intersection or the stop line ahead, the driver will stop temporarily before the intersection, so there is no need to issue a warning via the speaker 16. A warning prompting confirmation of pedestrians and the like at the intersection is thus issued only when necessary, making it possible to assist the driver's driving appropriately.

When another vehicle traveling in front of the host vehicle 2 is set as a recognition object, the CPU 41 determines, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51, whether the driver is looking at the other vehicle traveling in front of the host vehicle 2 (S15: YES, S19: YES, S20 to S22). When the CPU 41 determines that the driver recognizes the other vehicle traveling ahead at the intersection, it turns ON the forward vehicle recognition flag and does not issue a warning to the driver via the speaker 16 (S22 to S23: NO).

Thereby, near the intersection, when the driver recognizes the other vehicle traveling ahead, even if the driver does not recognize pedestrians, other vehicles approaching from the left and right, or features such as traffic lights, the driver will stop temporarily behind the other vehicle when that vehicle stops, so there is no need to issue a warning via the speaker 16. A warning prompting confirmation of pedestrians and the like at the intersection is thus issued only when necessary, making it possible to support the driver's driving more appropriately.

In addition, when there are no features (for example, traffic lights, stop signs, temporary stop lines, and the like) at the intersection ahead in the traveling direction of the host vehicle 2 and no other vehicle traveling in front of the host vehicle 2, the CPU 41 determines, based on the detection signals of the driver's face direction and line-of-sight direction obtained by performing image processing on the video signal of the driver camera 52 via the camera ECU 51, whether the driver is looking at the pedestrians (warning objects) or bicycles (warning objects) crossing the intersection or the other vehicles (warning objects) approaching from the left and right of the intersection (S15: NO, S16: NO, S17, S21 to S22). When the CPU 41 determines that the driver recognizes the pedestrians and the like (warning objects) at the intersection, it turns ON the pedestrian recognition flag and does not issue a warning to the driver via the speaker 16 (S22 to S23: NO).

Accordingly, even if the driver does not recognize a feature (for example, a traffic light, a stop sign, or a stop line) or another vehicle traveling ahead, if the driver recognizes the pedestrians and the like (warning objects) near the intersection, the driver will stop temporarily before the pedestrians and the like (warning objects), so there is no need to issue a warning via the speaker 16. For this reason, a warning prompting confirmation of pedestrians and the like at the intersection is issued only when necessary, making it possible to support the driver's driving still more appropriately.

In S24, when the remaining collision time is longer than 2 seconds and 5 seconds or less, the CPU 41 reads the warning content 65C "beep, beep, ... (intermittent sound)" and outputs the intermittent sound "beep, beep, ..." as a warning sound via the speaker 16. When the remaining collision time is 2 seconds or less, the CPU 41 reads the warning content 65C "Pii (continuous sound)" and outputs the continuous sound "Pii" as a warning sound via the speaker 16.
Thus, since the warning sound differs depending on whether the remaining collision time is 5 seconds or less or 2 seconds or less, the driver can easily recognize from the warning content how urgently a temporary stop before the warning object is needed, making it possible to assist the driver in driving more safely.

  The feature information includes feature information on painted road-surface markings such as stop lines and pedestrian crossings provided on each road, and feature information on three-dimensional objects such as traffic lights and stop signs provided along each road. The CPU 41 can therefore determine, by detecting the driver's line-of-sight direction via the driver camera 52, whether the driver recognizes both planar painted markings such as a stop line on the road surface and three-dimensional objects such as a stop sign or traffic light along the road, further increasing the reliability of the determination of whether the driver recognizes the feature.

Next, a navigation device 111 according to the second embodiment will be described with reference to FIGS. 6 and 7.
In the following description, the same reference numerals as those used for the configuration of the navigation apparatus 1 according to the first embodiment in FIGS. 1 to 5 denote parts that are the same as or correspond to those of the first embodiment.
FIG. 6 is a flowchart illustrating the warning process, executed by the CPU 41 of the navigation device 111 according to the second embodiment, for warning the driver of confirmation of a pedestrian or the like at an intersection. FIG. 7 is a diagram illustrating an example of the feature warning determination table 121 stored in the warning determination DB 26 of the navigation device 111 according to the second embodiment.

The schematic configuration and control configuration of the navigation device 111 according to the second embodiment are substantially the same as those of the navigation device 1 according to the first embodiment.
However, the navigation device 111 according to the second embodiment differs from the navigation apparatus 1 according to the first embodiment in that the feature warning determination table 121 shown in FIG. 7 is stored in the warning determination DB 26 together with the warning determination table 65, and in that, as shown in FIG. 6, the processes of S123 and S124 are executed instead of the process of S24.

Hereinafter, the processing executed by the CPU 41 of the navigation device 111 according to the second embodiment in S123 and S124 will be specifically described.
In S123, the CPU 41 executes a determination process for determining whether or not to warn the driver of confirmation of a pedestrian or the like at the intersection, based on the warning determination table 65 and the feature warning determination table 121 stored in the warning determination DB 26.

  Specifically, in S123, the CPU 41 calculates the distance from the vehicle position acquired in S21 to the intersection ahead in the traveling direction, that is, the predicted remaining collision distance. The CPU 41 then divides this predicted remaining collision distance by the vehicle speed acquired in S21 to calculate the time to reach the intersection ahead, that is, the remaining collision time, and stores it in the RAM 42.
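The calculation above is a single division; the following minimal sketch assumes SI units and a hypothetical function name not taken from the patent.

```python
def remaining_collision_time(distance_to_intersection_m: float,
                             vehicle_speed_mps: float) -> float:
    """Predicted remaining time (s) to reach the intersection ahead:
    the predicted remaining collision distance divided by vehicle speed."""
    if vehicle_speed_mps <= 0.0:
        # Vehicle stopped (or reversing): no meaningful collision prediction.
        return float("inf")
    return distance_to_intersection_m / vehicle_speed_mps
```

For example, 50 m to the intersection at 10 m/s gives a remaining collision time of 5 seconds, which falls in the intermediate warning band of the tables.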

Subsequently, the CPU 41 reads the feature flag from the RAM 42 and, when the feature flag is OFF, reads the warning determination table 65 from the warning determination DB 26. That is, the CPU 41 reads the warning determination table 65 from the warning determination DB 26 when another vehicle traveling ahead of the host vehicle 2 is set as a recognition object, or when only pedestrians (warning objects) or bicycles (warning objects) crossing the intersection and other vehicles (warning objects) approaching from the left and right of the intersection are set as recognition objects.
When the feature flag is OFF, the CPU 41 reads the remaining collision time from the RAM 42 and, when the remaining collision time is longer than the "7 seconds" of the remaining collision time 65A of the warning determination table 65, determines that there is no need to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: NO) and ends the process.

When the feature flag is OFF and the remaining collision time read from the RAM 42 is 7 seconds or less, the CPU 41 reads the pedestrian recognition flag and the forward vehicle recognition flag from the RAM 42 and determines whether either flag is ON.
When either flag is ON, the CPU 41 reads from the warning determination table 65 the warning content 65C "none" for the recognition 65B of "○": the entry corresponding to "7 seconds" of the remaining collision time 65A when the remaining collision time is longer than 5 seconds and 7 seconds or less, the entry corresponding to "5 seconds" when the remaining collision time is longer than 2 seconds and 5 seconds or less, and the entry corresponding to "2 seconds" when the remaining collision time is 2 seconds or less. The CPU 41 then determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO) and sets the output of the speaker 16 to OFF. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process. Accordingly, if a warning sound is being output through the speaker 16, the CPU 41 stops the warning sound and ends the process.

On the other hand, when both the pedestrian recognition flag and the forward vehicle recognition flag are OFF, the CPU 41 determines that it is necessary to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: YES), and the process proceeds to S124.
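The feature-flag-OFF branch of S123 described above amounts to: warn only when the remaining collision time is 7 seconds or less and neither recognition flag is ON. A sketch of that decision, with illustrative names:

```python
def need_warning_feature_off(remaining_time_s: float,
                             pedestrian_recognized: bool,
                             front_vehicle_recognized: bool) -> bool:
    """S123 decision when the feature flag is OFF (warning table 65 sketch)."""
    if remaining_time_s > 7.0:
        return False  # intersection still too far ahead to warn
    # Recognition of either the warning object or the vehicle ahead
    # suppresses the warning; otherwise proceed to S124 and warn.
    return not (pedestrian_recognized or front_vehicle_recognized)
```

The design choice here is that a recognized vehicle ahead stands in for the driver's attention: following a lead vehicle implies the driver will brake with it, so no separate warning is needed.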
In S124, the CPU 41 reads the remaining collision time from the RAM 42 and, when the remaining collision time is longer than 5 seconds and 7 seconds or less, reads from the warning determination table 65 the warning content 65C "pong" for the recognition 65B of "×" corresponding to "7 seconds" of the remaining collision time 65A, and outputs the warning sound "pong" through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

Further, when the remaining collision time read from the RAM 42 is longer than 2 seconds and 5 seconds or less, the CPU 41 reads from the warning determination table 65 the warning content 65C "beep, beep, ... (intermittent sound)" for the recognition 65B of "×" corresponding to "5 seconds" of the remaining collision time 65A, and outputs the intermittent sound "beep, beep, ..." as a warning sound through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.
When the remaining collision time is 2 seconds or less, the CPU 41 reads from the warning determination table 65 the warning content 65C "beep (continuous sound)" for the recognition 65B of "×" corresponding to "2 seconds" of the remaining collision time 65A, and outputs the continuous sound "beep" as a warning sound through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

On the other hand, when the feature flag read from the RAM 42 in S123 is ON, the CPU 41 reads the feature warning determination table 121 from the warning determination DB 26. That is, the CPU 41 reads the feature warning determination table 121 from the warning determination DB 26 when various features such as the traffic signal ahead are set as recognition objects together with pedestrians (warning objects) and bicycles (warning objects) crossing the intersection and other vehicles (warning objects) approaching from the left and right of the intersection.
When the feature flag is ON and the remaining collision time read from the RAM 42 is longer than the "7 seconds" of the remaining collision time 121A of the feature warning determination table 121, the CPU 41 determines that there is no need to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: NO), reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  When the feature flag is ON and the remaining collision time read from the RAM 42 is 7 seconds or less, the CPU 41 reads the pedestrian recognition flag and the feature recognition flag from the RAM 42, determines whether one or both of the flags are ON, reads the corresponding warning content 121C from the feature warning determination table 121, and determines whether or not to warn the driver of confirmation of a pedestrian or the like at the intersection.

  For example, when one or both of the pedestrian recognition flag and the feature recognition flag are ON and the remaining collision time is longer than 5 seconds (predetermined first time) and 7 seconds or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "7 seconds" of the remaining collision time 121A in which the "warning object" or the "feature" is "○", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), and sets the output of the speaker 16 to OFF. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  On the other hand, when both the pedestrian recognition flag and the feature recognition flag are OFF and the remaining collision time is longer than 5 seconds (predetermined first time) and 7 seconds or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "pong" for the recognition 121B corresponding to "7 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", determines that it is necessary to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: YES), and the process proceeds to S124.

  In S124, the CPU 41 reads again from the feature warning determination table 121 the warning content 121C "pong" for the recognition 121B corresponding to "7 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", and outputs the warning sound "pong" through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  Further, when the pedestrian recognition flag of the two flags is ON and the remaining collision time is longer than 2 seconds (predetermined second time) and 5 seconds or less (predetermined first time), the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "5 seconds" of the remaining collision time 121A in which the "warning object" is "○" and the "feature" is "○", or the "warning object" is "○" and the "feature" is "×", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), and sets the output of the speaker 16 to OFF. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process. Accordingly, if a warning sound is being output through the speaker 16, the CPU 41 stops the warning sound and ends the process.

  On the other hand, when the pedestrian recognition flag of the two flags is OFF and the remaining collision time is longer than 2 seconds (predetermined second time) and 5 seconds or less (predetermined first time), the CPU 41 reads from the feature warning determination table 121 the warning content 121C "beep, beep, ... (intermittent sound)" for the recognition 121B corresponding to "5 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", or the "warning object" is "×" and the "feature" is "○", determines that it is necessary to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: YES), and the process proceeds to S124.

  In S124, the CPU 41 reads again from the feature warning determination table 121 the warning content 121C "beep, beep, ... (intermittent sound)" for the recognition 121B corresponding to "5 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", or the "warning object" is "×" and the "feature" is "○", and outputs the intermittent sound "beep, beep, ..." as a warning sound through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  However, when the pedestrian recognition flag is OFF, the feature recognition flag is ON, and it is further detected via the vehicle speed sensor 21 that the vehicle has decelerated, the CPU 41 determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO) and sets the output of the speaker 16 to OFF. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  When both the pedestrian recognition flag and the feature recognition flag are ON and the remaining collision time is 2 seconds (predetermined second time) or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "2 seconds" of the remaining collision time 121A in which the "warning object" is "○" and the "feature" is "○", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), and sets the output of the speaker 16 to OFF. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process. Accordingly, if a warning sound is being output through the speaker 16, the CPU 41 stops the warning sound and ends the process.

  On the other hand, when one or both of the pedestrian recognition flag and the feature recognition flag are OFF and the remaining collision time is 2 seconds (predetermined second time) or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "beep (continuous sound)" for the recognition 121B corresponding to "2 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", the "warning object" is "×" and the "feature" is "○", or the "warning object" is "○" and the "feature" is "×", determines that it is necessary to warn the driver of confirmation of a pedestrian or the like at the intersection (S123: YES), and the process proceeds to S124.

  In S124, the CPU 41 reads again from the feature warning determination table 121 the warning content 121C "beep (continuous sound)" for the recognition 121B corresponding to "2 seconds" of the remaining collision time 121A in which the "warning object" is "×" and the "feature" is "×", the "warning object" is "○" and the "feature" is "×", or the "warning object" is "×" and the "feature" is "○", and outputs the continuous sound "beep" as a warning sound through the speaker 16. The CPU 41 then reads the forward vehicle flag, the pedestrian flag, the feature flag, the pedestrian recognition flag, the feature recognition flag, and the forward vehicle recognition flag from the RAM 42, sets each flag to OFF, stores them again in the RAM 42, and ends the process.

  As described above in detail, in the navigation device 111 according to the second embodiment, when the feature flag read from the RAM 42 in S123 is ON, one or both of the pedestrian recognition flag and the feature recognition flag are ON, and the remaining collision time is longer than 5 seconds and 7 seconds or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "7 seconds" of the remaining collision time 121A in which the "warning object" or the "feature" is "○", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), sets the output of the speaker 16 to OFF, and ends the process. When the feature flag read from the RAM 42 in S123 is ON, the pedestrian recognition flag of the two flags is ON, and the remaining collision time is longer than 2 seconds and 5 seconds or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "5 seconds" of the remaining collision time 121A in which the "warning object" is "○" and the "feature" is "○", or the "warning object" is "○" and the "feature" is "×", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), sets the output of the speaker 16 to OFF, and ends the process.

  As a result, near the intersection, when the remaining collision time until collision with a pedestrian or the like (warning object) is longer than 5 seconds and 7 seconds or less, no warning via the speaker 16 is necessary when the driver recognizes either the feature (for example, a traffic light, a stop sign, a stop line, etc.) or the pedestrian or the like (warning object), and accordingly no warning is given via the speaker 16. Further, when the remaining collision time is longer than 2 seconds and 5 seconds or less, the warning is withheld only when it is determined that the driver recognizes the pedestrian or the like (warning object), or when it is determined that the driver recognizes the feature (for example, a traffic light, a stop sign, a stop line, etc.) and the vehicle speed is also decreasing. For this reason, whether the driver will stop temporarily before the pedestrian or the like (warning object) can be judged more accurately before giving a warning through the speaker 16, so that the driver can be supported in driving more safely.

When the feature flag read from the RAM 42 in S123 is ON, both the pedestrian recognition flag and the feature recognition flag are ON, and the remaining collision time is 2 seconds or less, the CPU 41 reads from the feature warning determination table 121 the warning content 121C "none" for the recognition 121B corresponding to "2 seconds" of the remaining collision time 121A in which the "warning object" is "○" and the "feature" is "○", determines that the driver is not to be warned of confirmation of a pedestrian or the like at the intersection (S123: NO), sets the output of the speaker 16 to OFF, and ends the process.
As a result, near the intersection, when the remaining collision time until collision with a pedestrian or the like (warning object) is 2 seconds or less, the warning is withheld only when the driver recognizes both the feature at the intersection (for example, a traffic light, a stop sign, a stop line, etc.) and the pedestrian or the like (warning object). For this reason, whether the driver will surely stop once before the pedestrian or the like (warning object) can be determined accurately, so that the driver can be supported in driving more safely.

Further, when the feature flag read from the RAM 42 in S123 is ON, the remaining collision time is 5 seconds or less, and only the feature recognition flag is ON, that is, only the "feature" of the recognition 121B is "○", the CPU 41 in S124 reads the warning content 121C "beep, beep, ... (intermittent sound)" and outputs the intermittent sound "beep, beep, ..." as a warning sound through the speaker 16 (unless deceleration of the vehicle is detected, as described above). When the remaining collision time is 2 seconds or less and only the feature recognition flag is ON, the CPU 41 reads the warning content 121C "beep (continuous sound)" and outputs the continuous sound "beep" as a warning sound through the speaker 16.
As a result, when the remaining collision time is 5 seconds or less, even if the driver recognizes the feature ahead in the traveling direction, the driver is warned to stop once before the warning object such as a pedestrian, so that the driver can be supported in driving more safely. Also, since the warning sound differs between a remaining collision time of 5 seconds or less and one of 2 seconds or less, the driver can easily recognize from the warning content the urgency of stopping temporarily before the warning object, so that the driver can be supported in driving more safely.
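The feature-flag-ON decision of S123/S124 described over the preceding paragraphs can be collected into a single table-driven sketch. This is an interpretation of the time bands and recognition conditions above, not code from the patent; names are illustrative, and the deceleration check of the 2-to-5-second band appears as the `decelerating` parameter.

```python
def warning_content_feature_on(remaining_time_s: float,
                               pedestrian_recognized: bool,
                               feature_recognized: bool,
                               decelerating: bool = False) -> str:
    """Warning content when the feature flag is ON (table 121 sketch)."""
    if remaining_time_s > 7.0:
        return "none"  # intersection too far ahead to warn
    if remaining_time_s > 5.0:
        # First band: recognizing either the warning object or the
        # feature is enough to suppress the warning.
        if pedestrian_recognized or feature_recognized:
            return "none"
        return "pong"
    if remaining_time_s > 2.0:
        # Second band: pedestrian recognition suppresses; feature
        # recognition alone suppresses only with detected deceleration.
        if pedestrian_recognized or (feature_recognized and decelerating):
            return "none"
        return "beep, beep, ... (intermittent sound)"
    # Final band (2 s or less): only recognition of BOTH suppresses.
    if pedestrian_recognized and feature_recognized:
        return "none"
    return "beep (continuous sound)"
```

The structure makes the stated safety policy explicit: the closer the vehicle gets, the more evidence of driver awareness is required before the warning is withheld.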

  The present invention is not limited to Example 1 and Example 2 described above; various improvements and modifications are of course possible without departing from the gist of the present invention. For example, the following modifications may be used.

  (A) In S12, the CPU 41 may determine whether or not a traffic light, a stop sign, a stop line, or a pedestrian crossing exists at a position within a predetermined distance (for example, within about 200 m to 300 m) from the host vehicle 2, ahead in the traveling direction of the straight road on which the host vehicle 2 is traveling. When a traffic light, a stop sign, a stop line, or a pedestrian crossing exists at such a position (S12: YES), the CPU 41 may execute the processes from S13 onward. Thereby, even when a pedestrian crossing or the like is provided on a straight road, the driver can be warned of confirmation of a pedestrian or the like at the crossing at an appropriate timing.
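The range check in modification (A) can be sketched as a simple filter over map features ahead. The feature representation and the 300 m default below are illustrative assumptions, not the patent's data format.

```python
def feature_within_range(features, own_position_m, max_distance_m=300.0):
    """True if any feature (traffic light, stop sign, stop line,
    pedestrian crossing) lies ahead within the predetermined distance.

    features: list of (kind, position_m) tuples measured along the road,
    in the same coordinate as own_position_m (hypothetical layout).
    """
    return any(own_position_m < pos <= own_position_m + max_distance_m
               for _kind, pos in features)
```

A feature 250 m ahead would trigger S13 onward; one 400 m ahead would not, avoiding premature warnings on long straight roads.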

  (B) In S24 and S124 described above, together with the intermittent warning sound "beep, beep, ...", the driver's seat may be vibrated slightly, a pedestrian warning mark may be displayed on the liquid crystal display 15, or a warning light may be turned on. Together with the continuous warning sound "beep", the driver's seat may be vibrated strongly, the pedestrian warning mark may blink on the liquid crystal display 15, and the warning light may blink. Thereby, the driver can be made to recognize the warning of confirmation of a pedestrian or the like more reliably.

BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic configuration diagram of a vehicle on which the navigation apparatus according to Example 1 is mounted. FIG. 2 is a block diagram schematically showing the control system centered on the navigation apparatus mounted on the vehicle. FIG. 3 is a diagram showing an example of the warning determination table stored in the warning determination DB. FIG. 4 is a flowchart showing the warning process, executed by the CPU of the navigation apparatus, for warning the driver of confirmation of a pedestrian or the like at an intersection. FIG. 5 is a diagram explaining an example of a state in which the driver recognizes a feature such as a traffic signal at an intersection. FIG. 6 is a flowchart showing the warning process, executed by the CPU of the navigation apparatus according to Example 2, for warning the driver of confirmation of a pedestrian or the like at an intersection. FIG. 7 is a diagram showing an example of the feature warning determination table stored in the warning determination DB of the navigation apparatus according to Example 2.

Explanation of symbols

1 Navigation device
2 Vehicle (host vehicle)
11 Current location detection processing unit
13 Navigation control unit
16 Speaker
21 Vehicle speed sensor
25 Map information DB
26 Warning determination DB
41 CPU
42 RAM
43 ROM
51 Camera ECU
52 Driver camera
53 Outside camera
61 Radar ECU
62 Millimeter wave radar
65 Warning determination table
68, 69 Road
70 Intersection
71A-71D Traffic lights
72A-72D Stop lines
73A-73D Pedestrian crossings
75, 76 Other vehicles
81-83 Pedestrians
91-93, 101-103 Gaze directions
121 Feature warning determination table

Claims (3)

  1. A driving support device that detects a warning object and warns the driver, comprising:
    map information storage means for storing map information including feature information about features;
    vehicle position detection means for detecting the vehicle position of the vehicle;
    gaze detection means for detecting the line-of-sight direction of the driver driving the vehicle;
    feature information acquisition means for acquiring, based on the map information, feature information about a feature at the intersection ahead of the vehicle position in the traveling direction;
    feature recognition determination means for determining, based on the feature information acquired by the feature information acquisition means and the driver's line-of-sight direction detected by the gaze detection means, whether or not the driver recognizes the feature at the intersection ahead in the traveling direction;
    warning object information acquisition means for acquiring warning object information including position information of a warning object at the intersection ahead of the vehicle position in the traveling direction;
    warning object recognition determination means for determining, based on the warning object information acquired by the warning object information acquisition means and the driver's line-of-sight direction detected by the gaze detection means, whether or not the driver recognizes the warning object at the intersection ahead in the traveling direction;
    vehicle speed detection means for detecting the vehicle speed of the vehicle;
    distance calculation means for calculating the distance from the vehicle position to the intersection ahead in the traveling direction;
    collision remaining time calculation means for calculating a predicted remaining time from the vehicle position to the intersection, based on the vehicle speed detected by the vehicle speed detection means and the distance calculated by the distance calculation means; and
    warning control means for controlling so that a warning is given to the driver when the predicted remaining time calculated by the collision remaining time calculation means is longer than a predetermined first time and it is determined both that the driver does not recognize the feature ahead in the traveling direction and that the driver does not recognize the warning object;
    so that a warning is given to the driver, regardless of whether the driver recognizes the feature ahead in the traveling direction, when the predicted remaining time is equal to or less than the predetermined first time and longer than a predetermined second time shorter than the predetermined first time, and it is determined that the driver does not recognize the warning object; and
    so that a warning is given to the driver when the predicted remaining time is equal to or less than the predetermined second time and it is determined that the driver does not recognize the feature ahead in the traveling direction or that the driver does not recognize the warning object;
    wherein the warning control means changes the warning content between the case where the predicted remaining time is longer than the predetermined first time, the case where the predicted remaining time is equal to or less than the predetermined first time and longer than the predetermined second time, and the case where the predicted remaining time is equal to or less than the predetermined second time.
  2. The driving support device according to claim 1, further comprising:
    other vehicle information acquisition means for acquiring other vehicle information including position information of another vehicle traveling ahead of the vehicle; and
    other vehicle recognition determination means for determining, based on the other vehicle information acquired by the other vehicle information acquisition means and the driver's line-of-sight direction detected by the gaze detection means, whether or not the driver recognizes the other vehicle;
    wherein the warning control means controls so that the warning given to the driver is suppressed when it is determined that the driver recognizes the other vehicle.
  3. The driving support device according to claim 1 or claim 2, wherein the feature includes a painted marking provided on the road surface and a three-dimensional object provided along the road.
JP2006309392A 2006-11-15 2006-11-15 Driving assistance device Expired - Fee Related JP4929997B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006309392A JP4929997B2 (en) 2006-11-15 2006-11-15 Driving assistance device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006309392A JP4929997B2 (en) 2006-11-15 2006-11-15 Driving assistance device

Publications (2)

Publication Number Publication Date
JP2008123443A JP2008123443A (en) 2008-05-29
JP4929997B2 true JP4929997B2 (en) 2012-05-09

Family

ID=39508103

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006309392A Expired - Fee Related JP4929997B2 (en) 2006-11-15 2006-11-15 Driving assistance device

Country Status (1)

Country Link
JP (1) JP4929997B2 (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4973627B2 (en) * 2008-08-21 2012-07-11 株式会社デンソー Driving behavior estimation device
JP2010191517A (en) * 2009-02-16 2010-09-02 Toyota Motor Corp Driving support device
JP5338371B2 (en) * 2009-02-24 2013-11-13 トヨタ自動車株式会社 Vehicle alarm device
CN102859567B (en) * 2010-04-19 2015-06-03 本田技研工业株式会社 Device for monitoring vicinity of vehicle
JP2012155376A (en) * 2011-01-24 2012-08-16 Sanyo Electric Co Ltd Mobile communication device and control determination method
JP5657809B2 (en) * 2011-10-06 2015-01-21 本田技研工業株式会社 Armpit detector
JP5846106B2 (en) * 2012-11-13 2016-01-20 トヨタ自動車株式会社 Driving support device and driving support method
US10279742B2 (en) 2014-05-29 2019-05-07 Nikon Corporation Image capture device and vehicle
JP6330537B2 (en) * 2014-07-14 2018-05-30 株式会社デンソー Driving assistance device
JP6451576B2 (en) * 2015-09-18 2019-01-16 株式会社ニコン Imaging device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000046574A (en) * 1998-07-24 2000-02-18 Honda Motor Co Ltd Navigation device for vehicle
JP3991928B2 (en) * 2003-06-17 2007-10-17 日産自動車株式会社 The vehicle collision avoidance control device
JP2006146429A (en) * 2004-11-17 2006-06-08 Mitsubishi Electric Corp Automatic tracking device
JP4525915B2 (en) * 2005-02-16 2010-08-18 株式会社デンソー Driving support system

Also Published As

Publication number Publication date
JP2008123443A (en) 2008-05-29

Similar Documents

Publication Publication Date Title
US8346426B1 (en) User interface for displaying internal state of autonomous driving system
US7362241B2 (en) Vehicle proximity warning apparatus, method and program
EP1959236B1 (en) Lane determining device and lane determining method
EP1136792B1 (en) Dangerous area alarm system
US7862177B2 (en) Image forming system
US6249214B1 (en) Image processing apparatus, image processing method, navigation apparatus, program storage device and computer data signal embodied in carrier wave
EP2028447A1 (en) Navigation device and navigation method
KR19990072978A (en) Device and method for displaying information of vehicle position
JP5973447B2 (en) Zone driving
JP4743496B2 (en) Navigation device and navigation method
JP4792866B2 (en) Navigation system
US20080015772A1 (en) Drive-assist information providing system for driver of vehicle
EP2601480B1 (en) Method and device for determining the position of a vehicle on a carriageway and motor vehicle having such a device
US8024115B2 (en) Navigation apparatus, method and program for vehicle
JP4432801B2 (en) Driving support system
US8346473B2 (en) Lane determining device, lane determining method and navigation apparatus using the same
JP4967015B2 (en) Safe driving support device
US7877187B2 (en) Driving support method and device
CN1991312B (en) Route guidance system and route guidance method
US7429825B2 (en) Headlight beam control system and headlight beam control method
KR100766677B1 (en) Navigation Apparatus
JP5234691B2 (en) Navigation device, probe information transmission method, program, and traffic information creation device
EP1793204A1 (en) System for and method of providing lane guidance
WO2007132860A1 (en) Object recognition device
JP3966170B2 (en) Driving support system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20090717

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A711

Effective date: 20100317

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A821

Effective date: 20100317

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110411

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110419

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110617

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120117

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120130

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150224

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees