JP4888761B2 - Virtual lane display device - Google Patents

Virtual lane display device

Info

Publication number
JP4888761B2
JP4888761B2
Authority
JP
Japan
Prior art keywords
vehicle
virtual lane
display
visibility
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2005316478A
Other languages
Japanese (ja)
Other versions
JP2007122578A (en)
Inventor
正夫 川合
竜弥 村松
洋一 野本
Original Assignee
株式会社エクォス・リサーチ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社エクォス・リサーチ filed Critical 株式会社エクォス・リサーチ
Priority to JP2005316478A priority Critical patent/JP4888761B2/en
Publication of JP2007122578A publication Critical patent/JP2007122578A/en
Application granted granted Critical
Publication of JP4888761B2 publication Critical patent/JP4888761B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Description

  The present invention relates to a virtual lane display device, and more particularly to a vehicle control device that displays a virtual lane according to visibility in front of the vehicle.

A technique for improving visibility when visibility in front of the vehicle is poor has been proposed.
For example, Patent Document 1 proposes a technique that enables lane recognition even when it is raining or snowing by storing the shape of the lane and displaying it on a head-up display.
Whereas the technique described in Patent Document 1 always displays the lane shape, Patent Document 2 proposes a technique that hides the road-shape display on the windshield when it is determined that the driver can see the road shape, that is, a technique that displays the shape of the road only when it is not visible.

Patent Document 1: JP 2000-211142 A
Patent Document 2: JP 2005-170323 A

In the technologies described in Patent Documents 1 and 2, the lane or road shape is always displayed on the windshield whenever the visibility is poor.
However, poor visibility does not necessarily mean that the driver needs a lane display. That is, if the poor visibility does not affect the driver's driving operation, displaying the lane may present excessive information to the driver and may, on the contrary, hinder the driving operation.

Therefore, an object of the present invention is to provide a virtual lane display device capable of displaying a virtual lane in situations that are more appropriate for the driver.

(1) The invention described in claim 1 achieves the above object by providing a virtual lane display device comprising: vehicle exterior information acquisition means for acquiring environmental information outside the vehicle; visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good; biometric information acquisition means for acquiring biometric information of the driver; biometric information determination means for determining whether or not the acquired biometric information exceeds a predetermined threshold; road shape recognition means for recognizing the shape of the road on which the vehicle is currently traveling; and virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the biometric information exceeds the predetermined threshold.
(2) The invention described in claim 2 achieves the above object by providing a virtual lane display device comprising: vehicle exterior information acquisition means for acquiring environmental information outside the vehicle; visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good; driving operation detection means for detecting the driving operation of the driver; abnormal operation determination means for determining whether the detected driving operation corresponds to an abnormal operation defined in advance as a driving operation performed when visibility is not good; road shape recognition means for recognizing the shape of the road on which the vehicle is currently traveling; and virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the driving operation corresponds to the abnormal operation.
(3) In the invention described in claim 3, in the virtual lane display device according to claim 1 or 2, the virtual lane display means displays the virtual lane on the road by scanning a laser beam, or displays the virtual lane on the windshield.
(4) In the invention described in claim 4, the virtual lane display device according to claim 1, claim 2, or claim 3 further includes obstacle detection means for detecting an obstacle existing in front of the vehicle, and the virtual lane display means displays a virtual lane that avoids the detected obstacle.
(5) In the invention described in claim 5, the virtual lane display device according to claim 1, claim 2, or claim 3 further includes front entity detection means for detecting another traveling vehicle or an obstacle existing in front of the vehicle, and warning means for warning of the presence of the detected other traveling vehicle or obstacle.

According to the first aspect of the present invention, the virtual lane is displayed not merely when it is determined that the visibility is not good, but when the biometric information also exceeds a predetermined threshold, so the virtual lane can be displayed in situations that are more appropriate for the driver.
According to the second aspect of the present invention, the virtual lane is displayed not merely when it is determined that the visibility is not good, but when a driving operation corresponding to an abnormal operation, defined in advance as a driving operation performed when visibility is not good, is also detected, so the virtual lane can be displayed in situations that are more appropriate for the driver.

Hereinafter, a preferred embodiment of the virtual lane display device of the present invention will be described in detail with reference to FIGS. 1 to 5.
(1) Outline of Embodiment
In the virtual lane display device of the present embodiment, whether or not visibility is poor is determined by analyzing an image captured by a camera that images the area outside the vehicle. For example, visibility is recognized as not good when it is raining at night, when the view ahead of the vehicle is impaired by backlight, when the lane is difficult to see (including when it is not visible at all), or when the view ahead of the vehicle is impaired by fog.

Further, in order to determine whether or not the poor visibility makes the driving operation difficult for the driver, changes in the driving operation and changes in the driver's biological information are detected.
As a change in driving operation, driving operations typical of poor visibility, such as the accelerator operation becoming extremely weak or the number of brake applications increasing, are defined in advance as abnormal driving operations, and it is determined whether or not the current driving operation corresponds to an abnormal operation.
In addition, at least one piece of autonomic nervous system information, such as heart rate, amount of sweating, pupil state (pupil size), or brain waves, is detected as biometric information and compared with the average value during normal driving, and it is judged whether the value has changed by more than a predetermined amount (for example, the heart rate has changed by 30 or more) or has exceeded a predetermined threshold (for example, the heart rate has exceeded 100).
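As a concrete illustration of this comparison, the following is a minimal sketch of the heart-rate check; the thresholds (a change of 30 and an absolute value of 100) are the example values given above, while the function and parameter names are hypothetical.

```python
# Sketch of the heart-rate check described above.
# The thresholds (change of 30 bpm, absolute 100 bpm) are the example values
# from the text; the function and parameter names are illustrative assumptions.

def heart_rate_abnormal(current_bpm, normal_average_bpm,
                        change_threshold=30, absolute_threshold=100):
    """Return True if the heart rate deviates from the normal-driving average
    by the change threshold or more, or exceeds the absolute threshold."""
    changed_too_much = abs(current_bpm - normal_average_bpm) >= change_threshold
    exceeds_limit = current_bpm >= absolute_threshold
    return changed_too_much or exceeds_limit

# Example: average of 72 bpm during normal driving, current reading of 105 bpm.
print(heart_rate_abnormal(105, 72))  # True (both conditions are met)
```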

When, in a situation where the visibility is not good, the driving operation corresponds to a predefined abnormal operation and the biological information exceeds a predetermined threshold, the poor visibility is judged to be affecting the driver, and in such a case the virtual lane is displayed.
To display the virtual lane, the current position of the vehicle is identified, the shape of the road currently being traveled is recognized using the road map database, and a virtual lane matching the recognized road shape is displayed on the windshield of the vehicle by a head-up display.
Note that the virtual lane may instead be displayed directly on the road by scanning a laser beam. In this case, the irradiation range (virtual lane display range) is limited to a range that does not irradiate the preceding vehicle.

If the road shape is curved, the virtual lane is displayed according to the curve.
When an obstacle is detected in front of the vehicle by the millimeter wave radar, a virtual lane is displayed so as to avoid the obstacle.
Furthermore, when there is a vehicle traveling ahead, an image indicating the presence of that vehicle may be displayed.
Further, when there is an obstacle or a vehicle traveling ahead, a warning to that effect is given.

(2) Details of Embodiment
Hereinafter, a preferred embodiment of the virtual lane display device of the present invention will be described in detail with reference to FIGS. 1 to 5.
FIG. 1 shows a configuration of a virtual lane display device in the present embodiment.
As shown in FIG. 1, the virtual lane display device includes an ECU (electronic control device) 10 that controls the entire virtual lane display device according to various programs and data. Connected to the ECU 10 are a current position detection device 11, a biological information sensor 12, a vehicle exterior environment acquisition unit 13, a driving operation detection unit 14, a display device 15, a data storage unit 16, a program storage unit 17, an audio output unit 18, and other devices (such as a communication control device for connecting to an external information center or the Internet as a communication means).

The current position detection device 11 detects the current position (absolute coordinates consisting of latitude and longitude) of the vehicle on which the virtual lane display device is mounted, using one or more of a GPS (Global Positioning System) receiver 111 that measures the position of the vehicle using artificial satellites, a geomagnetic sensor 112 that detects geomagnetism to determine the heading of the vehicle, a gyro sensor 113, a vehicle speed sensor 114, and the like.

The biological information sensor 12 includes a heart rate sensor 121, a sweat sensor 122, and a blood pressure sensor 123 as sensors for acquiring the driver's biological information.
When the vehicle starts running, the heart rate and the amount of sweat are detected at predetermined time intervals and supplied to the ECU 10.

The heart rate sensor 121 is a sensor that detects the heart rate of the driver from the driver's pulse. The heart rate sensor 121 of the present embodiment detects the heart rate by picking up a heartbeat signal from the hands of the driver through electrodes disposed on the steering wheel. Alternatively, the heart rate sensor 121 may be a dedicated sensor attached to the driver's body, such as on a wrist.

The perspiration sensor 122 is disposed on the steering wheel and detects the driver's perspiration state from changes in the current that flows depending on the degree of perspiration.

The blood pressure sensor 123 is a sensor that detects the blood pressure of the driver.
In the present embodiment, the blood pressure sensor 123 measures blood pressure using, for example, the correlation in the human body between blood pressure and the pulse wave transit time (PWTT), that is, the time it takes the blood pulse wave accompanying a heart contraction to travel from the heart to the fingertip.
The blood pressure sensor 123 includes an electrode sensor that detects the electrical potential change occurring when the heart beats in order to capture the timing of the heart contraction, and an infrared sensor that detects the change in blood flow at the fingertip in order to capture the timing (pulse) at which the pulse wave reaches the fingertip; blood pressure is calculated from the pulse wave transit time detected by these sensors.
Alternatively, as described in JP 2000-107141 A, pulse sensors that measure the pulse at positions differing in distance from the heart may be used for both sensor units.
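As an illustration of the PWTT approach described above, the following sketch converts a measured transit time into an estimated systolic blood pressure. The linear model and its coefficients are assumptions for illustration only; a real sensor would calibrate them per driver against a reference measurement.

```python
# Sketch of blood-pressure estimation from pulse wave transit time (PWTT).
# The linear model BP ~ a * PWTT + b and the coefficient values are
# illustrative assumptions and would need per-driver calibration in practice.

def pwtt_ms(heart_contraction_time_s, fingertip_pulse_time_s):
    """PWTT is the delay between the heart contraction (captured by the
    electrode sensor) and the pulse arriving at the fingertip (captured by
    the infrared sensor), converted here to milliseconds."""
    return (fingertip_pulse_time_s - heart_contraction_time_s) * 1000.0

def estimate_systolic_bp(pwtt_value_ms, a=-0.4, b=170.0):
    """Estimate systolic blood pressure [mmHg]: a shorter transit time
    corresponds to a higher blood pressure (negative slope a)."""
    return a * pwtt_value_ms + b

transit = pwtt_ms(0.000, 0.180)                 # 180 ms transit time
print(round(estimate_systolic_bp(transit), 1))  # ~98.0 mmHg with these coefficients
```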

  The vehicle exterior environment acquisition unit 13 includes a camera 131, a millimeter wave radar 132, a wiper sensor 133, and a headlamp sensor 134.

The camera 131 is disposed in front of the vehicle and images the front outside the vehicle. An image captured by the camera 131 is subjected to image recognition processing to determine whether or not the driver's field of view is in a poor state.
For example, when a lane (white line or yellow line) cannot be recognized from the captured image even though, according to the road map database described later, a lane exists on the road currently being traveled, it is determined that the driver's field of view is poor.
Further, when the vehicle detected by the millimeter wave radar 132 cannot be recognized from the captured image, it is determined that the field of view is poor due to fog, rain, and the like.
In the present embodiment, the camera 131 is a CCD camera.

The image captured by the camera 131 is also used to detect the brightness outside the vehicle, but instead of or in addition to the captured image, various sensors that detect brightness (such as photodiodes) may be used.

The millimeter wave radar 132 scans the area in front of the vehicle with millimeter waves to detect a vehicle, an obstacle, or the like existing ahead and, if one is present, detects the distance to the preceding vehicle or obstacle. It is a radio radar using radio waves in the millimeter wave band, and a millimeter wave FMCW radar device is used as the millimeter wave radar of this embodiment.
The FMCW radar apparatus transmits a continuous wave FM-modulated by a triangular modulating wave toward an obstacle such as an automobile located ahead, receives the wave reflected back by the obstacle as a reception wave, and calculates the distance to the obstacle and the relative velocity by processing the beat signal, obtained by mixing the transmission wave with the reception wave at that moment, with a frequency analysis method such as the FFT.
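For reference, the following sketch shows the standard distance and relative-velocity calculation for a triangular-sweep FMCW radar from the up-sweep and down-sweep beat frequencies. The radar parameters (77 GHz carrier, sweep bandwidth, sweep time) and the sign convention are typical assumptions, not values taken from this patent.

```python
# Sketch of the distance / relative-velocity calculation of a triangular FMCW
# radar. Parameter values are typical assumptions rather than patent values.

C = 3.0e8  # speed of light [m/s]

def fmcw_range_and_velocity(f_beat_up, f_beat_down,
                            sweep_bandwidth=200e6,  # sweep bandwidth B [Hz]
                            sweep_time=1e-3,        # duration of one up-sweep T [s]
                            carrier_freq=77e9):     # carrier frequency f_c [Hz]
    """Separate the range and Doppler components from the beat frequencies of
    the up- and down-sweep (obtained by FFT of the mixed transmit/receive
    signal) and convert them to distance and relative velocity."""
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-proportional component
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler component (assumed sign convention)
    distance = C * sweep_time * f_range / (2.0 * sweep_bandwidth)
    relative_velocity = C * f_doppler / (2.0 * carrier_freq)
    return distance, relative_velocity

# Example: beat frequencies of 66 kHz (up-sweep) and 68 kHz (down-sweep).
d, v = fmcw_range_and_velocity(66e3, 68e3)
print(round(d, 1), "m,", round(v, 2), "m/s")  # about 50.2 m, 1.95 m/s
```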

In this embodiment, the millimeter wave radar 132 is used to detect vehicles and obstacles in front of the vehicle, but a laser radar such as an infrared laser radar may be used instead of or in addition to it.

The wiper sensor 133 and the headlamp sensor 134 detect a state where the wiper and the headlamp are turned on, respectively.
In this embodiment, when both the wiper sensor 133 and the headlamp sensor 134 detect the on state, it is determined that the field of view is poor because it is raining at night.

The driving operation detection unit 14 includes an accelerator sensor 141, a brake sensor 142, and a handle sensor 143.
The accelerator sensor 141 detects the speed of stepping on the accelerator, the stepping force, the number of times of stepping, and the like.
The brake sensor 142 detects the speed at which the brake is depressed, the depression force, the number of depressions, and the like.
The handle sensor 143 detects the speed of the steering wheel operation and the pressure with which the steering wheel is gripped. In addition to the handle sensor 143, the steering wheel is also provided with the sweat sensor 122 as one of the biological information sensors described above.

  Although not explicitly shown as the driving operation detection unit 14, the vehicle control device includes an average speed detection unit, and the average vehicle speed is detected from the vehicle speed detected by the vehicle speed sensor 114.

The display device 15 is a device that displays a virtual lane in the present embodiment.
The display device 15 in the present embodiment is configured by a head-up display, and projects and displays an image of a virtual lane supplied from the ECU 10 on the windshield of the vehicle.
Besides the head-up display, the display device 15 may also be configured as a light irradiation device that draws the virtual lane as lines on the road surface with irradiation light such as laser light of a predetermined color. In this case, when a vehicle or a person is detected ahead by the vehicle exterior environment acquisition unit 13, the detected vehicle or person is excluded from the irradiation range.

In addition to ROM and RAM, the data storage unit 16 and the program storage unit 17 may use magnetic recording media such as flexible disks, hard disks, and magnetic tape; semiconductor recording media such as memory chips and IC cards; optically readable recording media such as CD-ROMs, MOs, and PDs (phase-change rewritable optical disks); and other recording media on which data and computer programs are recorded by various methods.
Different recording media may be used depending on the recording contents.

The data storage unit 16 stores various data used in the present embodiment, such as a road map database 161, vehicle exterior environment determination data 162, driving operation determination data 163, biological information determination data 164, history data 165, and vehicle data 166.

The road map database 161 uses the data employed by the navigation function, and is a database storing various map and road-related data, such as map information for displaying maps of the area around the current position of the vehicle and around the destination on the display device, road information used for route searches to the destination, and facility information (POI information) storing information on individual facilities.

In the present embodiment, the road map database 161 is used to detect a position on the currently traveling road by map matching between the current position of the vehicle detected by the current position detecting means 11 and road data.
The detected position on the road is used to identify the shape of the road being traveled from the road data and to determine the shape of the virtual lane (virtual lane image) in accordance with the identified road shape and the obstacles ahead of the vehicle.

The vehicle exterior environment determination data 162 is data that defines a case where the vehicle exterior environment cannot be recognized, that is, a case where the driver's view is poor.
The driving operation determination data 163 is data that defines the case where the driving behavior (driving operation) is changed due to the poor visibility.
The biometric information determination data 164 is data that defines a case where it is determined that there is an abnormality in the biometric information due to poor visibility and a change in driving operation.

FIG. 2 exemplifies the classification of states detected by the vehicle exterior environment acquisition unit 13, the driving operation detection unit 14, and the biological information sensor 12.
Based on these detected states and the determination criteria defined in the vehicle exterior environment determination data 162, the driving operation determination data 163, and the biological information determination data 164, the ECU 10 makes the determinations on the vehicle exterior environment, the driving operation, and the biological information, respectively.

FIG. 2A illustrates state classification of the environment outside the vehicle.
The “lane” in the environment outside the vehicle is divided into “none” (cannot be recognized) and “present” (can be recognized) of the lane, and recognition processing is performed from an image captured by the camera 131.
“Weather” is classified into “rain”, “cloudy”, “snow”, “sunny”, and “mist”. This “weather” is determined from the recognition processing of the captured image and the detection result of the wiper sensor 133. When the wiper sensor 133 detects ON, it is determined as rain, snow, or fog, and when the wiper sensor 133 detects OFF, it is determined as cloudy or sunny.
Then, rain, snow, fog, cloudy, or sunny is determined from the captured image of the camera 131 by image processing.

“Brightness” is divided into “dark” and “bright”. “Brightness” is determined by image processing from an image captured by the camera 131. Note that “dark” and “bright” may be determined using an illuminance sensor.
The “obstacle” is classified into “many”, “small”, and “none”. The “obstacle” is detected by the millimeter wave radar 132.
“Nearby vehicles” are classified into “many”, “small”, and “none”. The “peripheral vehicle” is detected by the millimeter wave radar 132 and image processing of the captured image.

The vehicle exterior environment determination data 162, which defines when the vehicle exterior environment cannot be recognized, specifies the following cases among the vehicle exterior environment state classifications shown in FIG. 2A (a code sketch of this check follows the list).
(1) When an obstacle is detected by the millimeter wave radar 132 but cannot be recognized by image processing of the captured image.
(2) When the lane recognized by image processing of the captured image extends no more than 5 m from the host vehicle, that is, when the lane more than 5 m ahead of the vehicle cannot be recognized.
(3) When it is raining and the surroundings of the vehicle are dark.
(4) When it is snowing.
(5) When passing an oncoming vehicle on a narrow road with many obstacles.
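A minimal sketch of how cases (1) to (5) above could be evaluated is shown below; the state container, its field names, and the boolean encoding of each condition are illustrative assumptions, while the 5 m lane-visibility threshold comes from case (2).

```python
# Sketch of the poor-visibility check corresponding to cases (1)-(5) above.
# The state container and field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ExteriorState:
    radar_detects_obstacle: bool       # millimeter wave radar 132
    camera_recognizes_obstacle: bool   # image processing of camera 131
    lane_visible_distance_m: float     # how far ahead the lane is recognizable
    raining: bool
    dark: bool
    snowing: bool
    narrow_road_with_obstacles: bool
    oncoming_vehicle: bool

def visibility_poor(s: ExteriorState) -> bool:
    return (
        (s.radar_detects_obstacle and not s.camera_recognizes_obstacle)  # case (1)
        or s.lane_visible_distance_m <= 5.0                              # case (2)
        or (s.raining and s.dark)                                        # case (3)
        or s.snowing                                                     # case (4)
        or (s.narrow_road_with_obstacles and s.oncoming_vehicle)         # case (5)
    )

# Example: rainy night with the lane visible only 4 m ahead.
state = ExteriorState(False, False, 4.0, True, True, False, False, False)
print(visibility_poor(state))  # True
```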

FIG. 2B illustrates the state classification of the driving operation.
“Acceleration operation” of driving operation is classified into “strong”, “medium”, and “weak”. The “accelerator operation” is detected by the accelerator sensor 141.
The “brake count” is classified into “many”, “medium”, and “small”. The “brake count” is detected by the brake sensor 142.
The “handle” is classified into “strong”, “medium”, and “weak”. The “handle” is detected by the handle sensor 143.
“Behavior” is classified into “present” and “none”, and is detected as “handle operation amount” by the handle sensor 143.

The driving operation determination data 163, which defines when the driving behavior is determined to have changed, specifies the following cases among the driving operation state classifications shown in FIG. 2B (a sketch of this check follows the list).
(1) When the accelerator operation becomes extremely weak.
(2) When the number of brake applications increases.
(3) When the number of sudden (strong) steering operations increases.
(4) When the amount of steering operation increases.
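The following sketch shows one way cases (1) to (4) above might be checked against a baseline taken from the recent driving history; the baseline comparison and the concrete ratio thresholds are assumptions for illustration, since the text only names the cases themselves.

```python
# Sketch of the abnormal driving operation check for cases (1)-(4) above.
# The baseline comparison and the ratio thresholds are illustrative assumptions.

def driving_operation_abnormal(current, baseline,
                               weak_accel_ratio=0.5, increase_ratio=1.5):
    """current / baseline are dicts with keys 'accel_force', 'brake_count',
    'sudden_steering_count', and 'steering_amount'."""
    weak_accel = current["accel_force"] < weak_accel_ratio * baseline["accel_force"]           # case (1)
    more_braking = current["brake_count"] > increase_ratio * baseline["brake_count"]           # case (2)
    more_sudden = current["sudden_steering_count"] > baseline["sudden_steering_count"]         # case (3)
    more_steering = current["steering_amount"] > increase_ratio * baseline["steering_amount"]  # case (4)
    return weak_accel or more_braking or more_sudden or more_steering

# Example: braking twice as often as usual while accelerating normally.
now = {"accel_force": 0.8, "brake_count": 8, "sudden_steering_count": 1, "steering_amount": 10}
usual = {"accel_force": 0.9, "brake_count": 4, "sudden_steering_count": 1, "steering_amount": 10}
print(driving_operation_abnormal(now, usual))  # True (case (2))
```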

FIG. 2C illustrates the state classification of the biological information.
The “heart rate” of the biological information is classified into “high”, “medium”, and “low”. “Heartbeat” is detected by the heartbeat sensor 121.
“Sweating” is classified into “many”, “small”, and “none”. “Sweating” is detected by the sweat sensor 122.
“Blood pressure” is classified into “high”, “medium”, and “low”. “Blood pressure” is detected by the blood pressure sensor 123.

The biometric information determination data 164, which defines when the biometric information is determined to be abnormal, specifies the following cases among the biometric information classifications shown in FIG. 2C; that is, it defines the cases in which the heart rate, the amount of sweating, and so on are judged to indicate a sympathetic-nervous-system-dominant state.
(1) When the heart rate increases.
(2) When the heart rate decreases suddenly.
(3) When the amount of sweating increases.
(4) When the blood pressure increases rapidly.
(5) When the blood pressure decreases rapidly.

In FIG. 1, the history data 165 of the data storage unit 16 stores various driving operations of the driver detected by the driving operation detection unit 14 and the driver's biological information detected by the biological information sensor 12.
The history data 165 is used by the virtual lane display program 171 to determine whether or not the driving operation has changed and whether or not the biological information is abnormal, and is stored in a predetermined area of the RAM.

The history data 165 stores output signals from the accelerator sensor 141, the brake sensor 142, and the handle sensor 143.
The vehicle data 166 stores data such as the vehicle dimensions and the position (height) of the camera 131 for the vehicle on which the virtual lane display device is mounted.

  The program storage unit 17 stores a virtual lane display program 171, a driving operation collection program 172, a biological information collection program 173, and other programs 174.

The virtual lane display program 171 is a program that performs various functions such as vehicle exterior environment detection, visibility determination, driving operation detection, driving behavior change determination, biological information detection, biological information checking, vehicle position detection, other-vehicle detection, road shape calculation, and virtual lane display, and thereby displays a virtual lane when it is more necessary for the driver.

The driving operation collection program 172 is a program that acquires various driving operations detected by the driving operation detector 14 at predetermined time intervals and stores them in the history data 165.
The biometric information collection program 173 is a program that acquires biometric information (heart rate, sweating state, blood pressure) detected by the biometric information sensor 12 at predetermined time intervals and stores it in the history data 165.
The driving operation data and the biological information stored in the history data 165 are kept for a predetermined time, for example 2 minutes, with the oldest information being deleted as the latest information is stored. The information may instead be kept for a predetermined number of samples rather than for a predetermined time.
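The fixed-length history described above behaves like a rolling buffer; a minimal sketch under assumed names and a one-second sampling interval is given below.

```python
# Sketch of the rolling history: samples are kept for a fixed window
# (e.g. 2 minutes) and the oldest entry is dropped when a new one arrives.
# Class name and sampling interval are illustrative assumptions.

from collections import deque

class HistoryBuffer:
    def __init__(self, window_s=120, sample_interval_s=1.0):
        # capacity = number of samples that fit into the time window
        self.samples = deque(maxlen=int(window_s / sample_interval_s))

    def store(self, sample):
        # appending to a full deque automatically discards the oldest sample
        self.samples.append(sample)

history = HistoryBuffer()
for t in range(130):                      # simulate 130 one-second samples
    history.store({"heart_rate": 70 + t % 5, "sweat": 0.1})
print(len(history.samples))               # 120: only the last 2 minutes are kept
```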

  The driving operation collection program 172 and the biological information collection program 173 are programs that are always executed while the vehicle is running, and are executed independently of other programs such as the virtual lane display program 171.

The audio output device 18 includes a plurality of speakers arranged in the vehicle and outputs voice-controlled audio; for example, in the virtual lane display processing of the present embodiment, it outputs a voice warning that an obstacle is present ahead.
The audio output device 18 may also double as the speakers of the car audio system.

Next, virtual lane display processing in the virtual lane display device configured as described above will be described with reference to the flowchart of FIG.
This virtual lane display process is executed when the vehicle starts to travel or when the ignition is turned on.

First, the ECU 10 detects the environment outside the vehicle based on the detection result by the vehicle environment acquisition unit 13 (step 11).
That is, the ECU 10 recognizes and determines the environment outside the vehicle (lane, weather, brightness, obstacles, surrounding vehicles; see FIG. 2A) based on the images and detection results from the camera 131, the millimeter wave radar 132, the wiper sensor 133, and the headlamp sensor 134.

Then, the ECU 10 determines whether or not the recognized and detected outside environment satisfies the outside environment determination data 162 that defines the case where the visibility is bad, that is, whether or not the visibility is bad (step 12).
If it is determined that the field of view is not bad (step 12; N), the display of the virtual lane is unnecessary, so the process returns to step 11 to continue the detection of the environment outside the vehicle and the determination of the field of view.

On the other hand, if it is determined that the visibility is poor (step 12; Y), the ECU 10 determines, from the driving operation history detected by the driving operation detection unit 14 and stored in the history data 165 by the driving operation collection program 172, whether or not the driving operation, such as the accelerator operation, the number of brake applications, the steering wheel operation, and the behavior (see FIG. 2B), has changed (steps 13 and 14). That is, the ECU 10 determines, based on the driving operation determination data 163, whether or not the accelerator operation, the average vehicle speed, or the like has changed because the visibility has deteriorated.

If there is no change in the driving behavior (step 14; N), the ECU 10 determines that the display of the virtual lane is unnecessary for the driver, because the driving operation is not affected even though the visibility has deteriorated, and returns to step 11 to continue processing.

On the other hand, when there is a change in the driving behavior (step 14; Y), the ECU 10 determines, from the biological information history detected by the biological information sensor 12 and stored in the history data 165 by the biological information collection program 173, whether or not the biological information such as heart rate, sweating, and blood pressure (see FIG. 2C) is abnormal (steps 15 and 16).
That is, the ECU 10 determines, based on the biological information determination data 164, whether or not the biological information is abnormal as a result of the poor visibility and the change in driving operation.
If there is no abnormality in the biological information (step 16; N), the driver is not mentally uneasy even though the visibility has deteriorated and the driving operation has changed, so the display of the virtual lane is unnecessary, and the ECU 10 returns to step 11 to continue processing.

If the biological information is abnormal (step 16; Y), that is, if the biological information has become abnormal because of the poor visibility and the change in driving operation, the ECU 10 displays a virtual lane (step 17).

That is, the ECU 10 first identifies, from the map data, the shape of the road on which the virtual lane is to be displayed. In this case, the ECU 10 specifies the road corresponding to the current position of the vehicle detected by the current position detection device 11 by map matching, and acquires the shape of the road currently being traveled from the map data.

Next, the ECU 10 corrects the road shape from the image recognition result of the captured image in front of the vehicle by the camera 131 and the detection data of the millimeter wave radar 132.
Furthermore, obstacles (including a stopped vehicle) existing in front of the vehicle and other vehicles that are running are recognized from the captured image and the detection value of the millimeter wave radar.
Then, the ECU 10 generates a virtual lane image of the virtual lane corresponding to the corrected road shape and obstacle, and displays the virtual lane on the windshield by the display device 15.
Here, as the virtual lane image, two virtual lane lines are generated at a width equal to the vehicle width stored in the vehicle data 166 plus α (for example, 50 cm on each side), and when an obstacle exists, a virtual lane image shaped so as to avoid the obstacle is generated.

If the currently running road is a curve, a curve curvature is calculated and a virtual lane image is generated.
The curve curvature may be acquired from road data.
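The following sketch illustrates how the two virtual lane boundaries could be derived from a recognized road centerline, offsetting each side by half the vehicle width plus the margin α mentioned above; the centerline representation as a list of (x, y) points and the helper name are assumptions for illustration.

```python
# Sketch of generating the two virtual lane boundary lines from the recognized
# road centerline (works for straight roads and curves alike). The centerline
# representation and the function name are illustrative assumptions; the
# width rule (vehicle width + e.g. 50 cm on each side) comes from the text.

import math

def virtual_lane_boundaries(centerline_xy, vehicle_width_m=1.8, margin_m=0.5):
    """Offset the centerline perpendicularly to the left and right by half the
    vehicle width plus the margin, producing the two virtual lane lines."""
    half = vehicle_width_m / 2.0 + margin_m
    left, right = [], []
    for i, (x, y) in enumerate(centerline_xy):
        # estimate the local heading from the neighbouring centerline points
        x_next, y_next = centerline_xy[min(i + 1, len(centerline_xy) - 1)]
        x_prev, y_prev = centerline_xy[max(i - 1, 0)]
        heading = math.atan2(y_next - y_prev, x_next - x_prev)
        nx, ny = -math.sin(heading), math.cos(heading)  # left-pointing unit normal
        left.append((x + half * nx, y + half * ny))
        right.append((x - half * nx, y - half * ny))
    return left, right

# Example: a gentle curve approximated by a few centerline points [m].
centerline = [(0, 0), (5, 0.1), (10, 0.4), (15, 0.9), (20, 1.6)]
left_line, right_line = virtual_lane_boundaries(centerline)
print(left_line[0], right_line[0])  # boundary points about 1.4 m left/right of the start
```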

Next, the ECU 10 determines whether or not the biological information after displaying the virtual lane is still in an abnormal state (steps 18 and 19).
The processing in steps 18 and 19 is the same as that in steps 15 and 16.

  If the abnormality of the biological information continues (step 19; Y), the ECU 10 returns to step 17 and continues the display of the virtual lane and the monitoring of the abnormal biological information.

On the other hand, when the abnormal state of the biological information returns to normal (step 19; N), the ECU 10 erases the displayed virtual lane (ends the display) (step 20), returns to step 11, and continues detecting the vehicle exterior environment and determining the visibility.
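Putting the flowchart of FIG. 3 together, the overall processing can be summarized by the sketch below; the ECU helper methods correspond to the determinations described above, and their names and the simplified loop structure are assumptions for illustration rather than the actual implementation.

```python
# Sketch of the overall virtual lane display processing (steps 11-20).
# The ecu object and its method names are illustrative assumptions standing in
# for the determinations described in the text.

def virtual_lane_display_loop(ecu):
    while ecu.vehicle_running():
        environment = ecu.detect_exterior_environment()   # step 11
        if not ecu.visibility_poor(environment):          # step 12
            continue                                      # visibility is fine: keep monitoring
        if not ecu.driving_operation_changed():           # steps 13-14
            continue                                      # driving not affected: no display needed
        if not ecu.biological_info_abnormal():            # steps 15-16
            continue                                      # driver not uneasy: no display needed
        ecu.display_virtual_lane()                        # step 17
        while ecu.biological_info_abnormal():             # steps 18-19
            ecu.display_virtual_lane()                    # keep the virtual lane displayed
        ecu.erase_virtual_lane()                          # step 20: biological info back to normal
```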

FIGS. 4 and 5 show display states of the virtual lane according to the present embodiment.
FIG. 4A shows 3D image data created from the image captured by the camera 131 and the detection values of the millimeter wave radar 132. In the case of this image, the lane can be recognized only less than 5 m ahead of the vehicle, so it is determined that the field of view is poor.
On the other hand, the road shape is recognized from the map data as shown in FIG. 4B.
Then, as shown in FIG. 4C, the ECU 10 displays the virtual lane 151 on the windshield at the width of the vehicle width + α. FIG. 4C is a diagram representing the driver's field of view with the virtual lane 151 displayed on the windshield.

As shown in FIG. 4C, the virtual lane 151 is displayed to match the shape of the road as it appears to the driver, so even on a straight road the two lines are not drawn parallel; the road width is displayed narrower in the distance.

FIG. 5 shows the virtual lane display state when an obstacle (a stopped vehicle) is present in front of the vehicle (a), and on a narrow curve with no lane markings (b).
Unlike FIG. 4, which shows the scene as visible to the driver, FIGS. 5(a) and 5(b) show the situation from above in order to explain the shape of the virtual lane. Therefore, although objects other than the virtual lane are drawn clearly, the actual field of view is poor when the virtual lane is displayed.

As shown in FIG. 5A, when an obstacle 152 (such as a vehicle parked on the road) is detected in front of the host vehicle A, a path that avoids the detected obstacle 152 is displayed as the virtual lane.
Then, as shown in FIG. 5A, when there are vehicles 153 and 154 traveling in the oncoming lane and the vehicle cannot travel along the displayed virtual lane while avoiding the obstacle, the virtual lane is displayed faintly or is not displayed, and a warning is given by voice.
As the warning by voice, for example, a warning such as "There is a possibility of coming into contact with an oncoming vehicle when passing by avoiding the obstacle present in the left front" is given.

In addition, the turn signal may be operated automatically when crossing the lane (traveling in the oncoming lane), regardless of the presence or absence of an oncoming vehicle.

On the other hand, as shown in FIG. 5B, when the vehicle travels on a narrow curve with no lane markings, a virtual lane matching the road shape recognized on the basis of the curve curvature, as described above, is displayed, and if there is an oncoming vehicle, a warning to that effect may be given.

The embodiment of the virtual lane display device of the present invention has been described above; however, the present invention is not limited to the described embodiment, and various modifications can be made within the scope described in each claim.
For example, in the embodiment described above, the pulse rate and sweating information are detected as the biological information, but other autonomic nervous system information such as the pupil state and brain waves may be detected as further biological information, and the biological information may be determined to be abnormal when the change indicates that the sympathetic nervous system is dominant.
Further, for example, the present invention may be a virtual lane display device (Configuration 1) comprising: vehicle exterior information acquisition means for acquiring environmental information outside the vehicle; visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good; biometric information acquisition means for acquiring biometric information of the driver; biometric information determination means for determining whether or not the acquired biometric information exceeds a predetermined threshold; road shape recognition means for recognizing the shape of the road on which the vehicle is currently traveling; and virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the biometric information exceeds the predetermined threshold.
The present invention may also be a virtual lane display device (Configuration 2) comprising: vehicle exterior information acquisition means for acquiring environmental information outside the vehicle; visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good; driving operation detection means for detecting the driving operation of the driver; abnormal operation determination means for determining whether or not the detected driving operation corresponds to an abnormal operation defined in advance as a driving operation performed when visibility is not good; road shape recognition means for recognizing the shape of the road on which the vehicle is currently traveling; and virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the driving operation corresponds to the abnormal operation.
The virtual lane display device according to Configuration 1 or Configuration 2 may be one in which the virtual lane display means displays the virtual lane on the road by scanning a laser beam, or displays the virtual lane on the windshield (Configuration 3).
The virtual lane display device according to Configuration 1, Configuration 2, or Configuration 3 may further include obstacle detection means for detecting an obstacle existing in front of the vehicle, with the virtual lane display means displaying a virtual lane that avoids the detected obstacle (Configuration 4).
The virtual lane display device according to Configuration 1, Configuration 2, or Configuration 3 may further include front entity detection means for detecting another traveling vehicle or an obstacle existing in front of the vehicle, and warning means for warning of the presence of the detected other traveling vehicle or obstacle (Configuration 5).

In the described embodiment, after the virtual lane is displayed (step 17), the displayed virtual lane is erased (step 20) when the biological information returns to normal (step 19; N). However, there may be cases where the abnormality of the biological information is resolved only because the driver is reassured by the displayed virtual lane while the visibility is still not good.
Therefore, the virtual lane may be erased (step 20) only when both conditions are satisfied: the biological abnormality has been eliminated (has returned to normal) and the change in driving behavior has been eliminated.
Further, instead of the processing in steps 18 and 19, it may be determined whether or not the visibility has improved (the processing of steps 11 and 12), and the virtual lane may be erased (step 20) on the condition that the visibility has improved.

FIG. 1 is a configuration diagram of the virtual lane display device in one embodiment of the present invention.
FIG. 2 is an explanatory diagram illustrating the classification of the states detected by the vehicle exterior environment acquisition unit, the driving operation detection unit, and the biological information sensor.
FIG. 3 is a flowchart showing the content of the virtual lane display processing.
FIG. 4 is an explanatory diagram showing a display state of the virtual lane according to the present embodiment.
FIG. 5 is an explanatory diagram showing another display state of the virtual lane according to the present embodiment.

Explanation of symbols

10 ECU
11 Current position detection device
12 Biological information sensor
121 Heart rate sensor
122 Sweating sensor
123 Blood pressure sensor
13 Vehicle exterior environment acquisition unit
131 Camera
132 Millimeter wave radar
133 Wiper sensor
134 Headlamp sensor
14 Driving operation detection unit
141 Accelerator sensor
142 Brake sensor
143 Handle sensor
15 Display device
16 Data storage unit
17 Program storage unit
18 Audio output device

Claims (5)

1. A virtual lane display device comprising:
vehicle exterior information acquisition means for acquiring environmental information outside the vehicle;
visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good;
biometric information acquisition means for acquiring biometric information of the driver;
biometric information determination means for determining whether or not the acquired biometric information exceeds a predetermined threshold;
road shape recognition means for recognizing the road shape on which the vehicle is currently traveling; and
virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the biometric information exceeds the predetermined threshold.
2. A virtual lane display device comprising:
vehicle exterior information acquisition means for acquiring environmental information outside the vehicle;
visibility determination means for determining, from the acquired environmental information, whether the visibility ahead of the vehicle is good;
driving operation detection means for detecting the driving operation of the driver;
abnormal operation determination means for determining whether or not the detected driving operation corresponds to an abnormal operation defined in advance as a driving operation performed when visibility is not good;
road shape recognition means for recognizing the road shape on which the vehicle is currently traveling; and
virtual lane display means for displaying a virtual lane in accordance with the recognized road shape when it is determined that the visibility is not good and that the driving operation corresponds to the abnormal operation.
3. The virtual lane display device according to claim 1 or 2, wherein the virtual lane display means displays the virtual lane on the road by scanning a laser beam, or displays the virtual lane on the windshield.
4. The virtual lane display device according to claim 1, 2, or 3, further comprising obstacle detection means for detecting an obstacle present in front of the vehicle, wherein the virtual lane display means displays a virtual lane that avoids the detected obstacle.
5. The virtual lane display device according to claim 1, 2, or 3, further comprising: front entity detection means for detecting another traveling vehicle or an obstacle existing in front of the vehicle; and warning means for warning of the presence of the detected other traveling vehicle or obstacle.
JP2005316478A 2005-10-31 2005-10-31 Virtual lane display device Active JP4888761B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2005316478A JP4888761B2 (en) 2005-10-31 2005-10-31 Virtual lane display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2005316478A JP4888761B2 (en) 2005-10-31 2005-10-31 Virtual lane display device

Publications (2)

Publication Number Publication Date
JP2007122578A JP2007122578A (en) 2007-05-17
JP4888761B2 true JP4888761B2 (en) 2012-02-29

Family

ID=38146332

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2005316478A Active JP4888761B2 (en) 2005-10-31 2005-10-31 Virtual lane display device

Country Status (1)

Country Link
JP (1) JP4888761B2 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009023471A (en) * 2007-07-19 2009-02-05 Clarion Co Ltd Vehicle information display device
JP5016503B2 (en) * 2008-01-18 2012-09-05 本田技研工業株式会社 Vehicle contact avoidance support device
EP2351668A4 (en) * 2008-09-12 2013-03-13 Toshiba Kk Image irradiation system and image irradiation method
JP5299026B2 (en) * 2009-03-30 2013-09-25 マツダ株式会社 Vehicle display device
JP5338654B2 (en) * 2009-12-24 2013-11-13 株式会社デンソー Virtual white line setting method, virtual white line setting device, and course changing support device using the same
JP5917840B2 (en) * 2011-06-15 2016-05-18 日産自動車株式会社 Route search apparatus and route search method
JP6105416B2 (en) * 2013-07-17 2017-03-29 矢崎総業株式会社 Display device
JP6395393B2 (en) * 2014-02-13 2018-09-26 株式会社小糸製作所 Vehicle driving support device
KR101714185B1 (en) * 2015-08-05 2017-03-22 엘지전자 주식회사 Driver Assistance Apparatus and Vehicle Having The Same
CN106364403A (en) * 2016-10-14 2017-02-01 深圳市元征科技股份有限公司 Lane recognizing method and mobile terminal
KR20180062503A (en) * 2016-11-30 2018-06-11 현대엠엔소프트 주식회사 Apparatus for controlling autonomous driving and method thereof
KR20180090610A (en) * 2017-02-03 2018-08-13 삼성전자주식회사 Method and apparatus for outputting information about a lane
KR102091017B1 (en) * 2017-12-01 2020-03-19 팅크웨어(주) Electronic device and mehtod of shooting pictures thereof

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05221253A (en) * 1992-02-12 1993-08-31 Toyota Motor Corp Running control device for vehicle
JPH079886A (en) * 1993-06-29 1995-01-13 Hitachi Ltd Drive information device for vehicle
JPH11151230A (en) * 1997-11-19 1999-06-08 Nissan Motor Co Ltd Driver state measuring instrument for vehicle
JP3680243B2 (en) * 1999-01-20 2005-08-10 トヨタ自動車株式会社 Runway shape display device and map database recording medium
JP4051984B2 (en) * 2002-04-03 2008-02-27 日産自動車株式会社 Vehicle information providing device
JP2003317197A (en) * 2002-04-26 2003-11-07 Aisin Aw Co Ltd Alarm system
JP2004034881A (en) * 2002-07-05 2004-02-05 Honda Motor Co Ltd Running controller for vehicle
JP4075743B2 (en) * 2003-09-01 2008-04-16 株式会社デンソー Vehicle travel support device
JP2005170323A (en) * 2003-12-15 2005-06-30 Denso Corp Runway profile displaying device
JP4305318B2 (en) * 2003-12-17 2009-07-29 株式会社デンソー Vehicle information display system
JP4815943B2 (en) * 2005-08-19 2011-11-16 株式会社デンソー Hazardous area information display device

Also Published As

Publication number Publication date
JP2007122578A (en) 2007-05-17


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20080926

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110203

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110204

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110329

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111118

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111201

R150 Certificate of patent or registration of utility model

Ref document number: 4888761

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141222

Year of fee payment: 3

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250