JP2007272350A - Driving support device for vehicle - Google Patents

Driving support device for vehicle

Info

Publication number
JP2007272350A
Authority
JP
Japan
Prior art keywords
vehicle
target
display
driving
predicted
Prior art date
Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number
JP2006094529A
Other languages
Japanese (ja)
Other versions
JP4847178B2 (en)
Inventor
Yasuharu Oyama
Shoichi Sano
Original Assignee
Honda Motor Co Ltd
Priority date (the priority date is an assumption and is not a legal conclusion)
Filing date
Publication date
Application filed by Honda Motor Co Ltd
Priority to JP2006094529A
Publication of JP2007272350A
Application granted
Publication of JP4847178B2
Application status: Expired - Fee Related
Anticipated expiration

Classifications

    • Y — General tagging of new technological developments; general tagging of cross-sectional technologies spanning over several sections of the IPC; technical subjects covered by former USPC cross-reference art collections [XRACs] and digests
    • Y02 — Technologies or applications for mitigation or adaptation against climate change
    • Y02T — Climate change mitigation technologies related to transportation
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/80 — Technologies aiming to reduce greenhouse gasses emissions common to all road transportation technologies
    • Y02T 10/84 — Data processing systems or methods, management, administration

Abstract

An object of the present invention is to provide a vehicle driving support device that displays a target travel route and the vehicle's actual planned travel course superimposed on the screen of a display device, so that the relationship between the two is easy to recognize, predictive steering input based on the driver's experience becomes unnecessary, and the driver is offered a simple operation method.
A vehicle driving support device 10 includes a head-up display device 14 that displays travel information of a vehicle 1 on a display part 13A; a route guidance setting unit 76 that creates information on a target travel route suitable for the travel of the vehicle; a predicted travel locus calculation unit 73 that creates information on the predicted travel locus of the vehicle based on behavior signals output from a vehicle state detection unit 40; first display control means (74) for displaying the target travel route on the display screen as a target travelable region 111; and second display control means (74) for displaying the predicted travel locus 112 on the display screen.
[Selection] Figure 2

Description

  The present invention relates to a driving support apparatus for a vehicle, and more particularly to a driving support apparatus in which a target travel route and the planned forward travel course are displayed superimposed on a head-up display so that the two pieces of travel information can be recognized easily.

Conventionally, an in-vehicle display device including a head-up display is known (Patent Document 1). In this in-vehicle display device, various information indicating the vehicle running state, such as vehicle speed, acceleration/deceleration, steering angle, and yaw rate, is detected by sensors and processed by a computer to calculate a predicted running locus of the vehicle. The calculated predicted travel locus is projected from a projector onto a screen. A driver looking ahead of the vehicle through the screen and the front window sees both the scenery in front of the vehicle and the predicted traveling locus reflected from the screen into the field of view. In this example, the driver can expect the vehicle to travel along the road by maintaining the current driving state.
Japanese Patent Publication No. 5-81449

  According to the vehicle-mounted display device described in Patent Document 1, the driver can see the predicted traveling locus of the vehicle superimposed on the scenery in front, and can therefore judge whether the driving state of the own vehicle is appropriate for conditions such as a curve in the road. However, the driver must still decide for himself, in consideration of surrounding information, the travel course toward the target travel route and the target travel position on the road; this driving burden is particularly large for a driver with little driving experience.

  In view of the above problems, the object of the present invention is to provide a vehicle driving support device that displays the target travel route and the actual planned travel course superimposed on the display screen of a display device so that the relationship between the two pieces of travel information is easy to recognize, that eliminates the need for predictive steering input based on experience, and that offers the driver a simple operation method.

  The vehicle driving support apparatus according to the present invention is configured as follows in order to achieve the above object.

  A first vehicle driving support device (corresponding to claim 1) includes: a display device that displays vehicle travel information on a display screen; target travel route creating means that creates information on a target travel route suitable for the travel of the vehicle; predicted travel locus creating means that creates information on the predicted travel locus of the vehicle based on a behavior signal output from behavior detection means of the vehicle; first display control means for displaying the target travel route created by the target travel route creating means on the display screen as a target travelable region; and second display control means for displaying the predicted travel locus created by the predicted travel locus creating means on the display screen.

  In the vehicle driving support device described above, the target travelable region (one display mode of the target travel route) and the predicted travel locus ahead of the vehicle (the planned forward travel course) are displayed simultaneously, superimposed on the screen of the display device that displays the travel information. By presenting the two pieces of travel information at once, the driver can easily recognize the travel state of the host vehicle, and the steering burden is reduced.

  In a second vehicle driving support device (corresponding to claim 2), in the above configuration, the display device is preferably a head-up display, and the target travelable region and the predicted travel locus are displayed superimposed on the head-up display.

  A third vehicle driving support device (corresponding to claim 3) is preferably configured so that the target travelable region and the predicted travel locus are displayed in the vicinity of the driver's forward gazing point on the head-up display.

The present invention has the following effects.
First, on the display screen of a head-up display device or the like, the target travel route is displayed as a target travelable region, superimposed together with the predicted travel trajectory on the scenery in front of the vehicle. Besides knowing the target course, the driver need only drive along the predicted traveling locus, which eliminates complicated operations such as changing course and judging the timing of turning, and thus makes driving easier. Furthermore, since the predicted travel locus on the road ahead is taught superimposed on the front scenery in the same way as the target travel course, the deviation from the target travel course at the predicted travel position can be known in advance, and the current driving situation, the need for corrective steering, and the required amount of steering can be grasped easily.
Second, because a driving method becomes possible in which the driver simply matches the predicted travel locus to the target travel route (target travelable region) projected ahead, the predictive steering based on experience that conventional driving requires becomes unnecessary. Even a driver with little driving experience, such as a beginner, can control the vehicle easily, and the driving burden can be greatly reduced.
Third, since the target travel route and the predicted travel locus are displayed superimposed at the position of the forward gazing point that the driver watches during driving, accurate and easy-to-use maneuvering information can be given to the driver.

  DESCRIPTION OF EMBODIMENTS Preferred embodiments (examples) of the present invention will be described below with reference to the accompanying drawings.

  FIG. 1 shows an overall apparatus configuration including a usage mode of a vehicle driving support apparatus according to an embodiment of the present invention, and FIG. 2 shows a block diagram of a control system of the vehicle driving support apparatus.

  As shown in FIG. 1, the vehicle driving support apparatus 10 according to the present embodiment is installed in a vehicle 1 such as an automobile. In FIG. 1, the driver 11 is seated in the driver's seat 12 of the vehicle 1 and is driving while looking ahead of the vehicle 1 through the front window 13. A portion 13A of the front window 13 serves as a display screen. A block 14 indicated by a broken line is a head-up display (HUD) device that displays images related to the necessary travel information on the portion 13A of the front window 13 as its display screen. The head-up display device 14 includes a display controller 15, a video projector 16, reflection mirrors 17 and 18, and a condenser lens 19. An image related to the travel information projected from the video projector 16 is displayed on the portion 13A of the front window 13 via the reflection mirrors 17 and 18 and the condenser lens 19.

  The driver 11, driving by operating the steering wheel 21, looks forward in the traveling direction of the vehicle 1 through the portion 13A of the front window 13. In FIG. 1, reference numeral 22 denotes an imaging position; the image displayed on the portion 13A of the front window 13, that is, the image on the head-up display, appears near the forward gazing point of the driver 11. Reference numeral 23 denotes a virtual imaging position set in front of the vehicle 1. To the driver 11, the image displayed on the portion 13A of the front window 13 appears to overlap, at the virtual imaging position 23, with the scenery ahead in the traveling direction of the vehicle 1.

  The video or image displayed on the portion 13A of the front window 13 includes at least an image representing the target travelable area and an image representing the predicted travel locus (planned travel course), as shown in FIG. 3.

  The vehicle driving support apparatus 10 installed in the vehicle 1 has, as its basic configuration, a navigation system 30 that creates travel route guidance information, a vehicle state detection unit 40 that detects behavior states related to the travel of the vehicle 1, an own vehicle position information detection unit 50, a surrounding state detection unit 60 that detects surrounding conditions and threats to the own vehicle, and a driving support control unit 70 that receives the detection signals from these detection units, performs the necessary information processing, and takes overall control. The vehicle state detection unit 40 includes a vehicle speed sensor 41, a lateral acceleration (lateral G) sensor 42, a longitudinal acceleration (longitudinal G) sensor 43, and the like. The vehicle position information detection unit 50 includes a GPS antenna 51, a beacon antenna 52, a telephone antenna 53, and the like. Further, the surrounding state detection unit 60 includes a CCD camera 61, a laser radar 62, a millimeter wave radar 63, and the like.

  The travel information and the like created by the driving support control unit 70 are provided to the display controller 15 and displayed on the portion 13A of the front window 13 by the head-up display device 14.

  The vehicle driving support device 10 further includes a driver state detection unit 80 and an alarm generation unit 90. The driver state detection unit 80 includes a seat front / rear position sensor 81, a seat load sensor 82, a viewpoint detection sensor 83, and the like, and can provide an appropriate display to the driver 11. A detection signal from the driver state detection unit 80 is input to the driving support control unit 70. The alarm generation unit 90 includes a speaker 91, a meter display 92, and the like, and gives an alarm to the driver 11 in a dangerous situation. The driving support control unit 70 supplies an alarm signal to the alarm generation unit 90.

  The input units (detection units and the like) and the output units of the driving support control unit 70 will be described with reference to FIG. 2, and the internal configuration of the driving support control unit 70 will then be described in detail.

  The driving support control unit 70 includes the navigation system 30, the vehicle state detection unit 40, the own vehicle position information detection unit 50, the surrounding state detection unit 60, and the driver state detection unit 80 as input units. In addition to the vehicle speed sensor 41, the lateral acceleration sensor 42, and the longitudinal acceleration sensor 43, the vehicle state detection unit 40 includes a yaw angular velocity sensor 44, a steering angle sensor 45, a steering torque sensor 46, a brake pedal force sensor 47, an accelerator opening sensor 48, and a direction indicator switch 49. The surrounding state detection unit 60 includes a solar radiation sensor 64 and a temperature sensor 65 in addition to the CCD camera 61, the laser radar 62, and the millimeter wave radar 63 described above. The driver state detection unit 80 includes a display position setting switch 84 and a display brightness setting switch 85 in addition to the seat front/rear position sensor 81, the seat load sensor 82, and the viewpoint detection sensor 83 described above.

  The driving support control unit 70 includes the head-up display device 14 and the alarm generation unit 90 as output units. The alarm generation unit 90 includes a steering 93, a brake 94, and an accelerator 95 in addition to the speaker 91 and the meter display 92 described above.

  The internal configuration of the driving support control unit 70 will be described. Detection signals and the like from the sensors of the vehicle state detection unit 40 are input to the traveling state detection unit 71. A signal from each element of the ambient condition detection unit 60 is input to the ambient condition detection input unit 72.

  Detection signals from the sensors, taken in by the traveling state detection unit 71 from the vehicle state detection unit 40, are further input to the predicted travel locus calculation unit 73. The predicted travel locus calculation unit 73 calculates the future predicted travel position of the vehicle 1 moment by moment based on the values of these detection signals. A signal related to the predicted travel position calculated by the predicted travel locus calculation unit 73 is input to the display position setting unit 74. The future "predicted travel position" of the vehicle 1 is displayed on the display screen in the expression format "predicted travel locus", as described later.
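The patent does not specify the formula used by the predicted travel locus calculation unit 73. As an illustrative sketch only, a simple kinematic model that holds the current speed and yaw rate constant can produce such a sequence of future positions; the function name, time step, and horizon below are assumptions for this sketch, not details from the patent:

```python
import math

def predict_travel_positions(speed, yaw_rate, horizon=3.0, dt=0.5):
    """Predict future (x, y) positions of the vehicle, assuming the
    current speed (m/s) and yaw rate (rad/s) are held constant.
    x is forward, y is lateral (left positive); the vehicle starts
    at the origin heading along +x.
    """
    positions = []
    x = y = heading = 0.0
    t = 0.0
    while t < horizon:
        heading += yaw_rate * dt                 # integrate yaw rate into heading
        x += speed * math.cos(heading) * dt      # advance along current heading
        y += speed * math.sin(heading) * dt
        t += dt
        positions.append((x, y))
    return positions
```

With zero yaw rate the predicted locus is a straight line ahead; a nonzero yaw rate curves it to the corresponding side, which is the behavior the display examples (FIGS. 3 and 4) illustrate.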

  In addition, detection signals from the vehicle state detection unit 40 taken in by the traveling state detection unit 71 are also input directly to the display position setting unit 74.

  Signals from the elements of the surrounding state detection unit 60, taken in by the surrounding state detection input unit 72, are input to the travelable area calculation unit 75 and the display position setting unit 74, respectively. The travelable area calculation unit 75 calculates the travelable area based on information related to the surrounding situation. A signal related to the travelable area calculated by the travelable area calculation unit 75 is input to the route guidance setting unit 76.

  The route guidance setting unit 76 receives, in addition to the signal related to the travelable area created based on the detection signals from the surrounding state detection unit 60, navigation information from the navigation system 30 (information such as the destination entered in advance and route guidance) and detection signals (vehicle position information and the like) from the elements of the vehicle position information detection unit 50. For example, the route guidance setting unit 76 sets a route for guiding the vehicle 1 to the destination as the target travel route based on the navigation information and the vehicle position information, or sets a travelable route as the target travel route based on the signal related to the travelable region created from the detection signals of the surrounding state detection unit 60. That is, the route guidance setting unit 76 sets, for the driver 11 driving the vehicle 1, a target travel route suitable for the travel of the vehicle 1 on the road. A signal related to the information on the target travel route set by the route guidance setting unit 76 is input to the display position setting unit 74.

  Further, the display position setting unit 74 receives detection signals of each element from the driver state detection unit 80.

  The driving support control unit 70 further includes a superimposed display control unit 77, a dangerous situation determination unit 78, and an alarm signal generation unit 79.

  The target travel route of the vehicle 1 is set by the route guidance setting unit 76 based on, for example, the navigation information from the navigation system 30 and the vehicle position information from the vehicle position information detection unit 50, as described above. At that time, the surrounding state detection input unit 72 and the travelable region calculation unit 75 also calculate a safe travelable region that avoids obstacles and threats based on the surrounding state information, and this can also be reflected in the target travel route set in the route guidance setting unit 76. The "target travel route" is displayed on the display screen in the expression format "target travelable region", as described later.

  The predicted travel locus data calculated by the predicted travel locus calculation unit 73 and the target travel route (target travelable region) data set by the route guidance setting unit 76 are input to the display position setting unit 74. The display position setting unit 74 sets the display position in accordance with the traveling state, such as differences in vehicle speed, turning, or straight travel. The display position data of the predicted travel locus and the target travelable area set by the display position setting unit 74 are converted into a display signal by the superimposed display control unit 77, sent to the head-up display device 14, and displayed on the display part 13A of the device 14.

  The detection signals from the sensors of the vehicle state detection unit 40, which detects the behavior of the vehicle 1 in the traveling state, are used to calculate the predicted traveling locus of the vehicle 1 as accurately as possible. The detection signals shown in FIG. 2 are the main examples used to calculate the motion of the vehicle 1; in addition, by detecting the tire slip angle, the vehicle body slip angle, the vertical load on each wheel, the road surface friction coefficient μ, and the like, a predicted traveling locus with still higher accuracy can be calculated.

  In addition, the elements of the vehicle position information detection unit 50 are not limited to position information from GPS; by also importing road information from infrastructure such as VICS and ETC, and by obtaining new map and news information over mobile phone and radio links, a more accurate travel route can be set.

  In the above configuration, the safe travelable area is calculated from the surrounding state information from elements such as the CCD camera 61 of the surrounding state detection unit 60 and reflected in the setting of the target travel route. The dangerous situation determination unit 78 determines the danger level of an obstacle or threat using the surrounding state information; when it determines that a situation is dangerous, a danger signal is sent via the alarm signal generation unit 79 to the alarm generation unit 90, whose elements are driven to issue an alarm to the driver 11, or the signal is supplied to the head-up display device 14 via the superimposed display control unit 77.
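The criterion applied by the dangerous situation determination unit 78 is not given above. One common, purely illustrative choice is a time-to-collision (TTC) threshold; the function below is a hypothetical sketch under that assumption, with both the TTC criterion and the threshold value chosen for illustration rather than taken from the patent:

```python
def is_dangerous(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Hypothetical danger check in the spirit of the dangerous
    situation determination unit 78: flag an obstacle whose
    time-to-collision falls below a threshold.
    """
    if closing_speed_mps <= 0.0:
        # Obstacle is not getting closer: no collision predicted.
        return False
    return distance_m / closing_speed_mps < ttc_threshold_s
```

A positive result would correspond to driving the alarm generation unit 90 (speaker 91, meter display 92) or emphasizing the display marks, as described above.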

  FIG. 3 shows a first example of an image displayed on the portion 13A of the front window 13 of the head-up display device 14. 101 is a two-lane road, 102 is a group of vehicles ahead, 103 is trees, and 104 is a boundary fence. The road 101, the forward vehicles 102, the trees 103, and the boundary fence 104 enter the field of view of the driver 11 through the front window 13 as the scenery ahead in the traveling direction of the vehicle.

  In the region of the portion 13A of the front window 13, the target travelable area 111 is further shown as a horizontal bar, and the predicted travel locus 112 is shown as a linear array of triangular marks. In addition, a special mark, for example a round shape, is drawn as the predicted travel position 113 at the point where the target travelable area 111 and the predicted travel locus 112 intersect; the predicted travel position 113 is displayed with increased brightness. In the portion 13A of the front window 13, the target travelable area 111 and the predicted travel locus 112 are displayed superimposed on the scenery in front of the vehicle 1. Both images are displayed in the vicinity of the forward gaze distance, corresponding to the vehicle speed, at which the driver 11 is generally said to gaze while driving. In the example of FIG. 3, the target travelable area 111 is expressed as a band-shaped horizontal bar, with the recommended travel area 111a drawn in a dark color and the safe travelable area 111b in a slightly lighter color.

  When the driver 11 steers the steering wheel 21 left and right, the display position of the predicted traveling locus 112 and the predicted traveling position 113 shown in the part 13A of the front window 13 changes as indicated by an arrow 114 accordingly.

  FIG. 4 shows a second example of the image, on the same display screen in the portion 13A as in FIG. 3, when the steering wheel 21 is turned slightly to the left to change the traveling lane of the vehicle 1 to the left lane. Elements that are the same as those shown in FIG. 3 are given the same reference numerals. In the display example of FIG. 4, the target travelable area 111 does not change, but because the steering wheel 21 is steered slightly leftward to change lanes, the predicted travel locus 112 and the predicted travel position 113 gradually move to the left, based on the behavior of the vehicle 1 detected by the vehicle state detection unit 40. That is, when the driver 11 operates the steering wheel 21, the predicted travel position 113 moves according to the operation, and the future predicted travel locus 112 is taught to the driver 11.

  The marks such as the target travelable area 111, the predicted travel locus 112, and the predicted travel position 113 may be expressed by changing colors, or other marks such as lines and arrows may be used.

  FIG. 5 shows a third example of an image displayed on the portion 13A of the front window 13 of the head-up display device 14. 201 is a one-lane road. The scenery of the road 201 and both of its sides enters the view of the driver 11 through the front window 13 as the scenery ahead in the traveling direction of the vehicle. In FIG. 5, 21 is the steering wheel, 111 is the target travelable area, 112 is the predicted travel locus, and 113 is the predicted travel position. In the third video example, the predicted travel position 113 is shown in relation to the planned travel course on the road 201.

  Further, danger information, such as obstacles, obtained by the dangerous situation determination unit 78 as described above is supplied to the head-up display device 14 via the superimposed display control unit 77. In the display portion 13A shown in FIGS. 3 to 5, a display mark such as the target travelable area 111 is then emphasized by changing its color to red or by flashing, so that the driver 11 is informed of the danger. It is also possible to issue a warning on the meter display 92 or by sound from the speaker 91. Further, using the parts operated by the driver 11, such as the steering 93 and the brake 94, it is possible to apply a vibration or to operate the system in a direction that avoids the dangerous situation, thereby giving a warning or effecting avoidance.

  FIG. 6 shows a fourth example of an image displayed on the portion 13A of the front window 13 of the head-up display device 14. 301 is a three-lane straight road. The scenery of the road 301 and both of its sides enters the field of view of the driver 11 through the front window 13 as the scenery ahead in the traveling direction of the vehicle 1. In FIG. 6, 21 is the steering wheel, 111A is a first target travelable area, 111B is a second target travelable area, and 113 is the predicted travel position. The first target travelable area 111A is the target travelable area when the vehicle 1 travels at low speed; the second target travelable area 111B is the target travelable area when the vehicle 1 travels at high speed. In the fourth video example, the display position of the target travelable area is moved nearer or farther in front of the vehicle, changing in size according to the vehicle speed of the vehicle 1. Thus, the display position of the target travelable area can be changed according to the vehicle speed.
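The speed-dependent placement of the target travelable area described above can be sketched, for illustration, as a look-ahead distance that grows with vehicle speed and is clamped to a minimum; the constants and the clamping rule below are assumptions for this sketch, not values from the patent:

```python
def gaze_distance(speed_kmh, lookahead_time_s=2.0, min_dist_m=15.0):
    """Illustrative forward distance at which to place the target
    travelable area on the display: roughly the distance covered
    in a fixed look-ahead time, with a lower bound for low speeds.
    """
    dist = (speed_kmh / 3.6) * lookahead_time_s  # convert km/h to m/s
    return max(dist, min_dist_m)
```

At high speed the area would thus be drawn farther ahead (like area 111B in FIG. 6) and at low speed nearer to the vehicle (like area 111A).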

  FIG. 7 shows a fifth example of an image displayed on the portion 13A of the front window 13 of the head-up display device 14. 401 is a two-lane straight road, and 402 is a road that allows a left turn. The driver 11 turns the steering wheel 21 to the left and performs a turning operation to change course from the road 401 to the road 402. The roads 401 and 402 and the scenery on both sides enter the field of view of the driver 11 through the front window 13 as the scenery ahead in the traveling direction of the vehicle 1. In FIG. 7, when the driver 11 turns the steering wheel 21 to the left to turn from the road 401 onto the road 402, the viewpoint of the driver 11 is directed in the left-turn direction, and the predicted travel locus 112 and the predicted travel position 113 are displayed with their positions shifted to the left. The same applies when turning right.

  The display contents in the part 13A of the front window 13 based on the head-up display device 14, that is, the target travelable area 111, the predicted travel locus 112, and the predicted travel position 113 can be changed as follows.

  In the normal display, the target travelable area set by the travelable area calculation unit 75 based on the signals from the surrounding state detection unit 60 is preferably displayed with its shape, color, brightness, and the like changed according to the degree of safety, so as to teach the driver 11 the target travel route in an easy-to-understand manner.

  The surrounding state detection input unit 72 detects external conditions such as nighttime (light switch, solar radiation sensor), rain (wiper switch, raindrop detection sensor), and snowfall (image, temperature), and sends this information to the display position setting unit 74. Based on this information, the display position setting unit 74 preferably changes the display color, brightness, and the like so that the display is easily perceived by the driver 11 under each field-of-view condition.

  Further, the seat position, the height of the viewpoint, the position of the forward gazing point, and the like, which differ from driver to driver, are detected by the seat sensors and the viewpoint detection sensor of the driver state detection unit 80 and input to the display position setting unit 74. The display position setting unit 74 estimates the forward gazing point position based on this information and displays the images at the forward position appropriate for the driver 11. If the driver 11 wishes to change the forward display position, brightness, and the like according to habit or preference, adjustments can also be made with the setting switches.

  According to the vehicle driving support device 10 having the above-described configuration, the following operational effects are produced, and the vehicle 1 can be easily operated.

  The current structure of the steering system has the characteristic that, in the relationship between the course of the vehicle 1 and the steering angle of the steering wheel 21, the lateral position of the vehicle corresponds to the double integral of the steering wheel angle. Therefore, with the current steering system, a person driving must input the steering wheel angle while anticipating the change in the vehicle's lateral position produced by this double integration, which is difficult for the driver. Driving schools aim to teach this skill through experience, but in actual driving the response of the vehicle varies with the vehicle's speed, motion state, road surface condition, and so on, so it is difficult for an inexperienced driver, or a driver who is poor at such anticipation, to steer the vehicle to the target lateral position.

  According to the vehicle driving support device 10 of the present embodiment, the computer system calculates this double integral with respect to the steering angle of the steering wheel 21 in place of the driver 11, and teaches the driver 11 the predicted future arrival position. Since the driver can therefore grasp quantitatively whether the steering wheel angle is excessive or insufficient relative to the target travel position, driving becomes very easy even for a driver unaccustomed to it.
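The "double integration" relation described above can be made concrete with a small-angle bicycle-model sketch: the steering angle sets the yaw rate, whose integral gives the heading, and speed times heading integrates again into the lateral position. All model details below (function name, the small-angle bicycle model itself, and its parameters) are illustrative assumptions, not the patent's actual computation:

```python
def lateral_position(steer_angles, speed, wheelbase, dt):
    """Accumulate lateral displacement y from a sequence of steering
    wheel angles (rad), under a small-angle kinematic bicycle model:
      yaw_rate = speed * delta / wheelbase   (steering sets yaw rate)
      heading  = integral of yaw_rate        (first integration)
      y        = integral of speed * heading (second integration)
    """
    heading = 0.0
    y = 0.0
    for delta in steer_angles:
        yaw_rate = speed * delta / wheelbase  # small-angle approximation
        heading += yaw_rate * dt              # first integration
        y += speed * heading * dt             # second integration
    return y
```

Because y depends on the steering input through two integrations, a constant small steering angle produces a quadratically growing lateral offset, which is exactly the response the driver would otherwise have to anticipate by experience.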

  Furthermore, not only in a steering system with a conventional mechanical structure but also in a control system using a control stick or an operation controller implemented with steer-by-wire (SBW), the driver must predict the lateral position of the vehicle with respect to the input; prediction and tracking are even more difficult in an SBW system designed to reduce the amount of input.

  On the other hand, according to the vehicle driving support device 10 of the present embodiment, the target can be tracked in the forward view, so these problems are solved and the operation burden can be greatly reduced while the advantages of the SBW system are retained.

  Furthermore, in a steering system of the azimuth (heading) feedback type, one of the two integrations is performed by the system when a person drives, which makes steering easier. Applying the present invention to such a system allows the remaining integration to be covered by the teaching display, so that the vehicle can be maneuvered even more easily.
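The division of labor in a heading-feedback system can be illustrated as follows: the first integration (input to heading) is closed by the system, leaving only the heading-to-lateral-position integration for the driver or the teaching display. Gains and names are assumptions:

```python
# Illustrative model of a heading (azimuth) feedback steering system: the
# controller closes the first integration (input -> heading), so only the
# heading -> lateral-position integration remains; all gains are assumptions.
def heading_feedback_step(cmd_heading, heading, y, v=15.0, dt=0.1, gain=2.0):
    heading += gain * (cmd_heading - heading) * dt  # system tracks commanded heading
    y += v * heading * dt                           # the one remaining integration
    return heading, y

heading, y = 0.0, 0.0
for _ in range(50):  # command a 0.05 rad heading for 5 s
    heading, y = heading_feedback_step(0.05, heading, y)
# heading converges to the command; lateral position is now only its single
# integral, which is the easier relationship the paragraph above refers to.
```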

  The configurations described in the above embodiments are shown only to the extent necessary to understand and implement the present invention. The present invention is therefore not limited to the described embodiments and can be variously modified without departing from the scope of the technical idea set forth in the claims.

  The present invention is applied to the steering system of a vehicle such as an automobile to reduce the driver's steering burden, and is used to realize a simple method of steering when driving a vehicle.

FIG. 1 is a configuration diagram illustrating the overall configuration of a vehicle driving support apparatus according to an embodiment of the present invention. FIG. 2 is a block diagram showing the internal structure of the driving support control unit of the vehicle driving support apparatus according to the embodiment, together with its input and output units. FIG. 3 is a diagram showing a first example of the image displayed on the front window by the head-up display device in the vehicle driving support apparatus. FIG. 4 is a diagram showing a second example of the displayed image. FIG. 5 is a diagram showing a third example of the displayed image. FIG. 6 is a diagram showing a fourth example of the displayed image. FIG. 7 is a diagram showing a fifth example of the displayed image.

Explanation of symbols

DESCRIPTION OF SYMBOLS 1 Vehicle, 10 Vehicle driving support device, 11 Driver, 13 Front window, 14 Head-up display device, 30 Navigation system, 40 Vehicle state detection unit, 50 Own vehicle position information detection unit, 60 Ambient state detection unit, 70 Driving support control unit, 80 Driver state detection unit, 90 Alarm generation unit

Claims (3)

  1. A display device for displaying vehicle travel information on a display screen;
    Target travel route creating means for creating information on a target travel route suitable for traveling of the vehicle;
    Predicted travel locus creating means for creating information on the predicted travel locus of the vehicle based on a behavior signal output from the behavior detection means of the vehicle;
    First display control means for displaying on the display screen the target travel route created by the target travel route creating means as a target travelable area;
    Second display control means for displaying the predicted traveling locus created by the predicted traveling locus creating means on the display screen;
    A vehicle driving support apparatus comprising:
  2.   2. The vehicle driving support apparatus according to claim 1, wherein the display device is a head-up display, and the target travelable area and the predicted travel locus are superimposed and displayed on the head-up display.
  3.   The vehicle driving support device according to claim 2, wherein the target travelable region and the predicted travel locus are displayed in the vicinity of a driver's forward gazing point on the head-up display.
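The structure recited in claim 1 can be sketched as separate creation and display-control steps: a target travelable area from the target travel route creating means and a predicted travel locus from the behavior signal, superimposed as on a head-up display. Every class, field, and function name below is illustrative, not from the patent:

```python
# Sketch of the claimed structure: target travelable area plus predicted
# travel locus, combined by the display control steps. Names are illustrative.
from dataclasses import dataclass

@dataclass
class TargetArea:
    center_y: float    # lateral center of the target travelable area (m)
    half_width: float  # half-width of the area (m)

def predicted_locus(delta, v=15.0, wheelbase=2.7, steps=30, dt=0.1):
    """Predicted travel locus (lateral offsets) from the steer-angle behavior signal."""
    heading, y, pts = 0.0, 0.0, []
    for _ in range(steps):
        heading += (v / wheelbase) * delta * dt
        y += v * heading * dt
        pts.append(y)
    return pts

def superimpose(area, locus):
    """Combine both layers, as the first and second display control means do."""
    end = locus[-1]
    inside = area.center_y - area.half_width <= end <= area.center_y + area.half_width
    return {"locus_end": end, "within_target": inside}
```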
JP2006094529A 2006-03-30 2006-03-30 Vehicle driving support device Expired - Fee Related JP4847178B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006094529A JP4847178B2 (en) 2006-03-30 2006-03-30 Vehicle driving support device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2006094529A JP4847178B2 (en) 2006-03-30 2006-03-30 Vehicle driving support device

Publications (2)

Publication Number Publication Date
JP2007272350A true JP2007272350A (en) 2007-10-18
JP4847178B2 JP4847178B2 (en) 2011-12-28

Family

ID=38675124

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006094529A Expired - Fee Related JP4847178B2 (en) 2006-03-30 2006-03-30 Vehicle driving support device

Country Status (1)

Country Link
JP (1) JP4847178B2 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009119008A1 (en) 2008-03-28 2009-10-01 Kabushiki Kaisha Toshiba Image display apparatus and method for displaying an image
WO2010098449A1 (en) * 2009-02-27 2010-09-02 トヨタ自動車株式会社 Movement trajectory generator
WO2011033766A1 (en) * 2009-09-16 2011-03-24 株式会社 東芝 Display device for image information
JP2011096064A (en) * 2009-10-30 2011-05-12 Equos Research Co Ltd Driving assist system
JP2011192070A (en) * 2010-03-15 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2014010800A (en) * 2012-07-03 2014-01-20 Alpine Electronics Inc On-vehicle system
CN103707812A (en) * 2012-09-28 2014-04-09 富士重工业株式会社 Visual guidance system
JP2014071023A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071019A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071021A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071022A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014075079A (en) * 2012-10-05 2014-04-24 Denso Corp Display device
KR101451859B1 (en) * 2012-12-20 2014-10-16 에스엘 주식회사 Image processing device and method thereof
WO2014174575A1 (en) * 2013-04-22 2014-10-30 トヨタ自動車株式会社 Vehicular head-up display device
WO2015001815A1 (en) * 2013-07-05 2015-01-08 クラリオン株式会社 Drive assist device
JP2016045705A (en) * 2014-08-22 2016-04-04 トヨタ自動車株式会社 In-vehicle device, in-vehicle device control method, and in-vehicle device control program
WO2016152553A1 (en) * 2015-03-26 2016-09-29 修一 田山 Vehicle image display system and method
DE102017105221A1 (en) 2016-03-31 2017-10-05 Subaru Corporation Vehicle control device and vehicle
WO2018051573A1 (en) * 2016-09-16 2018-03-22 富士フイルム株式会社 Projective display device, display control method, and display control program
WO2018078732A1 (en) * 2016-10-26 2018-05-03 三菱電機株式会社 Display control apparatus, display apparatus, and display control method
WO2019053987A1 (en) * 2017-09-15 2019-03-21 マクセル株式会社 Information display device
WO2019073935A1 (en) * 2017-10-10 2019-04-18 マクセル株式会社 Information display device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08178679A (en) * 1994-12-19 1996-07-12 Honda Motor Co Ltd On-vehicle display device
JPH08241493A (en) * 1995-03-02 1996-09-17 Honda Motor Co Ltd Calculation and display device for predictive running locus of vehicle
JPH097100A (en) * 1995-06-19 1997-01-10 Honda Motor Co Ltd Driver's gaze point predicting device and vehicle drive support system using the same
JPH10176928A (en) * 1996-12-18 1998-06-30 Hitachi Ltd Viewpoint position measuring method and device, head-up display, and mirror adjustment device
JP2003112589A (en) * 2001-10-04 2003-04-15 Mazda Motor Corp Driving support device
JP2005202787A (en) * 2004-01-16 2005-07-28 Denso Corp Display device for vehicle
JP2006190237A (en) * 2004-12-10 2006-07-20 Toyota Motor Corp Direction change supporting system

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08178679A (en) * 1994-12-19 1996-07-12 Honda Motor Co Ltd On-vehicle display device
JPH08241493A (en) * 1995-03-02 1996-09-17 Honda Motor Co Ltd Calculation and display device for predictive running locus of vehicle
JPH097100A (en) * 1995-06-19 1997-01-10 Honda Motor Co Ltd Driver's gaze point predicting device and vehicle drive support system using the same
JPH10176928A (en) * 1996-12-18 1998-06-30 Hitachi Ltd Viewpoint position measuring method and device, head-up display, and mirror adjustment device
JP2003112589A (en) * 2001-10-04 2003-04-15 Mazda Motor Corp Driving support device
JP2005202787A (en) * 2004-01-16 2005-07-28 Denso Corp Display device for vehicle
JP2006190237A (en) * 2004-12-10 2006-07-20 Toyota Motor Corp Direction change supporting system

Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8451111B2 (en) 2008-03-28 2013-05-28 Kabushiki Kaisha Toshiba Image display apparatus and method for displaying an image
JP2009244355A (en) * 2008-03-28 2009-10-22 Toshiba Corp Image display and image display method
KR101104503B1 (en) * 2008-03-28 2012-01-12 가부시끼가이샤 도시바 Image display apparatus and method for displaying an image
WO2009119008A1 (en) 2008-03-28 2009-10-01 Kabushiki Kaisha Toshiba Image display apparatus and method for displaying an image
JP2010198578A (en) * 2009-02-27 2010-09-09 Toyota Motor Corp Movement locus generating device
JP4614005B2 (en) * 2009-02-27 2011-01-19 トヨタ自動車株式会社 Moving locus generator
US8983679B2 (en) 2009-02-27 2015-03-17 Toyota Jidosha Kabushiki Kaisha Movement trajectory generator
US9417080B2 (en) 2009-02-27 2016-08-16 Toyota Jidosha Kabushiki Kaisha Movement trajectory generator
WO2010098449A1 (en) * 2009-02-27 2010-09-02 トヨタ自動車株式会社 Movement trajectory generator
CN102334151A (en) * 2009-02-27 2012-01-25 丰田自动车株式会社 Movement trajectory generator
CN102334151B (en) * 2009-02-27 2014-07-09 丰田自动车株式会社 Movement trajectory generator
WO2011033766A1 (en) * 2009-09-16 2011-03-24 株式会社 東芝 Display device for image information
JP2011096064A (en) * 2009-10-30 2011-05-12 Equos Research Co Ltd Driving assist system
JP2011192070A (en) * 2010-03-15 2011-09-29 Honda Motor Co Ltd Apparatus for monitoring surroundings of vehicle
JP2014010800A (en) * 2012-07-03 2014-01-20 Alpine Electronics Inc On-vehicle system
JP2014071019A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071020A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071021A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
JP2014071022A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
CN103707812A (en) * 2012-09-28 2014-04-09 富士重工业株式会社 Visual guidance system
JP2014071023A (en) * 2012-09-28 2014-04-21 Fuji Heavy Ind Ltd Sight line guidance system
US9267808B2 (en) 2012-09-28 2016-02-23 Fuji Jukogyo Kabushiki Kaisha Visual guidance system
US9771022B2 (en) 2012-10-05 2017-09-26 Denso Corporation Display apparatus
JP2014075079A (en) * 2012-10-05 2014-04-24 Denso Corp Display device
US9216684B2 (en) 2012-10-05 2015-12-22 Denso Corporation Display apparatus
US9475420B2 (en) 2012-10-05 2016-10-25 Denso Corporation Display apparatus
KR101451859B1 (en) * 2012-12-20 2014-10-16 에스엘 주식회사 Image processing device and method thereof
WO2014174575A1 (en) * 2013-04-22 2014-10-30 トヨタ自動車株式会社 Vehicular head-up display device
WO2015001815A1 (en) * 2013-07-05 2015-01-08 クラリオン株式会社 Drive assist device
CN105324267A (en) * 2013-07-05 2016-02-10 歌乐株式会社 Drive assist device
US9827907B2 (en) 2013-07-05 2017-11-28 Clarion Co., Ltd. Drive assist device
JP6051307B2 (en) * 2013-07-05 2016-12-27 クラリオン株式会社 Driving assistance device
JP2016045705A (en) * 2014-08-22 2016-04-04 トヨタ自動車株式会社 In-vehicle device, in-vehicle device control method, and in-vehicle device control program
US10436600B2 (en) 2015-03-26 2019-10-08 Shuichi Tayama Vehicle image display system and method
WO2016152553A1 (en) * 2015-03-26 2016-09-29 修一 田山 Vehicle image display system and method
DE102017105221A1 (en) 2016-03-31 2017-10-05 Subaru Corporation Vehicle control device and vehicle
JPWO2018051573A1 (en) * 2016-09-16 2019-08-08 富士フイルム株式会社 Projection display device, display control method, and display control program
WO2018051573A1 (en) * 2016-09-16 2018-03-22 富士フイルム株式会社 Projective display device, display control method, and display control program
WO2018078732A1 (en) * 2016-10-26 2018-05-03 三菱電機株式会社 Display control apparatus, display apparatus, and display control method
WO2019053987A1 (en) * 2017-09-15 2019-03-21 マクセル株式会社 Information display device
WO2019073935A1 (en) * 2017-10-10 2019-04-18 マクセル株式会社 Information display device

Also Published As

Publication number Publication date
JP4847178B2 (en) 2011-12-28

Similar Documents

Publication Publication Date Title
JP4342146B2 (en) Parking assistance device
US10293826B2 (en) Systems and methods for navigating a vehicle among encroaching vehicles
US8346426B1 (en) User interface for displaying internal state of autonomous driving system
US6735517B2 (en) Windshield display for a navigation system
US7216035B2 (en) Method and device for displaying navigational information for a vehicle
DE102008004160B4 (en) Towing vehicle for a trailer or semi-trailer and method of assisting a driver in maneuvering a towing vehicle rearward
EP1796392B1 (en) Apparatus for assisting steering of vehicle when backing
US9766626B1 (en) System and method for predicting behaviors of detected objects through environment representation
JP4630066B2 (en) Navigation device
EP2608186A2 (en) Trailer backing path prediction using GPS and camera images
EP3016835B1 (en) Vehicle control system
DE10131720B4 (en) Head-Up Display System and Procedures
DE102013200462A1 (en) Autonomous circuit control system
EP1222441B2 (en) Camera system for vehicles
DE102011121948A1 (en) Perspective on actions of an autonomous driving system
US20090312888A1 (en) Display of a relevant traffic sign or a relevant traffic installation
US7617037B2 (en) System for automatically monitoring a motor vehicle
US9988047B2 (en) Vehicle control system with traffic driving control
US8354944B2 (en) Night vision device
JP2006284458A (en) System for displaying drive support information
CN104057956B (en) The display system of autonomous land vehicle and method
US20060080005A1 (en) Method for displaying a vehicle driving space
US7688221B2 (en) Driving support apparatus
JP2007223338A (en) Display system, moving body, display method, display program and storage medium thereof
WO2012043092A1 (en) Parking assistance device

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081127

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20101012

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20101209

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20111011

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20111013

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20141021

Year of fee payment: 3

LAPS Cancellation because of no payment of annual fees