WO2017082067A1 - Image display system for vehicle - Google Patents


Info

Publication number
WO2017082067A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
automobile
eye image
display system
Prior art date
Application number
PCT/JP2016/081866
Other languages
French (fr)
Japanese (ja)
Inventor
修一 田山
Original Assignee
修一 田山
株式会社イマージュ
Application filed by 修一 田山 and 株式会社イマージュ
Publication of WO2017082067A1 publication Critical patent/WO2017082067A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to an image display system used to support the driving of vehicles such as automobiles.
  • A vehicle information transmission device is known for intuitively conveying dangers and navigation information around the host vehicle to the driver (see, for example, Patent Document 1).
  • This device detects dangerous objects around the front of the vehicle, such as pedestrians, bicycles, and other vehicles, using an on-board camera or radar sensor, and displays them as graphics on the instrument panel display so that the driver recognizes their presence.
  • Also known is a vehicle periphery monitoring device that displays the presence and type of a detection object that is highly likely to contact the host vehicle on an image display device including a HUD (head-up display), and thereby notifies the driver.
  • This device determines the type of the detection object (pedestrian, bicycle, animal, etc.) and displays on the HUD a mark whose form differs according to the type, together with a rectangular frame surrounding the position of the detection object.
  • According to the conventional technology of Patent Document 2, the driver does not need to shift the line of sight because information is displayed on the front windshield, which is more advantageous than the technology of Patent Document 1.
  • The present invention has been made to solve the above problems, and an object thereof is to provide an automobile image display system capable of displaying information on the front windshield in a manner that is easy for the driver to understand.
  • The present invention is an image display system for an automobile that displays information provided to an occupant of the automobile as an image, comprising detecting means for detecting the provided information, drawing means for generating a right-eye image and a left-eye image that constitute a stereoscopic image of the provided information, and display means for projecting those images.
  • In one aspect, the detection means is a camera that captures the situation around the automobile, and the captured image is used as the provided information.
  • This camera may be a stereo camera that captures a right-eye image and a left-eye image.
  • Alternatively, the detection means is a sensor that detects the situation around the automobile or its running state, and the drawing means converts the information provided by the sensor into a drawn image that can be displayed visually.
  • Preferably, the drawn image is displayed in association with a specific detection target in the real scene visible through the front windshield, so that the occupant views that detection target with due caution.
  • When the display means projects a right-eye image and a left-eye image having a binocular parallax such that the drawn image is perceived at substantially the same depth as the detection object seen through the front windshield, the occupant can visually recognize the detection target more easily and reliably.
  • The detection means may also be a navigation device that identifies the current position of the vehicle and guides the traveling direction using surrounding map data; it is preferable to display a drawn image of this guidance in association with the detection object.
  • The detection means may also be a communication device that acquires the above provided information about nearby vehicles by vehicle/vehicle communication (hereinafter "vehicle-to-vehicle communication") or road facility/vehicle communication (hereinafter "road-to-vehicle communication"). Vehicle-to-vehicle communication is performed by short-range wireless communication with other vehicles, while road-to-vehicle communication is the exchange of various information with roadside equipment. The situation of other vehicles can be obtained with high accuracy by either means.
  • Since the provided information is projected to the occupant as a right-eye image and a left-eye image, the occupant can view it as a stereoscopic image. The occupant can therefore view the provided information while the eyes remain focused ahead of the front windshield, so there is no need to refocus the eyes to confirm the information and no risk of overlooking it.
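The perceived depth of the projected image is set by the horizontal disparity between the right-eye and left-eye images. A minimal sketch of that relation follows; the 6.4 cm interpupillary distance, projection-plane distance, and pixel scaling are illustrative assumptions, not values from this disclosure.

```python
# Hypothetical sketch: pixel disparity needed so that a projected image
# is perceived at a chosen depth beyond the windshield. All numeric
# defaults are illustrative assumptions.

def disparity_px(depth_m: float, ipd_m: float = 0.064,
                 screen_dist_m: float = 0.8, px_per_m: float = 4000.0) -> float:
    """Pixel offset between right-eye and left-eye images so the fused
    image is perceived at depth_m in front of the viewer."""
    if depth_m <= screen_dist_m:
        raise ValueError("virtual depth must lie beyond the projection plane")
    # Similar triangles: the on-screen separation shrinks toward zero as
    # the virtual point approaches the screen plane, and grows toward the
    # full interpupillary separation at infinity.
    sep_m = ipd_m * (depth_m - screen_dist_m) / depth_m
    return sep_m * px_per_m

print(round(disparity_px(1e9), 1))   # a point near infinity: 256.0 px
```

Making the parallax match the depth of a real detection object, as claimed above, amounts to evaluating this function at that object's distance.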
  • FIG. 1 is a block diagram showing the overall configuration of an automobile image display system according to the present invention.
  • FIG. 2 is a plan view showing an example of an automobile in which the cameras and sensors of the image display system according to the present invention are arranged.
  • FIG. 3 is a conceptual diagram of the front windshield and dashboard of an automobile equipped with the image display system according to the present invention, viewed from the driver's seat side.
  • FIG. 4 is a schematic explanatory view of image display by the HUD device.
  • FIG. 5 is a schematic diagram explaining an example of the display on the front windshield by the image display system according to the present invention.
  • FIG. 6 is a schematic diagram explaining another example of the display on the front windshield by the image display system according to the present invention.
  • FIG. 7 is a block diagram showing the overall configuration of a driving support system incorporating the image display system according to the present invention.
  • FIG. 8 shows the operation of the automobile image display.
  • FIG. 9 is a schematic diagram explaining an example of the display on the front windshield in the driving support system.
  • FIG. 10 is a schematic diagram explaining another example of the display on the front windshield in the driving support system.
  • FIG. 11 is a schematic diagram explaining a further example of the display on the front windshield in the driving support system.
  • FIG. 1 is a block diagram showing a schematic configuration of an automobile image display system according to the present invention.
  • the vehicle image display system 1 includes a detection unit 2, a display control unit 3, and a head-up display (HUD) device 4 that is a display unit.
  • the detection unit 2 includes a camera unit 2A and a sensor unit 2B that monitor the situation around the automobile.
  • The captured image from the camera unit 2A and the detection information from the sensor unit 2B constitute the information provided to the vehicle occupant; the detection unit 2 thus serves as the detecting means that detects the provided information.
  • the camera unit 2A has, for example, three cameras 17, 18, and 19 as shown in FIG. 2 for photographing the situation around the automobile.
  • the camera 17 is installed on the upper part of the front windshield 15 and photographs the front view of the automobile 10.
  • the camera 18 is disposed on the back windshield and photographs the rear view of the automobile 10.
  • the sensor unit 2B includes sensors 20 and 21 for detecting the presence of an automobile, a bicycle, a person, an animal, or the like (hereinafter referred to as “detection object”) around the automobile.
  • the sensors 20 and 21 are, for example, radar sensors. As shown in FIG. 3, the sensor 20 is attached to the lower part of the left and right door mirrors 35 and detects the presence of the detection object at the left and right rear.
  • the sensor 21 is attached to the central portion of the vehicle body at the front of the automobile 10 and detects the presence of a detection object existing in front.
  • As the radar sensors, a millimeter-wave radar, microwave radar, laser radar, infrared sensor, ultrasonic sensor, or the like is used.
  • the steering angle sensor 24, the vehicle speed sensor 25, the gear position sensor 26, and the like provided in the traveling system of the automobile 10 also constitute the detection unit 2 here.
  • the display control unit 3 displays the information acquired by the detection unit 2 on the HUD device 4 as a stereoscopic image.
  • Images to be displayed include the images captured by the cameras 17, 18, and 19, and drawn images that visually represent the detection information from the sensors 20 and 21, the steering angle sensor 24, and the vehicle speed sensor 25 in the form of figures, icons, letters, or numbers.
  • the display control unit 3 includes an image memory M in which these various drawn images are stored in advance.
  • Each of the cameras 17, 18, and 19 is a stereo camera in which binocular lenses corresponding to the left and right human eyes are housed in a single housing; the display control unit 3 outputs the right-eye image and left-eye image input from the stereo camera to the HUD device 4.
  • The cameras 17, 18, and 19 are not limited to stereo cameras; they may be monocular cameras from which stereo images are obtained by computer processing.
  • To display a drawn image, the display control unit 3 selects, from those stored in advance in the image memory M, the images, characters, and numbers suitable for visually representing the detection information from the sensors 20 and 21, the steering angle sensor 24, the vehicle speed sensor 25, and the gear position sensor 26. It then creates a right-eye image and a left-eye image from the selected drawn image and outputs them to the HUD device 4. In this case, the display control unit 3 accordingly functions as the drawing means that converts the provided information into a visually displayable drawn image.
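The selection step described above can be sketched as a simple lookup that pairs a pre-stored drawn image with opposite horizontal shifts for the two eyes. The memory keys standing in for image memory M and the shift convention are assumptions for illustration.

```python
# Illustrative stand-in for image memory M: pre-stored drawn images
# keyed by the kind of sensor information to display. Keys are assumed.
IMAGE_MEMORY = {
    "speed": "speed_digits",
    "gear": "gear_icon",
    "lane_danger": "prohibition_mark",
}

def make_stereo_pair(key: str, disparity_px: int):
    """Look up a drawn image and emit (right-eye, left-eye) versions,
    each paired with its horizontal pixel shift (half the disparity)."""
    drawn = IMAGE_MEMORY[key]
    right = (drawn, +disparity_px // 2)   # right-eye image shifted right
    left = (drawn, -disparity_px // 2)    # left-eye image shifted left
    return right, left

print(make_stereo_pair("speed", 24))   # (('speed_digits', 12), ('speed_digits', -12))
```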
  • The HUD device 4 has a conventionally known structure and mechanism; for example, as shown in FIG. 4, it comprises a right-eye image display unit 6, a left-eye image display unit 7, and a mirror 8.
  • the HUD device 4 is disposed on the upper surface of the dashboard 39 and projects an image toward the front windshield 15. Further, the HUD device 4 may be incorporated in the dashboard 39 or may project an image onto the front windshield 15 from the ceiling of the driver's seat.
  • The right-eye image display unit 6 and the left-eye image display unit 7 of the HUD device 4 are displays for the right-eye image RP and the left-eye image LP, respectively; display devices such as an ELD (electroluminescent display), a fluorescent display tube, an FED (field emission display), or an LCD (liquid crystal display) are used.
  • The mirror 8 of the HUD device 4 is disposed facing the right-eye image display unit 6 and the left-eye image display unit 7, and reflects the right-eye image RP and the left-eye image LP that they respectively display toward the front windshield 15.
  • the mirror 8 projects the right-eye image RP and the left-eye image LP displayed by the right-eye image display unit 6 and the left-eye image display unit 7 at a magnification determined in advance by the mirror shape.
  • The front windshield 15 is curved in both the vertical and horizontal directions according to the outer shape of the automobile 10, and is made of laminated glass, IR-cut glass, UV-cut glass, or the like.
  • The front windshield 15 transmits the driver's line of sight while reflecting the right-eye image RP and the left-eye image LP projected from the HUD device 4 back along the line-of-sight direction.
  • The HUD device 4 projects the right-eye image RP and the left-eye image LP output from the display control unit 3, so that the two images are reflected by the front windshield 15 into the driver's two eyes.
  • When the human eye is focused several meters ahead, even objects at infinity are not greatly out of focus. Consequently, while the driver keeps the eyes focused in the depth direction ahead of the front windshield 15 to check the forward view of the automobile 10, an image formed on the windshield surface itself is more likely to be missed. By configuring the system so that the driver can view the provided information stereoscopically, the information can be seen with the eyes still focused in the depth direction ahead of the front windshield 15, and it is recognized easily and reliably without being overlooked.
  • When the steering angle sensor 24 detects a steering operation, the image captured by the camera 18 at that time can be displayed in 3D on the HUD device 4. As shown in FIG. 4, the driver can then visually recognize the stereoscopic image TD through the front windshield 15 with the eyes focused in the forward depth direction, recognize the presence of, for example, a pedestrian crossing on the right side of the automobile 10, and perform a brake operation.
  • The driver can thus grasp the situation accurately, without the displayed image being confused with the actual scene ahead of the car viewed through the front windshield 15.
  • As necessary, the display control unit 3 may process the right-eye image RP and the left-eye image LP captured by the camera 18 so that the stereoscopic image is reproduced at this depth.
  • The display control unit 3 projects the right-eye image RP and the left-eye image LP captured by the camera 18 via the HUD device 4. The driver can therefore visually recognize the presence of a pedestrian or the like behind the automobile 10 while viewing the situation ahead through the front windshield 15.
  • When the display control unit 3 acquires the current vehicle speed value from the vehicle speed sensor 25, it reads images of the numbers and characters for visually displaying that value from the image memory M, creates a right-eye image RP and a left-eye image LP from them, and displays the drawn image of the speed display 34 ("60 km/h" in the illustrated example) shown in FIG. 5.
  • The display control unit 3 creates the right-eye image RP and the left-eye image LP of the drawn image so that they carry a binocular parallax reproducing the intended depth of the vehicle-speed display beyond the front windshield 15.
  • A drawn image representing the gear state detected by the gear position sensor 26 is likewise displayed as a stereoscopic image based on binocular parallax.
  • When the steering angle sensor 24 detects that the driver has turned the steering wheel 22 to the right or left in order to change lanes, the display control unit 3 checks the detection information from the sensor 20 for a detection object to the rear on that side. If the detected object is larger than a predetermined size, it is judged to be approaching, and the sensor 20 outputs a detection signal for the object to the display control unit 3.
  • The display control unit 3 then reads a drawn image of the prohibition mark 13, which indicates that the lane change is dangerous, from the image memory M and displays it in three dimensions via the HUD device 4.
  • Here, the prohibition mark 13 is displayed on the left side because another vehicle is approaching from the left rear while the driver attempts to change to the left lane.
  • Since the prohibition mark 13 is displayed three-dimensionally, the driver can recognize it while keeping the eyes focused in the forward depth direction through the front windshield 15, and thus reliably perceives the danger of the lane change. A camera may also be arranged in place of the sensor 20 that detects objects to the left and right rear, and the other vehicle behind captured by that camera may be displayed as a stereoscopic image by the HUD device 4.
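The lane-change warning decision described in the preceding paragraphs can be sketched as follows; the size threshold and data shapes are illustrative assumptions, not values from this disclosure.

```python
# Hedged sketch of the lane-change warning: objects sensed behind on the
# side the driver is steering toward are compared against a size
# threshold; any large (approaching) object triggers the prohibition mark.

MIN_OBJECT_SIZE = 1.5   # assumed "predetermined size" threshold

def lane_change_warning(steer_direction: str, rear_objects: dict) -> bool:
    """Return True if the prohibition mark should be displayed for the
    side the driver is steering toward.

    rear_objects maps 'left'/'right' to a list of detected object sizes."""
    if steer_direction not in ("left", "right"):
        return False
    return any(size >= MIN_OBJECT_SIZE
               for size in rear_objects.get(steer_direction, []))

print(lane_change_warning("left", {"left": [2.0], "right": []}))   # True
print(lane_change_warning("right", {"left": [2.0], "right": []}))  # False
```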
  • the above-described image display system 1 for an automobile according to the present invention is effectively applied to an automobile 10 provided with a driving support system for supporting safe driving.
  • the embodiment will be described below.
  • Level 1 is called a safe driving support system, where the vehicle performs acceleration, steering, or braking.
  • Level 2 is called a quasi-automatic driving system, in which a vehicle simultaneously performs a plurality of operations of acceleration, steering, and braking.
  • Level 3 is a state in which acceleration, steering, and braking are all performed by the vehicle and the driver responds only in an emergency. This is also called a semi-automatic driving system.
  • Level 4 is called a fully automatic driving system, in which acceleration, steering, and braking are all performed by the vehicle and the driver is not involved at all.
  • “automatic driving” refers to traveling at level 3 or level 4.
  • FIG. 7 schematically shows the overall configuration of a driving support system 20 that combines the automobile image display system 1 of the present embodiment.
  • the driving support system 20 includes a central processing unit 21 including a CPU, a ROM, and a RAM.
  • The central processing unit 21 is connected to the detection unit 2 described in FIG. 1, an HMI (Human Interface) unit 23, the traveling control device 25, and the navigation device 33.
  • the HMI unit 23 includes the touch panel 30 and the monitor 31 (shown in FIG. 3) together with the HUD device 4 described in FIG.
  • the monitor 31 displays, for example, a rear view taken by the camera 18 when the automobile 10 is moved backward.
  • the touch panel 30 is formed by software on the screen of the monitor 31.
  • the traveling control device 25 includes a brake control device 26 and an accelerator control device 27.
  • the brake control device 26 controls a brake operating device (not shown) to automatically decelerate the automobile 10.
  • the accelerator control device 27 automatically accelerates the automobile 10 by controlling a throttle actuator (not shown) that adjusts the throttle opening.
  • the navigation device 33 reads map data around the automobile 10 from the map database based on GPS information received from a GPS (global positioning system) 32.
  • The CPU of the central processing unit 21 executes a control program stored in the ROM, thereby realizing the functions of a pattern matching processing unit 21a, a stop control unit 21b, a follow-up travel control unit 21c, a constant speed travel control unit 21d, a target preceding vehicle determination unit 21e, an automatic driving control unit 21f, and the display control unit 3 described in FIG. 1.
  • The pattern matching processing unit 21a performs pattern matching on the images sent from the cameras 17, 18, and 19, detecting the contours of pedestrians, bicycles, motorcycles, and the like to determine their presence in the images.
  • The ROM includes a pattern data file F in which the contours serving as image features of the detection targets are registered in advance. The ROM also includes the above-described image memory M, in which the drawn images (figures, icons, characters, and numbers) that the display control unit 3 displays on the HUD device 4 are stored in advance.
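The contour matching step can be illustrated with a toy template matcher over binary images. Real systems use far richer features; the shapes and the 0.8 score threshold here are assumptions for illustration.

```python
# Toy sketch of pattern matching: each registered pattern is slid over a
# binary image and the overlap score is thresholded.

def match_pattern(image, template, threshold=0.8):
    """image, template: 2-D lists of 0/1. Return list of (row, col) hits."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    total = sum(map(sum, template))        # pixels in the pattern
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            score = sum(image[r + i][c + j] * template[i][j]
                        for i in range(th) for j in range(tw))
            if total and score / total >= threshold:
                hits.append((r, c))
    return hits

tmpl = [[1, 1], [1, 1]]                    # stand-in for a registered contour
img = [[0, 0, 0, 0],
       [0, 1, 1, 0],
       [0, 1, 1, 0],
       [0, 0, 0, 0]]
print(match_pattern(img, tmpl))            # [(1, 1)]
```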
  • Simultaneously with the pattern matching process, the pattern matching processing unit 21a also detects lane lines (white or yellow lines) based on the luminance information of the image.
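Luminance-based lane-line detection can be sketched for a single scanline; the brightness levels and margin are illustrative assumptions.

```python
# Minimal sketch: pixels in a scanline whose brightness stands out above
# the road surface are taken as candidate white/yellow line positions.

def lane_line_columns(scanline, road_level=60, margin=100):
    """Return column indices whose luminance exceeds road level + margin."""
    return [i for i, lum in enumerate(scanline) if lum > road_level + margin]

row = [55, 58, 230, 235, 60, 57, 220, 228, 59]   # bright pixels = lane paint
print(lane_line_columns(row))                     # [2, 3, 6, 7]
```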
  • The stop control unit 21b is stop control means that calculates the distance to a detection target from the parallax between the left and right images of the camera 17 and controls the stopping of the automobile 10 according to that distance.
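The distance-from-parallax calculation follows the standard stereo relation Z = f · B / d (focal length times baseline over pixel disparity). The focal length and baseline values below are illustrative assumptions, not figures from this disclosure.

```python
# Sketch of the stop control unit's distance estimate from stereo
# disparity. Defaults are assumed values for illustration only.

def stereo_distance_m(disparity_px: float,
                      focal_px: float = 1200.0,     # assumed focal length (px)
                      baseline_m: float = 0.12) -> float:  # assumed lens spacing
    """Distance to the object whose left/right images differ by disparity_px."""
    if disparity_px <= 0:
        raise ValueError("object must have positive disparity")
    return focal_px * baseline_m / disparity_px

print(round(stereo_distance_m(14.4), 2))   # 1200 * 0.12 / 14.4 = 10.0 m
```

Nearer objects produce larger disparities, so the 10-meter stop threshold used later corresponds to a minimum disparity.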
  • The follow-up travel control unit 21c calculates the inter-vehicle distance to the preceding vehicle so that the automobile 10 follows it, compares that distance with the preset inter-vehicle distance, and outputs the result to the traveling control device 25.
  • In the constant speed traveling mode, the constant speed traveling control unit 21d calculates the difference between the current speed and the preset vehicle speed and outputs it to the traveling control device 25 so that the host vehicle travels at a constant speed.
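The constant-speed computation reduces to the signed difference between the set and current speeds handed to the traveling control device. A proportional sketch, with an assumed gain:

```python
# Hedged sketch of the constant speed traveling control unit's output.
# The gain and sign convention are assumptions for illustration.

def speed_command(current_kmh: float, set_kmh: float, gain: float = 0.5) -> float:
    """Positive output requests acceleration, negative requests braking."""
    return gain * (set_kmh - current_kmh)

print(speed_command(55.0, 60.0))   # 2.5  -> accelerate
print(speed_command(65.0, 60.0))   # -2.5 -> brake
```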
  • the target preceding vehicle determination unit 21e determines a preceding vehicle that is a target of the follow-up traveling mode.
  • the automatic operation control unit 21f performs control for automatically driving the automobile 10 at the level 3 or 4 described above.
  • The central processing unit 21 determines whether the driver has performed a steering operation to turn the automobile 10 to the right (step S1), detecting rotation of the steering wheel 22 via the steering angle sensor 24. If no right-turn steering operation is performed ("NO"), the operation of displaying the blind-spot image is not performed.
  • the pattern matching processing unit 21a of the central processing unit 21 acquires the image sent from the camera 19 at this time (step S2).
  • The right-eye image RP and the left-eye image LP captured by the camera 19 are displayed by the HUD device 4. By displaying them stereoscopically with binocular parallax, the driver can visually recognize the image at a position in the depth direction beyond the front windshield 15 on the forward line of sight (step S3).
  • The central processing unit 21 performs pattern matching between the acquired image and the contours of the detection targets registered in the pattern data file F, and determines whether detection targets such as pedestrians, bicycles, or motorcycles are present in the image (step S4). If no detection target is included ("NO"), the operation of displaying the blind-spot image ends.
  • If a detection target is present, the central processing unit 21 uses the stop control unit 21b to calculate the distance to the target from the parallax between the left and right images of the camera 19, and determines whether that distance is within a predetermined range (step S5). For example, when the distance to the detection object exceeds 10 meters ("NO" in step S5), the operation of displaying the blind-spot image ends.
  • When the distance to the detection object is within 10 meters ("YES" in step S5), the central processing unit 21 performs control to stop the automobile 10 via the brake control device 26 (step S6). Thereafter, the central processing unit 21 repeats the processing from step S2; when the detection object and any other detection objects are no longer present ("NO" in step S4), or when those that remain are farther than 10 meters away ("NO" in step S5), the operation of displaying the blind-spot image ends. In this case, the stop control unit 21b releases the stopped state of the automobile 10 via the brake control device 26.
  • Until then, the automobile 10 is maintained in the stopped state. According to the driving support system 20, the automatic stop is thus controlled according to conditions in the blind-spot range that the driver's eyes cannot reach, so the driver's safe driving at a right turn is reliably supported.
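The right-turn blind-spot sequence (steps S1 to S6) can be sketched as a per-frame loop; the function stubs stand in for the real units and are assumptions for illustration.

```python
# Hedged sketch of the blind-spot stop sequence: while steering right,
# acquire the camera image, pattern match for targets, and hold the stop
# while any matched target is within the 10 m range.

def blind_spot_stop(steering_right, frames, detect, distance_m,
                    stop_range_m=10.0):
    """Yield True (hold the stop) or False (release) for each frame."""
    for frame in frames:
        if not steering_right(frame):          # S1: no right-turn steering
            yield False
            continue
        targets = detect(frame)                # S2-S4: image + pattern match
        near = [t for t in targets if distance_m(t) <= stop_range_m]  # S5
        yield bool(near)                       # S6: stop while a target is near

# Toy run: a pedestrian in the first frame only, steering ends at frame "c".
out = list(blind_spot_stop(
    steering_right=lambda f: f != "c",
    frames=["a", "b", "c"],
    detect=lambda f: ["pedestrian"] if f == "a" else [],
    distance_m=lambda t: 8.0))
print(out)   # [True, False, False]
```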
  • The driving support system 20 incorporating the image display system 1 for an automobile according to the present invention is not limited to pillar portions such as the A pillar 6; other vehicle structures such as frames and outer members also create blind spots from inside the vehicle, and cameras can be attached to these surfaces so that their stereo images are displayed on the front windshield 15 by the HUD device 4.
  • the driving support system 20 displays various information in addition to the blind spot image by the A pillar 6 in the form of a stereoscopic image to the driver as a captured image or a drawn image. Several display examples will be described below.
  • the constant speed travel and inter-vehicle distance control of the automobile 10 by the travel support system 20 will be described.
  • the constant speed travel / inter-vehicle distance control in the present embodiment is performed as follows, for example, but of course is not limited to this.
  • the central processing unit 21 starts constant speed traveling and inter-vehicle distance control of the host vehicle when the driver turns on a driving support switch (not shown).
  • the vehicle speed of constant speed driving and the inter-vehicle distance from the preceding vehicle may be set by inputting desired values on the touch panel 30 of the HMI unit 23 immediately before the driver turns on the driving support switch.
  • Alternatively, the previous set values stored in a memory (not shown) of the driving support system 20 can be used as they are.
  • the constant speed traveling / inter-vehicle distance control is performed by switching between the following traveling mode when the preceding vehicle is detected and the constant speed traveling mode when the preceding vehicle is not detected.
  • First, the target preceding vehicle determination unit 21e detects all other preceding vehicles using the front camera 17 and the sensor 21.
  • The target preceding vehicle determination unit 21e detects and stores the inter-vehicle distance, relative speed, bearing with respect to the own vehicle, and so on for all detected preceding vehicles, and determines the closest preceding vehicle traveling in the same lane as the own vehicle to be the target preceding vehicle to follow. Whether a preceding vehicle is traveling in the same lane as the own vehicle can be determined by the pattern matching processing unit 21a detecting the lane.
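The selection rule above (closest preceding vehicle in the own lane) can be sketched as follows; the data structure and field names are assumptions for illustration, not the patent's implementation:

```python
# Hypothetical sketch of target preceding vehicle selection: among all
# detected preceding vehicles, the closest one travelling in the own
# lane becomes the follow target.  Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class PrecedingVehicle:
    distance_m: float       # stored inter-vehicle distance
    relative_speed: float   # stored relative speed (m/s)
    same_lane: bool         # lane membership from pattern matching (21a)

def choose_target(vehicles):
    """Return the closest preceding vehicle in the own lane, or None."""
    in_lane = [v for v in vehicles if v.same_lane]
    return min(in_lane, key=lambda v: v.distance_m, default=None)
```

A vehicle in another lane is never selected, even if it is nearer, which matches the rule described above.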
  • The sensor 21 scans continuously, regardless of whether the driving support switch is on or off. Thus, not only can the target preceding vehicle be determined quickly when the driving support switch is turned on, but the detection of preceding vehicles can also be used for the rear-end collision prevention function.
  • The presence of a preceding vehicle can also be obtained by short-range wireless communication with that vehicle, which can further increase the accuracy of the position information.
  • The presence of a preceding vehicle can likewise be obtained by so-called road-to-vehicle communication, that is, various information communication between the vehicle and road peripheral equipment such as sensors or antennas installed on the road, carried out either directly or wirelessly through a server in the surrounding area.
  • The driving support system 20 includes a communication device for communicating with the outside. A communication device that acquires the position information and travel conditions of surrounding vehicles by vehicle-to-vehicle or road-to-vehicle communication therefore also serves as detection means for detecting the information provided to the driver.
  • In the following travel mode, the central processing unit 21 controls the traveling control device 25 so that the following distance control unit 21c maintains the inter-vehicle distance to the target preceding vehicle at the set value. That is, if the current actual inter-vehicle distance to the target preceding vehicle is longer than the set inter-vehicle distance, the accelerator control device 27 is controlled to raise the vehicle speed of the automobile 10 and close the gap to the target preceding vehicle; if it is shorter, the brake control device 26 is controlled to lower the vehicle speed of the own vehicle and widen the gap; and if the two are equal, the brake control device 26 and the accelerator control device 27 are controlled so as to maintain the current vehicle speed of the own vehicle.
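The gap-keeping decision described in this paragraph reduces to a three-way comparison. A minimal sketch, assuming a symmetric tolerance band (the tolerance value and names are illustrative, not from the patent):

```python
# Hedged sketch of the following-distance decision described above.
# The tolerance band and all names are illustrative, not from the patent.

def gap_control(actual_gap_m: float, set_gap_m: float,
                tolerance_m: float = 1.0) -> str:
    """Decide which actuator the traveling control device would drive."""
    if actual_gap_m > set_gap_m + tolerance_m:
        return "accelerate"   # gap too long: raise speed, close the gap
    if actual_gap_m < set_gap_m - tolerance_m:
        return "brake"        # gap too short: lower speed, widen the gap
    return "hold"             # gap as set: keep the current speed
```

In practice a real controller would command continuous accelerations (e.g. a PID loop) rather than discrete labels; the three branches here only mirror the longer/shorter/equal cases in the text.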
  • FIG. 9 shows an example of the screen display of the HUD device 4 in the following travel mode.
  • The preceding vehicle 41 being followed is displayed with a marking 42 drawn around it.
  • One preceding vehicle 41 is in the same overtaking lane 43 as the own vehicle, and another preceding vehicle 45 is running further ahead in the adjacent traveling lane 44.
  • Both preceding vehicles 41 and 45 are detected by the sensor 21 and the camera 17, but the target preceding vehicle determination unit 21e determines the preceding vehicle 41, closest to the own vehicle in the same lane 43, to be the target preceding vehicle. In this case, therefore, the preceding vehicle 41 is the detection target to which the driver pays particular attention in the real scenery.
  • The target preceding vehicle determination unit 21e stores data such as the detected inter-vehicle distance, relative speed, and bearing of the target preceding vehicle 41 in the memory, and the sensor 21 and the camera 17 continue scanning so that the target preceding vehicle 41 is tracked automatically while data such as the inter-vehicle distance, relative speed, and bearing continue to be collected and stored.
  • Having determined the target preceding vehicle, the display control unit 3 displays the image of the marking 42 indicating that the preceding vehicle 41 is the vehicle being followed.
  • The marking 42 is a frame pattern that substantially overlaps the rear contour of the preceding vehicle 41; the display control unit 3 reads the frame drawing image from the image memory M and creates a right-eye image RP and a left-eye image LP so that it is displayed as a stereoscopic image.
  • That is, the display control unit 3 creates a right-eye image RP and a left-eye image LP having binocular parallax from the drawn image of the marking 42 and displays them on the HUD device 4.
  • Specifically, the display control unit 3 creates a right-eye image RP and a left-eye image LP having a binocular parallax such that the driver visually recognizes the marking 42 centered on the coordinate position obtained by calculation, at a depth in the depth direction from the front windshield 15 that matches that of the preceding vehicle 41.
  • The driver can therefore visually recognize the marking 42 while the eyes remain focused on the preceding vehicle 41 through the front windshield 15.
  • As a result, the driver sees the target preceding vehicle 41 as if it were actually surrounded by a frame, and can immediately recognize which vehicle is being followed.
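The depth-matched display depends on choosing a binocular disparity consistent with the distance of the preceding vehicle. Under a simple similar-triangles model (an assumption for illustration; the patent gives no formulas), the on-screen offset between the two images can be derived as:

```python
# Illustrative disparity calculation (an assumption; not from the patent).
# Similar-triangles model: eyes separated by e_m, the virtual image plane
# (windshield) at d_screen_m, the desired perceived depth at d_target_m.
# A positive result is an uncrossed disparity (beyond the screen plane).

def disparity_on_screen(e_m: float, d_screen_m: float,
                        d_target_m: float) -> float:
    """On-screen horizontal offset between the left- and right-eye
    images that makes the fused image appear at d_target_m."""
    # similar triangles: disparity = e * (d_target - d_screen) / d_target
    return e_m * (d_target_m - d_screen_m) / d_target_m

# e.g. eyes 65 mm apart, image plane ~1 m away, preceding vehicle 30 m away
offset_m = disparity_on_screen(0.065, 1.0, 30.0)   # about 63 mm
```

When the target depth equals the image-plane distance, the offset is zero and the image is perceived on the windshield itself, which is exactly the flat-display case the invention seeks to avoid.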
  • The image of the marking 42 in the present embodiment is formed as a square thick frame line surrounding the outline of the target preceding vehicle 41, but various shapes and display forms other than a square thick frame line can be used. For example, the vehicle may be indicated by a circular thick frame line, or by an arrow instead of a surrounding frame, and the marking may be displayed in a conspicuous color such as red or orange to draw the driver's attention further.
  • The display control unit 3 also displays stereoscopically, as a numerical drawn image, the traveling speed set by input on the touch panel 30 of the HMI unit 23. Furthermore, when the constant traveling speed is changed, the display control unit 3 acquires the intermediate vehicle speeds from the vehicle speed sensor 25 until the set speed is reached and displays them in sequence.
  • The navigation information displayed by the HUD device 4 is, for example, an arrow along the traveling direction of the automobile 10: a straight arrow is displayed when going straight through an intersection, and a bent arrow corresponding to the direction is displayed when turning left or right.
  • FIG. 10 shows an example of the screen display by the HUD device 4 when guidance for a left turn is given during navigation travel.
  • The display control unit 3 reads a drawn image of an arrow 46 indicating a left turn from the image memory M and displays it stereoscopically on the HUD device 4.
  • Based on the distance to the intersection, the display control unit 3 determines the coordinate position on the Y axis set on the surface of the front windshield 15, and creates and displays a right-eye image RP and a left-eye image LP having a binocular parallax such that the arrow 46 is displayed stereoscopically at that position.
  • The driver can thus visually recognize the arrow 46 while the eyes remain focused on the intersection 47 through the front windshield 15. Because the left-turn arrow is seen superimposed on the intersection 47, the driver can prepare for the left turn. In this case, the intersection 47 is the detection target to which the driver pays particular attention in the real scenery.
  • In a fully automatic driving vehicle in which the driver is not directly involved in driving, the driver can monitor the automatic driving situation by means of the stereoscopic images projected by the HUD device 4.
  • An example of a display by such an automobile image display system according to the present invention during automatic driving will now be described.
  • The navigation device 33 calculates one or more candidate driving routes to the destination input in advance by the occupant, and the automatic driving control unit 21f performs automatic driving along the route determined by the driver's confirmation or approval by controlling the traveling control device 25.
  • During automatic driving, the display control unit 3 creates and displays a right-eye image RP and a left-eye image LP of a drawn image of the arrow 48 representing the traveling course of the own vehicle, as shown in the figure.
  • The display control unit 3 detects the coordinate position of the traveling path 49 of the host vehicle by having the pattern matching processing unit 21a process the image captured by the camera 17, and creates a right-eye image RP and a left-eye image LP with binocular parallax so that the driver visually recognizes the arrow 48 as a stereoscopic image at that coordinate position.
  • Since the driver can visually recognize the arrow 48 while the eyes are focused in the depth direction through the front windshield 15, the driver perceives the arrow 48 as if it were drawn on the road 49 in the real scenery 36 seen through the front windshield 15.
  • Before the automatic driving vehicle 10 deviates from its current driving course or changes lanes, the display control unit 3 informs the driver and the other occupants of the next driving behavior the vehicle is about to take, by displaying it in advance in an intuitive, easy-to-understand manner.
  • Whether and how to take a driving action different from the planned one is determined according to the automatic driving control program executed by the central processing unit 21 to realize the automatic driving control unit 21f.
  • FIG. 12 schematically shows a preview screen displayed on the HUD device 4 prior to a lane change from the overtaking lane 52 to the traveling lane 53 during automatic operation.
  • In addition to the drawn image of the arrow 50 indicating the scheduled driving course, a drawn image of the arrow 51 indicating the course for changing lanes is displayed.
  • The arrow 51 is curved along the trajectory that the host vehicle is predicted to follow automatically, from its position in the overtaking lane 52 of the real scene 36 to a position in the adjacent traveling lane 53.
  • These arrows 50 and 51 are also read from the image memory M and displayed as stereoscopic images, so that the driver can visually recognize them while the eyes are focused on the vehicle ahead. The driver can therefore recognize clearly, intuitively, and concretely that a driving operation to change lanes from the current overtaking lane 52 to the traveling lane 53 is about to be performed.
  • In FIG. 12, a drawn image of an icon 54 representing a virtual own vehicle is displayed together with the drawn images of the arrows 50 and 51.
  • By the movement of the icon 54, the display control unit 3 virtually previews, on the HUD device 4, the progress of the lane change that the vehicle is about to perform.
  • FIGS. 12(a) to 12(d) schematically show this display in chronological order.
  • At first, the icon 54 shows the vehicle as if it were traveling just beyond the front windshield 15.
  • The display control unit 3 creates a right-eye image RP and a left-eye image LP with a binocular parallax such that the position at which the driver visually recognizes the icon 54 appears to be behind the preceding vehicle 55 in the same lane 52, and displays them on the HUD device 4.
  • The display control unit 3 then creates a right-eye image RP and a left-eye image LP for each frame of the moving image, and displays them on the HUD device 4 so that the driver sees the virtual vehicle change lanes from the overtaking lane 52 to the traveling lane 53, as shown in FIGS. 12(b), (c), and (d).
  • The apparent shape of the host vehicle also changes as the lane change progresses; the display control unit 3 reads from the image memory M, and displays, a drawn image of the icon 54 representing the appearance of the host vehicle corresponding to that progress.
  • The simulated lane change by the virtual vehicle icon 54 is displayed at a speed higher than the actual traveling speed of the own vehicle.
  • The icon 54 can be made to disappear immediately after the virtual lane change is completed, or can be kept displayed at the same position on the front windshield 15 for a short time. The moving images from (a) to (d) of FIG. 12 can also be displayed repeatedly.
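The frame-by-frame preview of FIGS. 12(a) to (d) can be sketched as a simple interpolation of the icon's lateral position between the two lanes; the frame count and the lane offset below are assumptions, not values from the patent:

```python
# Hypothetical sketch of the lane-change preview animation: the lateral
# position of the virtual vehicle icon is interpolated from the overtaking
# lane to the traveling lane over a few key frames.  The frame count and
# the 3.5 m lane offset are assumptions, not values from the patent.

def preview_positions(x_from: float, x_to: float, frames: int = 4):
    """Lateral icon positions for key frames such as FIG. 12 (a)-(d)."""
    step = (x_to - x_from) / (frames - 1)
    return [x_from + i * step for i in range(frames)]

# four key frames from the overtaking lane (0.0 m) to the next lane (3.5 m)
key_frames = preview_positions(0.0, 3.5)
```

Because the preview is decoupled from the real vehicle, the frame interval can be chosen freely, which is how the animation can run faster than the actual manoeuvre, or loop.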
  • The automobile image display system according to the present invention can thus provide the driver, by display, with the information necessary for safe driving, whether the driving operation is performed directly by the driver or realized by the driving support system at any level of automatic driving from level 1 to level 4.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)
  • Instrument Panels (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Navigation (AREA)

Abstract

Provided is an image display system for a vehicle with which it is possible for provided information to be displayed on a front windshield in a manner that is easily understandable for a driver. A detecting unit (2) detects the situation around the vehicle using a stereo camera and sensors. A display control unit (3) then uses a HUD device (4) to project toward a front windshield (15) of the vehicle a right-eye image and a left-eye image which form a stereoscopic image of the information detected by the detecting unit (2). If the detected information is an image captured by the stereo camera, a captured right-eye image and left-eye image are projected, and if the detected information is from a sensor, a right-eye image and a left-eye image of a rendered image based on the information are created and projected. An occupant of the vehicle can thus check the information from the stereoscopic image by means of binocular parallax.

Description

Image display system for automobiles
The present invention relates to an image display system for automobiles, used in particular to support the driving of vehicles such as automobiles.
Conventionally, various image display systems for automobiles have been developed to support the safe driving of automobiles.
For example, a vehicle information transmission device is known for conveying dangers around the own vehicle and navigation information to the driver in an intuitive, easy-to-understand way (see, for example, Patent Document 1). This device detects dangerous objects such as pedestrians, bicycles, or other vehicles present in the area ahead using a camera, radar sensor, or the like mounted on the own vehicle, and displays them graphically on a display device in the instrument panel so that the driver recognizes their presence.
However, in the device disclosed in Patent Document 1, the detected dangerous object is displayed on the display device in the instrument panel; to see it, the driver must shift the line of sight, directed forward through the front windshield while driving, to the display device, so recognition is delayed in situations where the danger must be judged instantaneously.
Similarly, a vehicle periphery monitoring device is known that displays the presence and type of a detection object having a high possibility of contact with the host vehicle on an image display device comprising a HUD (head-up display) and notifies the driver (see, for example, Patent Document 2). The type of the detection object, such as pedestrian, bicycle, or animal, is determined from its shape and size, and a mark whose form differs according to the type, together with a rectangular frame surrounding the position of the detection object, is displayed on the HUD.
Patent Document 1: International Publication No. WO 2013/088535
Patent Document 2: JP 2010-108264 A
According to the prior art of Patent Document 2, since the information is displayed on the front windshield, the driver does not need to change the line of sight, which is more advantageous than in Patent Document 1.
However, even when information is displayed on the front windshield, the driver's gaze is directed ahead, beyond the front windshield; to visually recognize information displayed on the windshield surface, the driver must change the focus of the eyes, and the information may not be recognized instantaneously or may be overlooked.
The present invention has been made to solve the above problems, and its object is to provide an automobile image display system capable of displaying information toward the front windshield in a manner that is easy for the driver to understand.
The present invention is an image display system for an automobile that displays information provided to an occupant of the automobile as an image, comprising: detection means for detecting the provided information; and display means for projecting a right-eye image and a left-eye image constituting a stereoscopic image of the provided information toward the front windshield of the automobile, whereby the provided information can be displayed to the occupant at a position in the depth direction beyond the front windshield, on the occupant's forward line of sight.
In one form, the detection means is a camera that captures the situation around the automobile, and the captured image is used as the provided information. This camera may be a stereo camera that captures a right-eye image and a left-eye image.
The detection means may also be a sensor that detects the situation around the automobile or the running state of the automobile; in that case, drawing means converts the information provided by the sensor into a drawn image that can be displayed visually.
By displaying the drawn image in association with a specific detection object in the real scenery visible through the front windshield, the occupant can view the detection object with attention.
Furthermore, when the display means displays a drawn image, projecting a right-eye image and a left-eye image having a binocular parallax such that the drawn image is perceived at substantially the same depth in the depth direction as the detection object seen from the front windshield makes the occupant's visual recognition of the detection object easier and more reliable.
The detection means may also be a navigation device that identifies the current position of the automobile and guides the traveling direction from surrounding map data; a drawn image of this guidance is preferably displayed in association with the detection object.
The detection means may also be a communication device that acquires the provided information about nearby vehicles by vehicle/vehicle communication (hereinafter "vehicle-to-vehicle communication") or road facility/vehicle communication (hereinafter "road-to-vehicle communication"). Vehicle-to-vehicle communication is short-range wireless communication with other vehicles, and road-to-vehicle communication is various information communication with road peripheral equipment. By vehicle-to-vehicle or road-to-vehicle communication, the situation of other vehicles can be obtained accurately.
According to the present invention, since the information provided to the occupant is projected as a right-eye image and a left-eye image, the occupant can visually recognize the provided information as a stereoscopic image. The occupant can therefore view the provided information while the eyes remain focused ahead, beyond the front windshield, and does not need to refocus the eyes to confirm it, so oversights are eliminated.
Brief description of the drawings:
FIG. 1 is a block diagram showing the overall configuration of an automobile image display system according to the present invention.
FIG. 2 is a plan view showing an example of an automobile in which the cameras and sensors of the automobile image display system according to the present invention are arranged.
FIG. 3 is a conceptual diagram of the front windshield and dashboard of an automobile equipped with the automobile image display system according to the present invention, viewed from the driver's seat.
FIG. 4 is a schematic explanatory diagram of image display by the HUD device.
FIG. 5 is a schematic diagram explaining an example of display on the front windshield by the automobile image display system according to the present invention.
FIG. 6 is a schematic diagram explaining another example of display on the front windshield by the automobile image display system according to the present invention.
FIG. 7 is a block diagram showing the overall configuration of a driving support system incorporating the automobile image display system according to the present invention.
FIG. 8 is a flowchart explaining an example of the operation of the automobile image display.
FIG. 9 is a schematic diagram explaining an example of display on the front windshield in the driving support system.
FIG. 10 is a schematic diagram explaining another example of display on the front windshield in the driving support system.
FIG. 11 is a schematic diagram explaining yet another example of display on the front windshield in the driving support system.
FIG. 12 is a schematic diagram explaining, in chronological order, the lane change of the virtual vehicle displayed on the front windshield in the driving support system.
FIG. 1 is a block diagram showing the schematic configuration of an automobile image display system according to the present invention. As FIG. 1 shows, the automobile image display system 1 comprises a detection unit 2, a display control unit 3, and a head-up display (HUD) device 4 serving as the display means.
The detection unit 2 includes a camera unit 2A and a sensor unit 2B that monitor the situation around the automobile. The images captured by the camera unit 2A and the detection information from the sensor unit 2B become the information provided to the occupants of the automobile. The detection unit 2 is therefore the detection means that detects the provided information.
The camera unit 2A includes, for example, three cameras 17, 18, and 19, arranged as shown in FIG. 2, for photographing the situation around the automobile.
The camera 17 is installed at the top of the front windshield 15 and captures the forward view of the automobile 10. The camera 18 is arranged on the back windshield and captures the rear view of the automobile 10.
When viewing the surroundings from the inside of the automobile 10, there are structures which, because of the construction of the vehicle, block the line of sight from inside and form blind spots in which the outside situation cannot be seen, such as the pillars supporting the roof of the vehicle body. In the example of FIG. 2, a camera 19 is arranged on the outside of the A pillar 16, which together with the roof of the vehicle body holds the front windshield 15, in order to secure a view of the blind spot created by the A pillar 16.
The sensor unit 2B includes sensors 20 and 21 for detecting the presence of automobiles, bicycles, people, animals, and the like (hereinafter "detection objects") around the automobile. The sensors 20 and 21 are, for example, radar sensors. As shown in FIG. 3, the sensors 20 are attached to the lower parts of the left and right door mirrors 35 and detect the presence of detection objects to the left and right rear. The sensor 21 is attached to the center of the front of the body of the automobile 10 and detects the presence of detection objects ahead. As the radar sensor, a millimeter-wave radar, microwave radar, laser radar, infrared sensor, ultrasonic sensor, or the like is used.
Furthermore, the steering angle sensor 24, the vehicle speed sensor 25, the gear position sensor 26, and the like provided in the traveling system of the automobile 10 also form part of the detection unit 2 here.
The display control unit 3 displays the information acquired by the detection unit 2 on the HUD device 4 as a stereoscopic image. The images displayed are of two kinds: images captured by the cameras 17, 18, and 19, and drawn images that represent the detection information from the sensors 20 and 21, the steering angle sensor 24, and the vehicle speed sensor 25 visually, as pictures such as figures and icons, or as letters and numerals. The display control unit 3 includes an image memory M in which these various drawn images are stored in advance.
To display captured images stereoscopically, each of the cameras 17, 18, and 19 is a stereo camera in which binocular lenses corresponding to the left and right human eyes are housed in a single body, and the display control unit 3 outputs the right-eye and left-eye images input from the stereo camera to the HUD device 4. The cameras 17, 18, and 19 are not limited to stereo cameras; a monocular camera capable of producing stereo images by computer processing can also be used.
To display a drawn image, the display control unit 3 selects, from among the images stored in advance in the image memory M, a drawn image of pictures, letters, or numerals suitable for visually displaying the detection information from the sensors 20 and 21, the steering angle sensor 24, the vehicle speed sensor 25, and the gear position sensor 26. It then creates a right-eye image and a left-eye image from the selected drawn image and outputs them to the HUD device 4. In this case, therefore, the display control unit 3 functions as the drawing means that converts the provided information into a drawn image that can be displayed visually.
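The selection of a pre-stored drawn image can be pictured as a simple lookup keyed by the kind of detection information; the table below is purely illustrative and does not reflect the patent's actual image memory M:

```python
# Purely illustrative lookup standing in for the image memory M: each
# kind of detection information maps to a pre-stored drawn image.  The
# keys and image names are hypothetical, not taken from the patent.

IMAGE_MEMORY_M = {
    "target_vehicle": "frame_marking",    # frame around the followed car
    "left_turn": "left_arrow",
    "right_turn": "right_arrow",
    "vehicle_speed": "numeric_readout",
}

def select_drawn_image(info_type: str) -> str:
    """Pick the drawn image matching the detection information type."""
    return IMAGE_MEMORY_M[info_type]
```

Right-eye and left-eye variants would then be generated from the selected image by applying the binocular parallax for the desired display depth.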
 ここでのHUD装置4には、従来から知られた構造や仕組みのものが使用され、例えば、図4で示すように、右目用画像表示部6と、左目用画像表示部7と、ミラー8とを備える構成となっている。そして、HUD装置4は、ダッシュボード39の上面に配置されて、フロントウインドシールド15に向けて画像を投影する。また、HUD装置4は、ダッシュボード39内に組み込むことや、運転席の天井部からフロントウインドシールド15に画像を投影してもよい。 Here, the HUD device 4 has a conventionally known structure and mechanism. For example, as shown in FIG. 4, a right-eye image display unit 6, a left-eye image display unit 7, and a mirror 8. It is the composition provided with. The HUD device 4 is disposed on the upper surface of the dashboard 39 and projects an image toward the front windshield 15. Further, the HUD device 4 may be incorporated in the dashboard 39 or may project an image onto the front windshield 15 from the ceiling of the driver's seat.
 HUD装置4の右目用画像表示部6及び左目用画像表示部7は、それぞれ右目用画像RP及び左目用画像LPを表示するディスプレイであり、例えば、ELD(エレクトロ・ルミネッセンス・ディスプレイ)、蛍光表示管、FED(フィールド・エミッション・ディスプレイ)、LCD(液晶ディスプレイ)等の表示デバイスが用いられる。 The right-eye image display unit 6 and the left-eye image display unit 7 of the HUD device 4 are displays for displaying the right-eye image RP and the left-eye image LP, respectively, for example, ELD (electroluminescence display), fluorescent display tube Display devices such as FED (Field Emission Display) and LCD (Liquid Crystal Display) are used.
 そして、HUD装置4のミラー8は、右目用画像表示部6及び左目用画像表示部7と対向するように配置されて、右目用画像表示部6及び左目用画像表示部7がそれぞれ表示している右目用画像RPと左目用画像LPとを、フロントウインドシールド15に向けて反射する。また、ミラー8は、右目用画像表示部6や左目用画像表示部7が表示している右目用画像RPと左目用画像LPを、ミラー形状によって予め決定される倍率に拡大して投影する。 The mirror 8 of the HUD device 4 is disposed so as to face the right-eye image display unit 6 and the left-eye image display unit 7, and the right-eye image display unit 6 and the left-eye image display unit 7 respectively display the images. The right-eye image RP and the left-eye image LP are reflected toward the front windshield 15. In addition, the mirror 8 projects the right-eye image RP and the left-eye image LP displayed by the right-eye image display unit 6 and the left-eye image display unit 7 at a magnification determined in advance by the mirror shape.
 The front windshield 15 is made of laminated glass, IR-cut glass, UV-cut glass, or the like, and is formed into a curved surface that bends vertically and horizontally to match the outer shape of the automobile 10. The front windshield 15 transmits the driver's line of sight while reflecting the right-eye image RP and the left-eye image LP projected from the HUD device 4 back along the line-of-sight direction.
 In the vehicle image display system 1 configured as described above, the HUD device 4 projects the right-eye image RP and the left-eye image LP output from the display control unit 3, and the front windshield 15 reflects them to the driver's right and left eyes, respectively.
 The driver can thereby perceive, through binocular parallax, a stereoscopic image TD located on the line of sight in the depth direction ahead of the automobile, beyond the front windshield 15. Thus, while viewing the scene ahead of the automobile 10 through the front windshield 15, the driver can recognize the provided information detected by the detection unit 2 as a stereoscopic version of a captured or rendered image.
 If the human eye is focused several meters ahead, even an object at infinity does not go greatly out of focus. Consequently, while the driver is focusing in the depth direction beyond the front windshield 15 to check the view ahead of the automobile 10, an image projected directly onto the front windshield 15 is likely to be overlooked. By presenting the provided information stereoscopically, however, it becomes visible while the driver remains focused in the depth direction beyond the front windshield 15, so the driver can perceive it easily and reliably without overlooking it.
 In the vehicle image display system 1 configured as above, when the steering angle sensor 24 detects that the driver is operating the steering wheel 22 to turn the automobile 10 right, for example, the display control unit 3 can display the image captured by the camera 18 at that moment stereoscopically on the HUD device 4. As shown in FIG. 4, the driver can then view the stereoscopic image TD while keeping the eyes focused in the depth direction ahead through the front windshield 15, recognize a pedestrian crossing on the right side of the automobile 10, and apply the brakes. Moreover, because the blind-spot scene is shown as a stereoscopic image, there is no risk of confusing it with the actual scene ahead of the automobile seen through the front windshield 15, and the driver can grasp the situation accurately.
 Since the depth at which the driver perceives the stereoscopic image beyond the front windshield 15 is determined by the binocular parallax, the display control unit 3 can, as necessary, process the right-eye image RP and the left-eye image LP captured by the camera 18 so that the stereoscopic image is reproduced at the intended depth.
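The relationship between the intended depth and the required parallax can be sketched geometrically. The following is a minimal illustration, not part of the patent disclosure: a virtual point at depth `depth_z` beyond the viewer is projected onto an image plane at distance `plane_s` from the eyes, which are separated by an interpupillary distance `ipd_b`. All names and numeric values are assumptions for the example.

```python
def parallax_offsets(depth_z, plane_s, ipd_b=0.065):
    """Horizontal offsets (metres, on the projection plane) of the left-
    and right-eye image points needed so that a point drawn on a plane at
    distance plane_s is fused at depth depth_z (>= plane_s).

    By similar triangles, a point at (0, depth_z) seen from an eye at
    (+/- ipd_b / 2, 0) crosses the plane y = plane_s at
    x = -/+ (ipd_b / 2) * (1 - plane_s / depth_z).
    """
    half = (ipd_b / 2.0) * (1.0 - plane_s / depth_z)
    return -half, +half  # (left-eye x, right-eye x)

# A point meant to be fused 10 m ahead, drawn on a plane 1 m from the eyes:
left_x, right_x = parallax_offsets(depth_z=10.0, plane_s=1.0)
disparity = right_x - left_x  # total on-plane separation of the two images
```

The greater the intended depth, the larger the separation between the two projected image points, which is the quantity the display control unit 3 would adjust when processing the image pair.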
 Similarly, when the automobile 10 is to be reversed and the gear position sensor 26 detects that the reverse gear has been engaged, the display control unit 3 projects the right-eye image RP and the left-eye image LP captured by the camera 18 through the HUD device 4. The driver can thus notice a pedestrian or other object behind the vehicle while still viewing the scene ahead of the automobile 10 through the front windshield 15.
 When the display control unit 3 acquires the current vehicle speed from the vehicle speed sensor 25, it reads numeral and character images for displaying that value from the image memory M, creates a right-eye image RP and a left-eye image LP from them, and displays the rendered speed indication 34 shown in FIG. 5 ("60 km/h" in the illustrated example). The display control unit 3 creates the right-eye image RP and the left-eye image LP of this rendered image with the binocular parallax corresponding to its intended depth beyond the front windshield 15, so that a stereoscopic image is perceived. Besides the speed indication, a rendered image representing the gear state detected by the gear position sensor 26 is likewise displayed as a stereoscopic image based on binocular parallax.
 When the steering angle sensor 24 detects that the driver is turning the steering wheel 22 right or left to change lanes, the display control unit 3 checks for a detection object to the rear in that direction using information from the sensor 20. If the detected object appears larger than a predetermined size, the sensor 20 determines that the object is approaching and outputs a detection signal for it to the display control unit 3.
 As shown in FIG. 6, the display control unit 3 then reads from the image memory M a rendered image of a prohibition mark 13 indicating that the lane change is dangerous, and displays it stereoscopically on the HUD device 4. In the illustrated example, another vehicle is approaching from the left rear while the driver attempts to change into the left lane, so the prohibition mark 13 is displayed on the left side.
 Because the prohibition mark 13 is displayed stereoscopically, the driver can recognize it while gazing through the front windshield 15 with the eyes focused in the depth direction ahead, and can thus reliably perceive the danger of the lane change. A camera may also be arranged in place of the sensor 20 that detects objects to the left and right rear, in which case the HUD device 4 can display the other vehicle behind, as captured by the camera, as a stereoscopic image.
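The decision described above, showing the prohibition mark when a sufficiently large (hence approaching) object is detected behind on the steering side, can be sketched as follows. The function name, the detection format, and the 0.2 size threshold are illustrative assumptions, not values from the patent.

```python
def should_show_prohibition(steer_side, detections, min_size=0.2):
    """Return True when a rear object on the steering side exceeds the
    size threshold and is therefore treated as 'approaching'.

    steer_side: 'left' or 'right'.
    detections: list of (side, apparent_size) pairs, where apparent_size
    is the fraction of the rear sensor's view the object fills.
    """
    return any(side == steer_side and size >= min_size
               for side, size in detections)

# Another vehicle looming large at the left rear while steering left:
warn = should_show_prohibition('left', [('left', 0.35), ('right', 0.05)])
```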
 The vehicle image display system 1 according to the present invention described above is effectively applied to an automobile 10 equipped with a driving support system for assisting safe driving. An embodiment thereof is described below.
 First, regarding driving support systems: in Japan, their degree of automation is defined in four stages, level 1 through level 4. At level 1, the vehicle performs one of acceleration, steering, or braking; this is called a safe driving support system. At level 2, the vehicle performs two or more of acceleration, steering, and braking simultaneously; this is called a semi-automated driving system. At level 3, the vehicle performs all of acceleration, steering, and braking, with the driver responding only in emergencies; this is also called a semi-automated driving system. At level 4, acceleration, steering, and braking are all performed without the driver, who is not involved at all; this is called a fully automated driving system. In this specification, "automated driving" refers to driving at level 3 or level 4.
 FIG. 7 schematically shows the overall configuration of a driving support system 20 incorporating the vehicle image display system 1 of the present embodiment.
 The driving support system 20 includes a central processing unit 21 comprising a CPU, ROM, and RAM, to which the detection unit 2 described with reference to FIG. 1, an HMI (Human Machine Interface) unit 23, a travel control device 25, and a navigation device 33 are each connected.
 The HMI unit 23 comprises, in addition to the HUD device 4 described with reference to FIG. 1, a touch panel 30 and a monitor 31 (shown in FIG. 3). The monitor 31 displays, for example, the rear view captured by the camera 18 when the automobile 10 reverses. The touch panel 30 is implemented in software on the screen of the monitor 31.
 The travel control device 25 includes a brake control device 26 and an accelerator control device 27. The brake control device 26 controls a brake actuator (not shown) to decelerate the automobile 10 automatically. The accelerator control device 27 controls a throttle actuator (not shown) that adjusts the throttle opening, to accelerate the automobile 10 automatically.
 The navigation device 33 reads map data for the area around the automobile 10 from a map database, based on GPS information received from a GPS (Global Positioning System) receiver 32.
 In the central processing unit 21, the CPU executes a control program stored in the ROM, thereby realizing the functions of a pattern matching processing unit 21a, a stop control unit 21b, a follow-up travel control unit 21c, a constant-speed travel control unit 21d, a target preceding vehicle determination unit 21e, an automated driving control unit 21f, and the display control unit 3 described with reference to FIG. 1.
 The pattern matching processing unit 21a applies pattern matching to the images sent from the cameras 17, 18, and 19 to detect the contours of pedestrians, bicycles, motorcycles, and the like, thereby determining their presence. For this processing, the ROM contains a pattern data file F in which the contours serving as image features of these detection targets are registered in advance. The ROM further contains the aforementioned image memory M, in which the rendered images that the display control unit 3 shows on the HUD device 4, representing information visually as figures, icons, and other symbols or as letters and numerals, are stored in advance.
 Simultaneously with the pattern matching, the pattern matching processing unit 21a also detects lane lines, marked by white or yellow lines, based on the luminance information of the image.
 The stop control unit 21b is stop control means which, when the pattern matching processing unit 21a detects a detection object such as a pedestrian, bicycle, or motorcycle in the image, calculates the distance to the object from the parallax between the left and right images of the camera 17 and controls stopping of the automobile 10 according to that distance.
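The distance computation from the parallax between the left and right images follows the standard stereo triangulation relation Z = f·B/d for a rectified camera pair. The sketch below is a generic illustration of that relation, not the patent's implementation; the focal length, baseline, and disparity values are assumed for the example.

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Distance (m) to a point from its disparity between the left and
    right images of a stereo camera: Z = f * B / d (rectified pair,
    focal length in pixels, baseline in metres, disparity in pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# e.g. 700 px focal length, 0.12 m baseline, 14 px disparity:
z = stereo_distance(700.0, 0.12, 14.0)  # 6.0 m
```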
 In the follow-up travel mode, the follow-up travel control unit 21c calculates the distance to the preceding vehicle so that the automobile 10 follows it, and outputs to the travel control device 25 the result of comparing that distance with a preset inter-vehicle distance.
 In the constant-speed travel mode, the constant-speed travel control unit 21d calculates the difference between the current speed and a preset vehicle speed and outputs it to the travel control device 25, so that the host vehicle travels at constant speed.
 The target preceding vehicle determination unit 21e determines the preceding vehicle to be targeted in the follow-up travel mode.
 The automated driving control unit 21f performs control for driving the automobile 10 automatically at level 3 or level 4 described above.
 A specific operation of the driving support system 20 described above, in which the automobile 10 stops automatically when the presence of a person is detected during a right turn, is now explained with reference to the flowchart of FIG. 8. This automatic stop is applicable not only to all of the driving support levels 1 through 4 described above, but also to safety support during manual driving by the driver.
 The central processing unit 21 determines whether the driver has performed a steering operation to turn the automobile 10 right (step S1). Here the central processing unit 21 uses the steering angle sensor 24 to detect whether the steering wheel 22 has been rotated. If no right-turn steering operation has been performed ("NO"), the operation of displaying an image of the blind-spot scene is not carried out.
 If a right-turn steering operation has been performed ("YES"), the pattern matching processing unit 21a of the central processing unit 21 acquires the image being sent from the camera 19 at that moment (step S2), and the right-eye image RP and the left-eye image LP captured by the camera 19 are then displayed by the HUD device 4. Because the image is displayed stereoscopically by binocular parallax, the driver perceives it at a position in the depth direction beyond the front windshield 15, on the driver's line of sight toward the front of the automobile (step S3).
 The central processing unit 21 then performs pattern matching between the acquired image and the contours of detection targets registered in the pattern data file F, determining the presence of a pedestrian, bicycle, motorcycle, or the like by detecting whether the image contains its image features (step S4). If no detection object is contained ("NO"), the operation of displaying the blind-spot scene ends.
 If the image does contain a detection object ("YES" in step S4), the central processing unit 21 has the stop control unit 21b calculate the distance to the object from the parallax between the left and right images of the camera 19 and determine whether that distance is within a predetermined range (step S5). If the distance to the object exceeds, for example, a range of 10 meters ("NO" in step S5), the operation of displaying the blind-spot scene ends.
 If, on the other hand, the distance to the detection object is within 10 meters ("YES" in step S5), the central processing unit 21 performs control to stop the automobile 10 via the brake control device 26 (step S6). The central processing unit 21 then repeats the processing from step S2; when the detection object and any other detection objects are no longer present ("NO" in step S4), or are present but farther than 10 meters away ("NO" in step S5), the operation of displaying the blind-spot scene ends. In that case, the stop control unit 21b releases the stopped state imposed on the automobile 10 by the brake control device 26.
 The automobile 10 is thus kept stopped while the camera 19 detects a detection object within the predetermined distance range. According to this driving support system 20, the automatic stop is controlled in accordance with conditions in the blind-spot region that the driver's eyes cannot reach, so the driver's safe driving during a right turn can be reliably supported.
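The flow of steps S1 through S6 above can be condensed into a single control pass. The sketch below is a simplified illustration under assumed interfaces (the function and callback names are invented for the example); only the 10 m threshold is carried over from the text.

```python
STOP_RANGE_M = 10.0  # the predetermined range from step S5

def right_turn_pass(steering_right, image, match_targets, distance_of, brake):
    """One pass of the FIG. 8 flow. Returns True while the car should
    remain stopped.

    match_targets(image) yields detected objects (steps S2, S4);
    distance_of(obj) is the stereo-parallax distance to an object (S5);
    brake(flag) applies (True) or releases (False) the automatic stop (S6).
    """
    if not steering_right:                       # S1: no right turn
        brake(False)
        return False
    objects = match_targets(image)               # S2 + S4: pattern matching
    near = [o for o in objects if distance_of(o) <= STOP_RANGE_M]  # S5
    brake(bool(near))                            # S6, or release the stop
    return bool(near)

# Toy usage: one pedestrian 6 m away during a right turn keeps the car stopped.
state = {}
stopped = right_turn_pass(True, ["ped"], lambda img: img,
                          lambda o: 6.0, lambda f: state.update(brake=f))
```

In the actual system this pass would be repeated, matching the loop back to step S2 described above.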
 In the driving support system 20 integrating the vehicle image display system 1 according to the present invention, vehicle structures other than the pillar portions typified by the A-pillar 6, such as other frame and outer members, may also create blind spots from inside the vehicle; cameras can be mounted on their surfaces and the stereo images displayed on the front windshield 15 by the HUD device 4.
 Besides the image of the blind spot created by the A-pillar 6 described above, the driving support system 20 displays various kinds of information to the driver as stereoscopic images, using captured or rendered images. Several display examples are described below.
[Display during constant-speed travel and inter-vehicle distance control]
 Constant-speed travel and inter-vehicle distance control of the automobile 10 by the driving support system 20 is now described. In the present embodiment this control is performed, for example, as follows, though it is of course not limited thereto.
 The central processing unit 21 starts constant-speed travel and inter-vehicle distance control of the host vehicle when the driver turns on a driving support switch (not shown). The vehicle speed for constant-speed travel and the inter-vehicle distance to the preceding vehicle may be set by entering desired values on the touch panel 30 of the HMI unit 23 immediately before the driver turns on the driving support switch, or the previous settings stored in a memory (not shown) of the driving support system 20 may be used as they are. Constant-speed travel and inter-vehicle distance control switches between the follow-up travel mode, used while a preceding vehicle is detected, and the constant-speed travel mode, used while none is detected.
 First, the target preceding vehicle determination unit 21e detects all preceding vehicles using the front camera 17 and the sensor 21. The target preceding vehicle determination unit 21e detects and stores the inter-vehicle distance, relative speed, direction relative to the host vehicle, and so on for all detected preceding vehicles, and among those traveling in the same lane as the host vehicle, selects the one closest to the host vehicle as the target preceding vehicle to be followed. Whether a preceding vehicle is traveling in the same lane as the host vehicle can be determined from the lane detection performed by the pattern matching processing unit 21a.
 The sensor 21 is preferably kept scanning at all times, regardless of whether the driving support switch is on or off. This not only allows the target preceding vehicle to be determined quickly when the driving support switch is turned on, but also allows the detection of preceding vehicles to be used for a rear-end collision prevention function.
 The presence of a preceding vehicle can also be obtained through short-range wireless communication with that vehicle. By adding the information obtained through such so-called vehicle-to-vehicle communication to the detection results of the sensor 21 and the front camera 17, the preceding vehicle detection unit 11 can further improve the accuracy of the position information on the preceding vehicle.
 The presence of a preceding vehicle can likewise be obtained through so-called road-to-vehicle communication, that is, various kinds of information communication between the vehicle and roadside equipment, performed wirelessly with communication devices such as sensors and antennas installed along the road, either directly or via a server for the surrounding area. For such vehicle-to-vehicle and/or road-to-vehicle communication, the driving support system 20 includes a communication device for communicating with the outside. A communication device that acquires the position information and travel conditions of surrounding vehicles through vehicle-to-vehicle or road-to-vehicle communication therefore also serves as detection means for detecting information to be provided to the driver.
 In the follow-up travel mode, the central processing unit 21 controls the travel control device 25 through the follow-up travel control unit 21c so as to keep the distance to the determined target preceding vehicle at the set inter-vehicle distance. That is, if the current actual distance to the target preceding vehicle is longer than the set inter-vehicle distance, it controls the accelerator control device 27 to raise the speed of the automobile 10 and close the gap; if shorter, it controls the brake control device 26 to lower the host vehicle's speed and widen the gap; and if equal, it controls the brake control device 26 and the accelerator control device 27 to maintain the host vehicle's current speed.
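The three-way decision just described can be sketched as a small mapping from the measured gap to an actuator command. The function name and the small tolerance band used to implement the "equal" case are illustrative assumptions, not values from the patent.

```python
def follow_mode_command(actual_gap_m, set_gap_m, tolerance_m=0.5):
    """Map the measured gap to the target preceding vehicle onto a
    command: 'accelerate' when the gap is longer than the set
    inter-vehicle distance, 'brake' when shorter, 'hold' when equal
    (equality is taken within a small assumed tolerance)."""
    if actual_gap_m > set_gap_m + tolerance_m:
        return "accelerate"   # close the gap (accelerator control device 27)
    if actual_gap_m < set_gap_m - tolerance_m:
        return "brake"        # widen the gap (brake control device 26)
    return "hold"             # maintain the current speed

# With a 40 m set distance:
cmd = follow_mode_command(45.0, 40.0)  # gap too long -> "accelerate"
```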
 FIG. 9 shows an example of the screen presented by the HUD device 4 in the follow-up travel mode. In this example, the preceding vehicle 41 being followed is indicated by a rendered marking 42. Ahead of the host vehicle on the road in the real scene 40 visible through the front windshield 15, one preceding vehicle 41 travels in the same overtaking lane 43 as the host vehicle, and another preceding vehicle 45 travels in the adjacent driving lane 44. In this scene, both preceding vehicles 41 and 45 are detected by the sensor 21 and the camera 17, but the target preceding vehicle determination unit 21e selects the preceding vehicle 41, closest to the host vehicle in the same lane 43, as the target. In this case, therefore, the preceding vehicle 41 is the detection object in the real scene to which the driver pays particular attention.
 At this time, the target preceding vehicle determination unit 21e stores data such as the detected inter-vehicle distance, relative speed, and direction of the target preceding vehicle 41 in memory and, through continuous scanning by the sensor 21 and the camera 17, automatically tracks the target preceding vehicle 41 while continuously collecting and storing such data.
 Once the target preceding vehicle has been determined, the display control unit 3 displays an image of the marking 42 indicating that the preceding vehicle 41 is the vehicle being followed. As shown in FIG. 9, the marking 42 is a frame pattern that roughly overlaps the rear contour of the preceding vehicle 41; the display control unit 3 reads the rendered frame image from the image memory M and creates the right-eye image RP and the left-eye image LP so that it is displayed as a stereoscopic image.
 The processing at this point is as follows. The region in which the field of view of the camera 17, fixedly installed at the top of the front windshield 15, overlaps the surface of the front windshield 15 is constant, and this overlap region is preset in the display control unit 3. When the target preceding vehicle is determined, the display control unit 3 therefore calculates, with respect to X-Y axes passing through the center point O of the front windshield 15, the coordinate position of the center point P of the contour of the preceding vehicle 41 within the image captured by the camera. Since the size of the preceding vehicle 41 can be judged from the proportion of the image its contour occupies, a rendered marking 42 of a size matching it is read from the image memory M.
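The coordinate and size computation can be sketched as follows: the bounding box of the preceding vehicle's contour in the camera image is mapped into coordinates centred on the image centre (standing in for the windshield centre point O), and the fraction of the image the box occupies selects a marking size. The image dimensions and size bins are illustrative assumptions.

```python
def marking_placement(bbox, img_w, img_h):
    """bbox = (x0, y0, x1, y1) of the vehicle contour, in image pixels.
    Returns the marking centre relative to the image centre (a stand-in
    for centre point O) and a coarse size class chosen from the fraction
    of the image the contour occupies."""
    x0, y0, x1, y1 = bbox
    cx = (x0 + x1) / 2.0 - img_w / 2.0   # X offset of centre point P from O
    cy = (y0 + y1) / 2.0 - img_h / 2.0   # Y offset of centre point P from O
    frac = ((x1 - x0) * (y1 - y0)) / float(img_w * img_h)
    size = "large" if frac > 0.10 else "medium" if frac > 0.02 else "small"
    return (cx, cy), size

# An 80x60 px contour just above the centre of a 1280x720 image:
centre, size = marking_placement((600, 300, 680, 360), 1280, 720)
```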
 The display control unit 3 then creates a right-eye image RP and a left-eye image LP having binocular parallax from the rendered image of the marking 42, and displays them on the HUD device 4. The parallax is chosen so that the driver perceives the marking 42 centred at the calculated coordinate position and at a depth, beyond the front windshield 15, corresponding to the distance to the preceding vehicle 41.
 The driver can therefore see the marking 42 at the same time as the preceding vehicle 41 while the eyes are focused on that vehicle through the front windshield 15. Because the target preceding vehicle appears as if actually enclosed in a frame, the driver can reliably identify which vehicle the host vehicle is tracking.
 In the present embodiment, the image of the marking 42 is formed as a thick square frame surrounding the contour of the target preceding vehicle 41, as illustrated, but various shapes and presentations other than a thick square frame may be used. For example, a thick circular frame, or an arrow pointing at the vehicle instead of a frame, may be used; the marking may further be displayed in a conspicuous color such as red or orange to draw the driver's attention more strongly.
 During constant-speed travel under the control of the constant-speed travel control unit 21d, the display control unit 3 stereoscopically displays, as a numeric drawing image, the travel speed set through input on the touch panel 30 of the HMI unit 23. Further, when the constant travel speed is changed, the display control unit 3 acquires the intermediate vehicle speeds from the vehicle speed sensor 25 until the set speed is reached, and displays them successively.
[Navigation display]
 When route guidance to a destination has been set for the automobile 10 by the navigation device 33, the HUD device 4 stereoscopically displays the navigation information.
 The navigation information displayed by the HUD device 4 is, for example, an arrow along the traveling direction of the automobile 10: a straight arrow when proceeding straight through an intersection, and an arrow bent in the corresponding direction when turning left or right.
 FIG. 10 shows an example of the screen display produced by the HUD device 4 when guiding a left turn along the route during navigation travel. The display control unit 3 reads the drawing image of an arrow 46 indicating a left turn from the image memory M and displays it stereoscopically on the HUD device 4. At this time, when distance information to the intersection 47 at which the left turn is to be made is transmitted from the navigation device 33, the display control unit 3 determines, based on that distance, the coordinate position on the Y axis set on the surface of the front windshield 15, and creates and displays a right-eye image RP and a left-eye image LP having binocular parallax such that the arrow 46 is displayed stereoscopically at that position.
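One simple way to realize the mapping from the remaining distance to the intersection onto a Y-axis position on the windshield plane is a perspective-style interpolation. The embodiment leaves the mapping unspecified, so the function below, including its parameter names and the 1/d rate, is purely an assumed sketch.

```python
def arrow_y_position(distance, max_dist=200.0, y_near=-0.3, y_far=0.1):
    """Map the remaining distance to the guidance intersection onto a
    Y coordinate on the windshield plane: a far intersection sits near
    the horizon (y_far), and the arrow slides down toward y_near as the
    intersection approaches.  Using 1/d gives a perspective-like rate
    of descent rather than a linear one."""
    d = max(min(distance, max_dist), 1.0)          # clamp to a sane range
    t = (1.0 / d - 1.0 / max_dist) / (1.0 - 1.0 / max_dist)
    return y_far + t * (y_near - y_far)
```

The resulting Y coordinate would then feed the parallax-image generation so the arrow 46 appears at the intersection's depth.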
 Consequently, the driver can also see the arrow 46 while keeping the eyes focused on the intersection 47 through the front windshield 15. Since the left-turn arrow is seen superimposed on the intersection 47, the driver can prepare for the left turn. In this case, therefore, the intersection 47 is the detection object in the real scenery to which the driver pays particular attention.
[Display during automatic driving]
 In a fully self-driving vehicle in which the driver is not directly involved in driving, the image display system for an automobile according to the present invention allows the driver to monitor the automatic driving situation through the stereoscopic images projected by the HUD device 4. Examples of display by the system during such automatic driving are described below.
 Automatic driving is performed as follows: the navigation device 33 computes one or more candidate travel routes to the destination entered in advance by an occupant, and the automatic driving control unit 21f controls the travel control device 25 along the travel route decided by the driver's confirmation or approval.
 During this automatic driving, as shown in FIG. 11, the display control unit 3 creates a right-eye image RP and a left-eye image LP of a drawing image comprising an arrow 48 representing the travel course of the host vehicle, and displays them on the HUD device 4. In this case, the display control unit 3 detects the coordinate position of the travel path 49 of the host vehicle through processing, by the pattern matching processing unit 21a, of the image captured by the camera 17, and creates the right-eye image RP and left-eye image LP such that the driver perceives the arrow 48 as a stereoscopic image at that coordinate position by binocular parallax.
 Consequently, the driver can see the arrow 48 while keeping the eyes focused in the depth direction through the front windshield 15, so the arrow 48 appears to the driver as if it were drawn on the road 49 in the real scenery 36 seen through the front windshield 15.
 Further, before the automatically driven automobile 10 departs from the current travel course or performs a travel action such as a lane change, the display control unit 3 gives advance notice of the next travel action about to be taken by displaying it to the driver and other occupants in an intuitively and instantly understandable manner. In the present embodiment, the decision as to whether, and how, a travel action different from the planned one should be taken is made in accordance with the automatic driving control program that the central processing unit 21 executes to realize the automatic driving control unit 21f.
 FIG. 12 schematically shows the advance-notice screens displayed on the HUD device 4 before the host vehicle, during automatic driving, changes lanes from the overtaking lane 52 to the travel lane 53. In this display, in addition to the drawing image of an arrow 50 indicating the planned travel course, a drawing image of an arrow 51 indicating the lane-change travel course is displayed.
 The arrow 51 has a shape that curves along the trajectory the host vehicle is predicted to follow automatically, from a position on the overtaking lane of the real scenery 36 to a position on the adjacent travel lane to which it is moving. These arrows 50 and 51 are also read from the image memory M and displayed as stereoscopic images, so the driver can see the arrows while keeping the eyes focused on the vehicle ahead. The driver can therefore recognize more clearly, intuitively, and concretely that a driving operation to change lanes from the overtaking lane 52 currently being traveled to the travel lane 53 is about to be performed.
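The curved shape of the arrow 51, running from a point on the current lane to a point on the adjacent lane, can be illustrated by sampling a smooth lateral-transition curve. The smoothstep easing and all names below are assumptions chosen for illustration; the embodiment only states that the arrow curves along the predicted trajectory.

```python
def lane_change_trajectory(x0, x1, length, n=20):
    """Sample n points of a trajectory that advances `length` meters
    while shifting laterally from x0 (current lane center) to x1
    (target lane center).  The smoothstep ease has zero slope at both
    ends, so the curve leaves and joins the lanes tangentially."""
    pts = []
    for i in range(n):
        s = i / float(n - 1)              # 0..1 along the maneuver
        ease = s * s * (3.0 - 2.0 * s)    # smoothstep easing
        pts.append((s * length, x0 + ease * (x1 - x0)))
    return pts
```

Projecting these points onto the windshield plane would yield the curved outline along which the arrow 51 is drawn.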
 Further, in the example of FIG. 12, a drawing image of an icon 54 representing a virtual host vehicle is displayed together with the drawing images of the arrows 50 and 51. The display control unit 3 then virtually displays on the HUD device 4, through the movement of the icon 54, the progress of the lane change the host vehicle is about to perform. FIGS. 12(a) to 12(d) schematically show this display in chronological order.
 First, in FIG. 12(a), the icon 54 shows a state as if the host vehicle were traveling immediately in front of the front windshield 15. At this time, the display control unit 3 creates the right-eye image RP and left-eye image LP with binocular parallax such that the position at which the driver sees the icon 54 appears to be behind the preceding vehicle 55 in the same lane 52, and displays them on the HUD device 4.
 Then, as shown successively in the virtual images of FIGS. 12(b), (c), and (d), the display control unit 3 creates the right-eye image RP and left-eye image LP so that the driver can see, as stereoscopic images, the icon 54 changing lanes from the overtaking lane 52 to the travel lane 53 and moving away, and displays them on the HUD device 4. As the lane change progresses, the apparent external shape of the host vehicle also changes; the display control unit 3 reads from the image memory M and displays the drawing image of the icon 54 representing the external shape of the host vehicle at each stage of the progress.
 Such a simulated display of the lane change by the virtual-vehicle icon 54 is preferably presented faster than the actual travel speed of the host vehicle. This gives the driver time, between the advance notice of the lane change and the lane-change timing scheduled by the automatic driving control unit 21f, to decide whether to cancel the lane change or switch to manual driving, and to prepare if necessary.
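Playing the simulated lane change faster than real time amounts to a simple time-scaling of the animation clock. A minimal sketch, with an assumed `speedup` factor and maneuver duration not specified in the embodiment:

```python
def playback_time(real_elapsed, speedup=3.0, maneuver_duration=6.0):
    """Map wall-clock seconds since the preview started to a position
    in the simulated maneuver, as a fraction in [0, 1].  With
    speedup > 1 the icon finishes the simulated lane change well
    before the real maneuver would, leaving the driver time to react."""
    t = real_elapsed * speedup / maneuver_duration
    return min(max(t, 0.0), 1.0)
```

The fraction returned would drive both the icon's position along the trajectory and the selection of the icon drawing for that stage of the maneuver.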
 The icon 54 may be made to disappear immediately after the virtual lane change is completed, or may be kept displayed at the same position on the front windshield 15 for a short time. The moving images of FIGS. 12(a) to 12(d) may also be displayed repeatedly.
 As described above in detail, the image display system for an automobile according to the present invention can provide the driver, by display, with the information necessary for safe travel, both when the driving operation is performed directly by the driver and at any of the automation levels 1 through 4 realized by a driving support system.
1 Image display system for automobile
2 Detection unit (detection means)
3 Display control unit (drawing means)
4 HUD device (display means)
17, 18, 19 Camera
20, 21 Sensor

Claims (8)

  1.  An image display system for an automobile that displays, as images, information provided to an occupant of the automobile, comprising:
     detection means for detecting the provided information; and
     display means for projecting, toward a front windshield of the automobile, a right-eye image and a left-eye image constituting a stereoscopic image of the provided information,
     wherein the provided information can be displayed to the occupant at a position in the depth direction beyond the front windshield on the occupant's forward line of sight.
  2.  The image display system for an automobile according to claim 1, wherein the detection means comprises a camera that photographs the situation around the automobile.
  3.  The image display system for an automobile according to claim 2, wherein the camera is a stereo camera that captures a right-eye image and a left-eye image.
  4.  The image display system for an automobile according to claim 1, wherein the detection means is a sensor that detects the situation around the automobile or the running state of the automobile, and includes drawing means for converting the provided information from the sensor into a drawing image that can be displayed visually.
  5.  The image display system for an automobile according to claim 4, wherein the drawing image is displayed in association with a specific detection object in the real scenery visible through the front windshield.
  6.  The image display system for an automobile according to claim 5, wherein the display means projects the right-eye image and the left-eye image having binocular parallax such that the occupant can see the detection object at substantially the same depth, in the depth direction, as when the occupant views the detection object through the front windshield.
  7.  The image display system for an automobile according to claim 5 or 6, wherein the detection means is a navigation device that identifies the current position of the automobile and gives guidance on the traveling direction from surrounding map data, and a drawing image of the guidance is displayed in association with the scenery.
  8.  The image display system for an automobile according to claim 1, wherein the detection means is a communication device that acquires the provided information about nearby vehicles by vehicle-to-vehicle communication or road-to-vehicle communication.
PCT/JP2016/081866 2015-11-09 2016-10-27 Image display system for vehicle WO2017082067A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-219546 2015-11-09
JP2015219546A JP6744064B2 (en) 2015-11-09 2015-11-09 Automotive image display system

Publications (1)

Publication Number Publication Date
WO2017082067A1 2017-05-18

Family

ID=58695212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/081866 WO2017082067A1 (en) 2015-11-09 2016-10-27 Image display system for vehicle

Country Status (2)

Country Link
JP (1) JP6744064B2 (en)
WO (1) WO2017082067A1 (en)


Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6958252B2 (en) 2017-11-07 2021-11-02 トヨタ自動車株式会社 Remote monitoring system, autonomous vehicle and remote monitoring method
JP2019177726A (en) * 2018-03-30 2019-10-17 コニカミノルタ株式会社 Virtual rear view mirror device
JP7064688B2 (en) * 2018-04-20 2022-05-11 日本精機株式会社 Vehicle display devices, vehicle display systems, and vehicle display programs
JPWO2021039257A1 (en) * 2019-08-28 2021-03-04
DE102020214843A1 (en) 2020-11-26 2022-06-02 Volkswagen Aktiengesellschaft Method for representing a virtual element
JP7447039B2 (en) 2021-03-10 2024-03-11 矢崎総業株式会社 Vehicle display device
JP2022148856A (en) * 2021-03-24 2022-10-06 本田技研工業株式会社 Vehicle display device, display control method, and program

Citations (6)

Publication number Priority date Publication date Assignee Title
JP3194024B2 (en) * 1993-10-04 2001-07-30 本田技研工業株式会社 Information display device for vehicles
JP2005067515A (en) * 2003-08-27 2005-03-17 Denso Corp Vehicular display device
JP2006350934A (en) * 2005-06-20 2006-12-28 Denso Corp Information display device
JP2009276994A (en) * 2008-05-14 2009-11-26 Nissan Motor Co Ltd Method for acquiring vehicle information and vehicle drive supporting system
JP2011081141A (en) * 2009-10-06 2011-04-21 Seiko Epson Corp Lens array unit, electrooptical apparatus and electric apparatus
JP2014089510A (en) * 2012-10-29 2014-05-15 Toyota Motor Corp Preceding vehicle display device

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
JP2004284417A (en) * 2003-03-19 2004-10-14 Sumitomo Electric Ind Ltd Image display method and image display system
JP2007168670A (en) * 2005-12-22 2007-07-05 Fujitsu Ten Ltd On-vehicle display device
JP2009132259A (en) * 2007-11-30 2009-06-18 Denso It Laboratory Inc Vehicle surrounding-monitoring device
JP2010030524A (en) * 2008-07-30 2010-02-12 Toyota Motor Corp Blind side indicator
JP2010137697A (en) * 2008-12-11 2010-06-24 Honda Motor Co Ltd Drive assisting device for vehicle
JP2014010418A (en) * 2012-07-03 2014-01-20 Yazaki Corp Stereoscopic display device and stereoscopic display method
JP6337418B2 (en) * 2013-03-26 2018-06-06 セイコーエプソン株式会社 Head-mounted display device and method for controlling head-mounted display device


Cited By (6)

Publication number Priority date Publication date Assignee Title
JP2018198422A (en) * 2017-05-23 2018-12-13 トヨタ自動車株式会社 Traffic situation awareness for autonomous vehicle
CN112740007A (en) * 2018-09-21 2021-04-30 本田技研工业株式会社 Vehicle inspection system
CN112740007B (en) * 2018-09-21 2023-06-30 本田技研工业株式会社 Vehicle inspection system
CN111521414A (en) * 2020-06-19 2020-08-11 上海机动车检测认证技术研究中心有限公司 Projection measurement system
EP4046846A1 (en) * 2021-02-18 2022-08-24 Toyota Jidosha Kabushiki Kaisha Vehicle display device
US11705007B2 (en) 2021-02-18 2023-07-18 Toyota Jidosha Kabushiki Kaisha Vehicle display device

Also Published As

Publication number Publication date
JP2017092678A (en) 2017-05-25
JP6744064B2 (en) 2020-08-19

Similar Documents

Publication Publication Date Title
WO2017082067A1 (en) Image display system for vehicle
JP6466899B2 (en) Vehicle display device
US10436600B2 (en) Vehicle image display system and method
JP5160564B2 (en) Vehicle information display device
JP6699646B2 (en) Vehicle display control device
JP6252365B2 (en) Safety confirmation support system, safety confirmation support method
JP6273976B2 (en) Display control device for vehicle
WO2016186039A1 (en) Automobile periphery information display system
CN110730740B (en) Vehicle control system, vehicle control method, and storage medium
JP2018203007A (en) Vehicle control system, vehicle control method and vehicle control program
JP6459205B2 (en) Vehicle display system
JP6827378B2 (en) Vehicle control systems, vehicle control methods, and programs
JP2006284458A (en) System for displaying drive support information
CN110603166A (en) Vehicle control system, vehicle control method, and vehicle control program
CN110786004A (en) Display control device, display control method, and program
EP3775780A1 (en) Control apparatus, display apparatus, movable body, and image display method
JP2016107947A (en) Information providing device, information providing method, and control program for providing information
JP6840035B2 (en) Vehicle control system
US20210197863A1 (en) Vehicle control device, method, and program
JP6589991B2 (en) Human interface
JP2021020519A (en) Display control device for vehicle and display control method for vehicle
US11338824B2 (en) Surrounding situation display method and surrounding situation display device
JP6365409B2 (en) Image display device for vehicle driver
WO2017138343A1 (en) Driving teaching device
JP7302311B2 (en) Vehicle display control device, vehicle display control method, vehicle display control program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16864031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16864031

Country of ref document: EP

Kind code of ref document: A1