WO2017082067A1 - Image display system for vehicle - Google Patents

Image display system for vehicle

Info

Publication number
WO2017082067A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
vehicle
automobile
eye image
display system
Prior art date
Application number
PCT/JP2016/081866
Other languages
English (en)
Japanese (ja)
Inventor
修一 田山
Original Assignee
修一 田山
株式会社イマージュ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 修一 田山 and 株式会社イマージュ
Publication of WO2017082067A1


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00: Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20: Image signal generators
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00: Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31: Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00: Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30: Image reproducers
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to an image display system for automobiles, used to support the driving of vehicles such as automobiles.
  • A vehicle information transmission device has been proposed for intuitively conveying danger and navigation information around the host vehicle to the driver (see, for example, Patent Document 1).
  • This device detects dangerous objects such as pedestrians, bicycles, or other vehicles present ahead using a camera or radar sensor mounted on the vehicle, displays them as graphics on the display device of the instrument panel, and thereby makes the driver aware of their presence.
  • Also proposed is a vehicle periphery monitoring device that displays, on an image display device including a HUD (head-up display), the presence and type of a detected object that has a high possibility of contacting the host vehicle, thereby notifying the driver.
  • The type of object (pedestrian, bicycle, animal, etc.) is determined, and a mark whose form differs according to the type, together with a rectangular frame surrounding the position of the detected object, is displayed on the HUD.
  • According to the conventional technology of Patent Document 2, the information is displayed on the front windshield, so the driver does not need to shift the line of sight; this is more advantageous than the technology of Patent Document 1.
  • The present invention has been made to solve the above problems, and its object is to provide an automobile image display system capable of displaying information on the front windshield in a manner that is easy for the driver to understand.
  • The present invention is an image display system for an automobile that displays, as an image, information provided to an occupant of the automobile, comprising detecting means for detecting the provided information and display means for projecting toward the front windshield a right-eye image and a left-eye image that constitute a stereoscopic image of the provided information.
  • In one aspect, the detection means is a camera that captures the situation around the automobile, and the captured image of the camera is used as the provided information.
  • This camera may be a stereo camera that captures a right-eye image and a left-eye image.
  • In another aspect, the detection means is a sensor that detects the situation around the automobile or the running state of the automobile, and drawing means converts the information provided by the sensor into a drawn image that can be displayed visually.
  • The drawn image is displayed in association with a specific detection target in the real scene visible through the front windshield, so that the occupant's attention is drawn to that target.
  • When the display means displays a drawn image, it projects a right-eye image and a left-eye image having a binocular parallax such that the occupant perceives the drawn image at substantially the same depth as the detection target seen through the front windshield; the occupant can thereby recognize the detection target more easily and reliably.
  • The detection means may also be a navigation device that identifies the current position of the vehicle and guides the traveling direction using surrounding map data; the guidance is preferably displayed as a drawn image associated with the detection target.
  • The detection means may further be a communication device that acquires the above-described provided information about nearby vehicles by vehicle-to-vehicle communication or by communication between road facilities and the vehicle (hereinafter "road-to-vehicle communication"). Vehicle-to-vehicle communication is short-range wireless communication with other vehicles; road-to-vehicle communication is the exchange of various information with roadside equipment. Either allows the situation of other vehicles to be obtained with high accuracy.
  • Since the information provided to the occupant is projected as a right-eye image and a left-eye image, the occupant can view it as a stereoscopic image. The occupant can therefore see the provided information while the eyes remain focused ahead of the front windshield, so no refocusing is needed to confirm the information and it is not overlooked.
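As an aside on the geometry involved: the depth effect described above depends on the horizontal parallax between the projected right-eye and left-eye images. A minimal sketch of that relation, assuming a simple pinhole model and an illustrative interpupillary distance (the patent itself specifies no numerical values):

```python
def disparity_on_plane(depth_m, plane_m, ipd_m=0.065):
    """Horizontal separation (in metres) between the right-eye and
    left-eye projections on a display plane plane_m ahead of the eyes,
    so that the fused image is perceived at depth_m.  Positive values
    are uncrossed disparity (the image appears behind the plane)."""
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return ipd_m * (depth_m - plane_m) / depth_m

# zero disparity places the image exactly at the display plane
assert disparity_on_plane(2.0, 2.0) == 0.0
```

Under these assumptions, a virtual image plane about 2 m ahead needs roughly 52 mm of on-plane separation to make a target appear 10 m away, and zero separation to place it at the plane itself.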
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the overall configuration of an automobile image display system according to the present invention.
  • A plan view showing an example of an automobile in which the cameras and sensors of the automobile image display system according to the present invention are arranged.
  • A conceptual diagram of the front windshield and dashboard of an automobile equipped with the automobile image display system according to the present invention, viewed from the driver's seat.
  • A schematic explanatory view of the image display by the HUD device.
  • A schematic diagram explaining an example of the display on the front windshield by the automobile image display system according to the present invention.
  • A schematic diagram explaining another example of the display on the front windshield by the automobile image display system according to the present invention.
  • A block diagram showing the overall configuration of a driving support system incorporating the automobile image display system according to the present invention.
  • A diagram showing the operation of the automobile image display.
  • A schematic diagram explaining an example of the display on the front windshield in the driving support system.
  • A schematic diagram explaining another example of the display on the front windshield in the driving support system.
  • A schematic diagram explaining a further example of the display on the front windshield in the driving support system.
  • FIG. 1 is a block diagram showing a schematic configuration of an automobile image display system according to the present invention.
  • the vehicle image display system 1 includes a detection unit 2, a display control unit 3, and a head-up display (HUD) device 4 that is a display unit.
  • the detection unit 2 includes a camera unit 2A and a sensor unit 2B that monitor the situation around the automobile.
  • The captured image from the camera unit 2A and the detection information from the sensor unit 2B constitute the information provided to the vehicle occupant; the detection unit 2 therefore serves as detection means for detecting the provided information.
  • the camera unit 2A has, for example, three cameras 17, 18, and 19 as shown in FIG. 2 for photographing the situation around the automobile.
  • the camera 17 is installed on the upper part of the front windshield 15 and photographs the front view of the automobile 10.
  • The camera 18 is disposed on the rear windshield and photographs the view behind the automobile 10.
  • the sensor unit 2B includes sensors 20 and 21 for detecting the presence of an automobile, a bicycle, a person, an animal, or the like (hereinafter referred to as “detection object”) around the automobile.
  • The sensors 20 and 21 are, for example, radar sensors. As shown in FIG. 3, the sensors 20 are attached to the lower parts of the left and right door mirrors 35 and detect the presence of detection objects to the left and right rear.
  • the sensor 21 is attached to the central portion of the vehicle body at the front of the automobile 10 and detects the presence of a detection object existing in front.
  • As the radar sensors, millimeter-wave radar, microwave radar, laser radar, infrared sensors, ultrasonic sensors, or the like are used.
  • the steering angle sensor 24, the vehicle speed sensor 25, the gear position sensor 26, and the like provided in the traveling system of the automobile 10 also constitute the detection unit 2 here.
  • the display control unit 3 displays the information acquired by the detection unit 2 on the HUD device 4 as a stereoscopic image.
  • The images to be displayed include the images captured by the cameras 17, 18, and 19, and drawn images that present the detection information from the sensors 20 and 21, the steering angle sensor 24, and the vehicle speed sensor 25 in the form of graphics, such as icons, or letters and numbers.
  • the display control unit 3 includes an image memory M in which these various drawn images are stored in advance.
  • Each of the cameras 17, 18, and 19 is a stereo camera in which binocular lenses corresponding to the left and right human eyes are housed in a single housing; the display control unit 3 outputs the right-eye image and the left-eye image input from the stereo camera to the HUD device 4.
  • The cameras 17, 18, and 19 are not limited to stereo cameras; they may be monocular cameras whose output is converted into stereo images by computer processing.
  • To display a drawn image, the display control unit 3 selects, from those stored in advance in the image memory M, drawn images of graphics and characters suitable for visually presenting the detection information from the sensors 20 and 21, the steering angle sensor 24, the vehicle speed sensor 25, and the gear position sensor 26. It then creates a right-eye image and a left-eye image from the selected drawn image and outputs them to the HUD device 4. In this case, the display control unit 3 thus functions as drawing means that converts the provided information into a visually displayable drawn image.
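The selection-and-conversion role just described can be sketched roughly as follows. All names here (`image_memory`, `DrawnImage`, `make_stereo_pair`) and the pixel-offset model are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass

@dataclass
class DrawnImage:
    name: str
    x_px: float  # horizontal position on the display, in pixels
    y_px: float

# stand-in for the image memory M, reduced to named placeholders
image_memory = {
    "speed": DrawnImage("speed", 400.0, 300.0),
    "prohibition_mark": DrawnImage("prohibition_mark", 120.0, 250.0),
}

def make_stereo_pair(key, disparity_px):
    """Look up a stored drawn image and return (right_eye, left_eye)
    copies offset so that their horizontal separation equals
    disparity_px (right image shifted right: uncrossed disparity)."""
    base = image_memory[key]
    right = DrawnImage(base.name, base.x_px + disparity_px / 2, base.y_px)
    left = DrawnImage(base.name, base.x_px - disparity_px / 2, base.y_px)
    return right, left

right, left = make_stereo_pair("speed", 20.0)
assert right.x_px - left.x_px == 20.0
```

In a real pipeline the two offset images would then be handed to the right-eye and left-eye display units of the HUD device.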
  • The HUD device 4 has a conventionally known structure and mechanism; for example, as shown in FIG. 4, it comprises a right-eye image display unit 6, a left-eye image display unit 7, and a mirror 8.
  • the HUD device 4 is disposed on the upper surface of the dashboard 39 and projects an image toward the front windshield 15. Further, the HUD device 4 may be incorporated in the dashboard 39 or may project an image onto the front windshield 15 from the ceiling of the driver's seat.
  • The right-eye image display unit 6 and the left-eye image display unit 7 of the HUD device 4 are displays that show the right-eye image RP and the left-eye image LP, respectively; for example, display devices such as an ELD (electroluminescent display), a fluorescent display tube, an FED (field emission display), or an LCD (liquid crystal display) are used.
  • The mirror 8 of the HUD device 4 is disposed facing the right-eye image display unit 6 and the left-eye image display unit 7, and reflects the right-eye image RP and the left-eye image LP displayed by those units toward the front windshield 15.
  • the mirror 8 projects the right-eye image RP and the left-eye image LP displayed by the right-eye image display unit 6 and the left-eye image display unit 7 at a magnification determined in advance by the mirror shape.
  • the front windshield 15 is formed in a curved shape curved in the vertical direction and the horizontal direction according to the outer shape of the automobile 10 using laminated glass, IR cut glass, UV cut glass or the like.
  • The front windshield 15 transmits the driver's line of sight while reflecting the right-eye image RP and the left-eye image LP projected from the HUD device 4 back along the line-of-sight direction.
  • When the HUD device 4 projects the right-eye image RP and the left-eye image LP output from the display control unit 3, they are reflected by the front windshield 15 toward both eyes of the driver.
  • When the human eye is focused several meters ahead, even objects at infinity are not greatly out of focus. Conversely, while the driver is focused in the depth direction ahead of the front windshield 15 to check the field of view in front of the automobile 10, an image projected at the plane of the windshield itself is likely to be missed. By configuring the system so that the driver can view the provided information stereoscopically, the information can be seen while the eyes remain focused in the depth direction ahead of the windshield, so it is recognized easily and reliably and is not overlooked.
  • When the steering angle sensor 24 detects a steering operation, the image captured by the camera at that time can be displayed in 3D on the HUD device 4. As shown in FIG. 4, the driver can then view the stereoscopic image TD through the front windshield 15 with the eyes still focused in the forward depth direction, recognize the presence of, for example, a pedestrian crossing on the right side of the automobile 10, and perform a brake operation.
  • Since the image is displayed stereoscopically, the driver can accurately grasp the situation without fear of confusing it with the actual scene ahead of the car viewed through the front windshield 15.
  • The display control unit 3 may, as necessary, process the right-eye image RP and the left-eye image LP captured by the camera 18 so as to reproduce the stereoscopic image at this depth.
  • For example, when the automobile 10 is reversed, the display control unit 3 projects the right-eye image RP and the left-eye image LP captured by the camera 18 through the HUD device 4. The driver can therefore notice the presence of a pedestrian or the like behind the vehicle 10 while viewing the situation in front of the automobile 10 through the front windshield 15.
  • When the display control unit 3 acquires the current vehicle speed value from the vehicle speed sensor 25, it reads out number and character images for visually displaying that value from the image memory M, creates a right-eye image RP and a left-eye image LP from them, and displays the drawn image of the speed display 34 ("60 km/h" in the illustrated example) shown in FIG. 5. The display control unit 3 creates the right-eye image RP and the left-eye image LP so that the drawn image indicating the vehicle speed is seen with a binocular parallax reproduced according to its depth in the depth direction beyond the front windshield 15.
  • a drawing image representing the state of the gear detected by the gear position sensor 26 is also displayed as a stereoscopic image based on binocular parallax.
  • When the steering angle sensor 24 detects that the driver has turned the steering wheel 22 to the right or left in order to change lanes, the display control unit 3 checks for a detection object to the rear in that direction based on information from the sensor 20. If the detected object is larger than a predetermined size, the sensor 20 judges that the object is approaching and outputs a detection signal for it to the display control unit 3.
  • The display control unit 3 then reads out from the image memory M a drawn image of the prohibition mark 13, indicating that changing lanes is dangerous, and displays it three-dimensionally on the HUD device 4.
  • In the illustrated example, the prohibition mark 13 is displayed on the left side because another vehicle is approaching from the left rear while the driver attempts to change into the left lane.
  • Since the prohibition mark 13 is displayed three-dimensionally, the driver can recognize it while keeping the eyes focused in the forward depth direction through the front windshield 15, and can thus reliably notice the danger of the lane change. It is also possible to arrange a camera in place of the sensor 20 that detects objects to the left and right rear, and to display the other vehicle behind, photographed by that camera, as a stereoscopic image on the HUD device 4.
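The lane-change warning decision described above can be sketched as follows; the size threshold and the data shapes are hypothetical, since the patent only says "larger than a predetermined size":

```python
MIN_OBJECT_SIZE = 50  # hypothetical "predetermined size" threshold

def lane_change_warning(steering_direction, detections):
    """Decide whether to show the prohibition mark.
    steering_direction: 'left' or 'right', the side of the intended
    lane change; detections: (side, size) tuples from the rear sensors.
    Returns True when an object on that side is at least the
    threshold size, i.e. judged to be approaching."""
    return any(side == steering_direction and size >= MIN_OBJECT_SIZE
               for side, size in detections)

assert lane_change_warning("left", [("left", 80), ("right", 10)]) is True
assert lane_change_warning("right", [("left", 80)]) is False
```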
  • the above-described image display system 1 for an automobile according to the present invention is effectively applied to an automobile 10 provided with a driving support system for supporting safe driving.
  • the embodiment will be described below.
  • Level 1 is called a safe driving support system, in which the vehicle performs any one of acceleration, steering, or braking.
  • Level 2 is called a quasi-automatic driving system, in which a vehicle simultaneously performs a plurality of operations of acceleration, steering, and braking.
  • Level 3 is a state in which acceleration, steering, and braking are all performed by the vehicle and the driver responds only in an emergency. This is also called a semi-automatic driving system.
  • Level 4 is called a fully automatic driving system, in which acceleration, steering, and braking are all performed by the system and the driver is not involved at all.
  • “automatic driving” refers to traveling at level 3 or level 4.
  • FIG. 7 schematically shows the overall configuration of a driving support system 20 incorporating the automobile image display system 1 of the present embodiment.
  • the driving support system 20 includes a central processing unit 21 including a CPU, a ROM, and a RAM.
  • The detection unit 2 described in FIG. 1, an HMI (human-machine interface) unit 23, the traveling control device 25, and the navigation device 33 are connected to the central processing unit 21.
  • The HMI unit 23 includes the touch panel 30 and the monitor 31 (shown in FIG. 3) together with the HUD device 4 described above.
  • the monitor 31 displays, for example, a rear view taken by the camera 18 when the automobile 10 is moved backward.
  • the touch panel 30 is formed by software on the screen of the monitor 31.
  • the traveling control device 25 includes a brake control device 26 and an accelerator control device 27.
  • the brake control device 26 controls a brake operating device (not shown) to automatically decelerate the automobile 10.
  • the accelerator control device 27 automatically accelerates the automobile 10 by controlling a throttle actuator (not shown) that adjusts the throttle opening.
  • the navigation device 33 reads map data around the automobile 10 from the map database based on GPS information received from a GPS (global positioning system) 32.
  • The central processing unit 21 executes a control program stored in the ROM on the CPU, thereby realizing the functions of a pattern matching processing unit 21a, a stop control unit 21b, a follow-up travel control unit 21c, a constant speed travel control unit 21d, a target preceding vehicle determination unit 21e, an automatic driving control unit 21f, and the display control unit 3 described in FIG. 1.
  • The pattern matching processing unit 21a performs pattern matching on the images sent from the cameras 17, 18, and 19, detecting the contours of pedestrians, bicycles, motorcycles, and the like in order to determine their presence.
  • The ROM holds a pattern data file F in which the contours serving as image features of the detection targets are registered in advance. The ROM also holds the above-described image memory M, in which the drawn images displayed by the display control unit 3 on the HUD device 4 (graphics such as icons, and characters/numbers) are stored in advance.
  • Simultaneously with the pattern matching processing, the pattern matching processing unit 21a also detects lane lines (white lines or yellow lines) based on the luminance information of the image.
  • The stop control unit 21b is stop control means that calculates the distance to the detection target from the parallax between the left and right images of the camera 17 and controls the stopping of the automobile 10 according to that distance.
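Distance from stereo parallax follows the standard triangulation relation Z = f * B / d (focal length in pixels, baseline, disparity in pixels). A minimal sketch with illustrative parameter values, since the patent states no camera parameters:

```python
def stereo_distance_m(disparity_px, baseline_m, focal_px):
    """Triangulated distance Z = f * B / d from the pixel disparity d
    between a stereo camera's left and right images."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_px * baseline_m / disparity_px

# e.g. with an assumed 0.12 m baseline and 800 px focal length,
# 12 px of disparity corresponds to about 8 m
assert abs(stereo_distance_m(12.0, 0.12, 800.0) - 8.0) < 1e-9
```

Nearer objects produce larger disparities, which is why the stop decision in the right-turn example below can be driven directly by this computed distance.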
  • The follow-up travel control unit 21c calculates the inter-vehicle distance to the preceding vehicle so that the automobile 10 follows it, compares the result with the preset inter-vehicle distance, and outputs the comparison data to the traveling control device 25.
  • the constant speed traveling control unit 21d calculates a difference between the current speed and a preset vehicle speed and outputs the difference to the traveling control device 25 in order to cause the host vehicle to travel at a constant speed in the constant speed traveling mode.
  • the target preceding vehicle determination unit 21e determines a preceding vehicle that is a target of the follow-up traveling mode.
  • The automatic driving control unit 21f performs control for automatically driving the automobile 10 at level 3 or 4 described above.
  • First, the central processing unit 21 determines whether the driver has performed a steering operation to turn the vehicle 10 to the right (step S1), by detecting with the steering angle sensor 24 whether the steering wheel 22 has been rotated. If no right-turn steering operation has been performed ("NO"), the operation of displaying the image of the blind-spot scene is not performed.
  • the pattern matching processing unit 21a of the central processing unit 21 acquires the image sent from the camera 19 at this time (step S2).
  • Next, the right-eye image RP and the left-eye image LP captured by the camera 19 are displayed by the HUD device 4. By displaying the image stereoscopically with binocular parallax, the driver can view it at a position in the depth direction beyond the front windshield 15, on the driver's forward line of sight (step S3).
  • The central processing unit 21 then performs pattern matching between the acquired image and the contours of the detection targets registered in the pattern data file F, and determines whether detection targets such as pedestrians, bicycles, or motorcycles are present in the image (step S4). If no detection target is included ("NO"), the operation of displaying the blind-spot image is ended.
  • If a detection target is present ("YES" in step S4), the central processing unit 21 uses the stop control unit 21b to calculate the distance to the target from the parallax between the left and right images of the camera 19, and determines whether the distance is within a predetermined range (step S5). For example, when the distance to the detection target exceeds a range of 10 meters ("NO" in step S5), the operation of displaying the blind-spot image is ended.
  • When the distance to the detection target is within 10 meters in step S5 ("YES"), the central processing unit 21 stops the automobile 10 via the brake control device 26 (step S6). Thereafter, the central processing unit 21 repeats the processing from step S2; when the detection target and any other detection targets no longer exist ("NO" in step S4), or when they exist but the distance exceeds 10 meters ("NO" in step S5), the operation of displaying the blind-spot image is ended. In that case, the stop control unit 21b releases the stopped state of the automobile 10 via the brake control device 26.
  • Until then, the automobile 10 is kept stopped. According to the driving support system 20, the automatic stop is controlled according to conditions in the blind-spot range that the driver's eyes cannot reach, so the driver's safe driving at a right turn can be reliably supported.
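The step S1 to S6 flow above can be condensed into a single decision function. This is a hedged sketch: the function name, data shapes, and the 'idle'/'stop'/'release' labels are illustrative, with only the 10-meter range taken from the example above:

```python
STOP_RANGE_M = 10.0  # the 10-meter range from the example above

def blind_spot_stop_step(steering_right, target_distances_m):
    """One pass of the blind-spot stop control.
    Returns 'idle' when no right-turn steering is detected (S1),
    'stop' when any detected target is within 10 m (S5 -> S6), and
    'release' when targets are absent or all beyond range (S4/S5)."""
    if not steering_right:          # S1: no right-turn steering operation
        return "idle"
    if any(d <= STOP_RANGE_M for d in target_distances_m):  # S4/S5
        return "stop"               # S6: stop via the brake control device
    return "release"                # release the stopped state

assert blind_spot_stop_step(False, [5.0]) == "idle"
assert blind_spot_stop_step(True, [12.0, 6.0]) == "stop"
```

In the patent's loop this function would be re-evaluated repeatedly (the repeat from step S2), with 'release' undoing the brake hold.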
  • In the driving support system 20 incorporating the automobile image display system 1 according to the present invention, blind spots are created not only by the pillar portions represented by the A pillar 6 but also by other vehicle structures such as frames and outer members; cameras can be attached to these as well, and their stereo images can be displayed on the front windshield 15 by the HUD device 4.
  • The driving support system 20 also presents various information to the driver, in addition to the A-pillar 6 blind-spot image, in the form of stereoscopic captured images or drawn images. Several display examples are described below.
  • Next, the constant speed travel and inter-vehicle distance control of the automobile 10 by the driving support system 20 will be described.
  • the constant speed travel / inter-vehicle distance control in the present embodiment is performed as follows, for example, but of course is not limited to this.
  • the central processing unit 21 starts constant speed traveling and inter-vehicle distance control of the host vehicle when the driver turns on a driving support switch (not shown).
  • The vehicle speed for constant speed driving and the inter-vehicle distance to the preceding vehicle may be set by the driver inputting desired values on the touch panel 30 of the HMI unit 23 immediately before turning on the driving support switch, or the previous set values stored in a memory (not shown) of the driving support system 20 can be used as they are.
  • the constant speed traveling / inter-vehicle distance control is performed by switching between the following traveling mode when the preceding vehicle is detected and the constant speed traveling mode when the preceding vehicle is not detected.
  • First, the target preceding vehicle determination unit 21e detects all preceding vehicles using the front camera 17 and the sensor 21.
  • the target preceding vehicle determination unit 21e detects and stores the inter-vehicle distance, relative speed, direction with respect to the own vehicle, etc. with respect to all the detected preceding vehicles, and travels in the same lane as the own vehicle.
  • the closest preceding vehicle is determined as the target preceding vehicle for following the vehicle. Whether the preceding vehicle is traveling in the same lane as the own vehicle can be determined by the pattern matching processing unit 21a detecting the lane.
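The selection described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the field names `distance_m` and `same_lane` are assumed labels for the data the target preceding vehicle determination unit 21e is described as storing.

```python
def select_target_preceding_vehicle(detections):
    """Pick the nearest detected vehicle in the host lane as the
    target preceding vehicle; return None if no same-lane vehicle
    is detected (constant-speed travel mode applies in that case).

    detections: list of dicts with 'distance_m' (float) and
    'same_lane' (bool, e.g. from lane detection by pattern matching).
    """
    same_lane = [d for d in detections if d["same_lane"]]
    if not same_lane:
        return None
    # Nearest same-lane vehicle becomes the vehicle to follow.
    return min(same_lane, key=lambda d: d["distance_m"])
```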
  • The sensor 21 always scans regardless of whether the driving support switch is on or off. Thus, not only can the target preceding vehicle be determined quickly in response to the switch being turned on, but the detection of preceding vehicles can also be used for a rear-end collision prevention function.
  • The presence of a preceding vehicle can also be obtained by performing short-range wireless communication with the preceding vehicle, so that the preceding vehicle detection unit 11 can detect the preceding vehicle and the accuracy of the position information can be further increased.
  • The presence of a preceding vehicle can also be obtained by so-called road-to-vehicle communication, that is, information communication between the vehicle and roadside equipment such as sensors or antennas installed on the road, performed either directly or wirelessly through a server in the surrounding area.
  • The driving support system 20 includes a communication device for communicating with the outside. A communication device that acquires the position information and travel conditions of surrounding vehicles by vehicle-to-vehicle or road-to-vehicle communication therefore serves as detection means for detecting information to be provided to the driver.
  • The central processing unit 21 controls the traveling control device 25 so that the following distance control unit 21c maintains the inter-vehicle distance to the determined target preceding vehicle at the set inter-vehicle distance. That is, if the current actual inter-vehicle distance to the target preceding vehicle is longer than the set inter-vehicle distance, the accelerator control device 27 is controlled to increase the vehicle speed of the automobile 10 and reduce the inter-vehicle distance; if it is shorter, the brake control device 26 is controlled to lower the vehicle speed of the host vehicle and increase the inter-vehicle distance; and if the two are equal, the brake control device 26 and the accelerator control device 27 are controlled so as to maintain the current vehicle speed of the host vehicle.
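The gap-keeping decision above can be sketched minimally as follows. The action names mirror the brake control device 26 and accelerator control device 27, but the tolerance band is an assumption introduced for illustration; the patent does not specify thresholds.

```python
def follow_control_action(actual_gap_m, set_gap_m, tolerance_m=1.0):
    """Decide which actuator the traveling control device should drive:
    'accelerate' closes a too-large gap, 'brake' opens a too-small gap,
    'hold' keeps the current vehicle speed. tolerance_m is an
    illustrative dead band to avoid oscillating between actions."""
    if actual_gap_m > set_gap_m + tolerance_m:
        return "accelerate"   # gap longer than set value -> speed up
    if actual_gap_m < set_gap_m - tolerance_m:
        return "brake"        # gap shorter than set value -> slow down
    return "hold"             # gap within tolerance -> keep speed
```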
  • FIG. 9 shows an example of the screen display of the HUD device 4 in the follow-up running mode.
  • The preceding vehicle 41 being followed is displayed with a marking 42 drawn around it.
  • One preceding vehicle 41 is in the same overtaking lane 43 as the host vehicle, and another preceding vehicle 45 is running further ahead in the traveling lane 44.
  • Both preceding vehicles 41 and 45 are detected by the sensor 21 and the camera 17, but the target preceding vehicle determination unit 21e determines the preceding vehicle 41, which is closest to the host vehicle in the same lane 43, as the target preceding vehicle. In this case, therefore, the preceding vehicle 41 is a detection target to which the driver pays particular attention in the real scenery.
  • The target preceding vehicle determination unit 21e stores data such as the detected inter-vehicle distance, relative speed, and direction of the target preceding vehicle 41 in the memory, and the sensor 21 and the camera 17 continue scanning so that the target preceding vehicle 41 is automatically tracked and data such as the inter-vehicle distance, relative speed, and direction are continuously collected and stored.
  • When the target preceding vehicle is determined, the display control unit 3 displays an image of the marking 42 indicating that the preceding vehicle 41 is the vehicle being followed.
  • The marking 42 is a frame pattern that substantially overlaps the rear contour of the preceding vehicle 41; the display control unit 3 reads the frame drawing image from the image memory M and creates a right-eye image RP and a left-eye image LP so that it is displayed as a stereoscopic image.
  • The display control unit 3 creates a right-eye image RP and a left-eye image LP having binocular parallax from the drawn image of the marking 42 and displays them on the HUD device 4.
  • The display control unit 3 creates a right-eye image RP and a left-eye image LP having binocular parallax such that the driver visually recognizes the marking 42 at the above-described coordinate position, whose center is obtained by calculation, and at a depth in the depth direction from the front windshield 15 corresponding to the preceding vehicle 41.
  • The driver can therefore visually recognize the marking 42 while the eyes remain focused on the preceding vehicle 41 through the front windshield 15.
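The depth placement by binocular parallax can be illustrated with a simplified similar-triangles model: the horizontal separation between the right-eye and left-eye image points on the display surface determines the distance at which the fused image is perceived. This is an assumption-laden sketch (a flat display plane, no account of windshield curvature or the optics of the HUD device 4), not the patent's actual computation.

```python
def on_screen_offset_m(eye_sep_m, screen_dist_m, target_depth_m):
    """Horizontal separation (in metres) between the right-eye image RP
    and left-eye image LP on the display plane so that the fused image
    is perceived at target_depth_m from the viewer.

    Similar triangles: the eyes (separated by eye_sep_m) converge on a
    point at target_depth_m; the display plane sits at screen_dist_m.
    At target_depth_m == screen_dist_m the offset is zero (image on
    the windshield); as the depth grows, the offset approaches the
    eye separation.
    """
    return eye_sep_m * (target_depth_m - screen_dist_m) / target_depth_m
```

For example, with a typical 65 mm eye separation and the display plane about 1 m away, placing the marking at a preceding vehicle 50 m ahead requires an on-screen offset just under 65 mm.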
  • Because the driver sees the target preceding vehicle 41 as if it were actually surrounded by a frame, the driver can clearly recognize which vehicle is being followed.
  • The image of the marking 42 in the present embodiment is formed as a square thick frame line surrounding the outline of the target preceding vehicle 41, but various shapes and displays other than a square thick frame line can be used. For example, the vehicle may be indicated by a circular thick frame line, or by an arrow instead of being surrounded by a frame, and the marking may be displayed in a conspicuous color such as red or orange to draw the driver's attention further.
  • The display control unit 3 displays the traveling speed set by input on the touch panel 30 of the HMI unit 23 three-dimensionally as a numerical drawing image. Furthermore, while the constant travel speed is being changed, the display control unit 3 acquires the intermediate vehicle speeds until the set speed is reached from the vehicle speed sensor 25 and displays them sequentially.
  • The navigation information displayed by the HUD device 4 is, for example, an arrow along the traveling direction of the automobile 10: a straight arrow is displayed when traveling straight through an intersection, and an arrow bent in the corresponding direction is displayed when turning left or right.
  • FIG. 10 shows an example of the screen display by the HUD device 4 when guiding a left turn during navigation travel.
  • The display control unit 3 reads a drawing image of an arrow 46 indicating a left turn from the image memory M and displays it stereoscopically on the HUD device 4.
  • Based on the distance, the display control unit 3 creates and displays a right-eye image RP and a left-eye image LP having binocular parallax such that the arrow 46 is stereoscopically displayed at the coordinate position on the Y axis set on the surface of the front windshield 15.
  • The driver can visually recognize the arrow 46 while the eyes are focused on the intersection 47 through the front windshield 15. As a result, the driver can prepare for the left turn because the left-turn arrow is visually recognized superimposed on the intersection 47. In this case, therefore, the intersection 47 is a detection target to which the driver pays particular attention in the real scenery.
  • In a fully automatic driving vehicle in which the driver is not directly involved in driving, the driver can monitor the automatic driving situation through the stereoscopic image projected by the HUD device 4.
  • An example of the display by such an automobile image display system according to the present invention during automatic driving will now be described.
  • The navigation device 33 calculates one or more candidate driving routes to the destination input in advance by the occupant, and the automatic driving control unit 21f performs automatic driving by controlling the traveling control device 25 along the driving route determined by the driver's confirmation or approval.
  • As shown in the figure, the display control unit 3 creates and displays a right-eye image RP and a left-eye image LP of a drawn image of an arrow 48 representing the traveling course of the host vehicle.
  • The display control unit 3 detects the coordinate position of the traveling path 49 of the host vehicle by having the pattern matching processing unit 21a process the image captured by the camera 17, and creates a right-eye image RP and a left-eye image LP so that the driver visually recognizes the arrow 48 as a stereoscopic image with binocular parallax at that coordinate position.
  • Because the driver can visually recognize the arrow 48 with the eyes focused in the depth direction through the front windshield 15, the arrow 48 appears to be drawn on the road 49 in the real scenery 36 seen through the front windshield 15.
  • Before the automatic driving vehicle 10 deviates from the current driving course or changes lanes, the display control unit 3 gives the driver and other occupants advance notice of the next driving behavior the vehicle is about to take by displaying it intuitively and in an easy-to-understand manner.
  • Whether and how to take a driving action different from the planned one is determined according to the automatic driving control program executed by the central processing unit 21 to realize the automatic driving control unit 21f.
  • FIG. 12 schematically shows a preview screen displayed on the HUD device 4 prior to a lane change from the overtaking lane 52 to the traveling lane 53 while the vehicle is in automatic operation.
  • In addition to the drawn image of the arrow 50 indicating the scheduled driving course, a drawn image of an arrow 51 indicating the driving course for changing lanes is displayed.
  • The arrow 51 has a shape that curves along the trajectory on which the host vehicle is predicted to travel automatically from a position on the overtaking lane 52 of the real scene 36 to a position on the adjacent traveling lane 53 that is the destination.
  • These arrows 50 and 51 are also read from the image memory M and displayed as a stereoscopic image, so that the driver can visually recognize them with the eyes focused on the vehicle ahead. The driver can therefore recognize clearly, intuitively, and specifically that a driving operation to change lanes from the current overtaking lane 52 to the traveling lane 53 is about to be performed.
  • In FIG. 12, a drawn image of an icon 54 representing a virtual image of the host vehicle is displayed together with the drawn images of the arrows 50 and 51.
  • The display control unit 3 virtually shows, through the movement of the icon 54 displayed by the HUD device 4, the progress of the lane change the vehicle is about to perform.
  • FIGS. 12(a) to 12(d) schematically show the display at this time in chronological order.
  • In FIG. 12(a), the icon 54 shows a state as if the host vehicle were traveling just in front of the front windshield 15.
  • The display control unit 3 creates a right-eye image RP and a left-eye image LP with binocular parallax such that the position at which the driver visually recognizes the icon 54 appears to be behind the preceding vehicle 55 in the same lane 52, and displays them on the HUD device 4.
  • As shown in the virtual images of FIGS. 12(b), (c), and (d), the display control unit 3 then creates a right-eye image RP and a left-eye image LP for each stage of the movement so that the driver visually recognizes the lane change from the overtaking lane 52 to the traveling lane 53 as a moving image, and displays them on the HUD device 4.
  • The apparent shape of the host vehicle also changes as the lane change progresses, so the display control unit 3 reads drawing images of the icon 54 representing the appearance of the host vehicle at each stage from the image memory M and displays them.
  • The simulated display of the lane change by the virtual vehicle icon 54 is shown at a speed faster than the actual traveling speed of the host vehicle.
  • The icon 54 can be made to disappear immediately after the virtual lane change is completed, or can be kept displayed at the same position on the front windshield 15 for a short time. The moving images of FIGS. 12(a) to 12(d) can also be displayed repeatedly.
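The frame-by-frame preview of the lane change could be generated, for instance, by easing the icon's lateral position between the two lanes. This is a hypothetical sketch: the easing function, frame count, and lane offset are illustrative choices, not details from the patent.

```python
def lane_change_preview(start_x, end_x, frames=4):
    """Lateral positions of the virtual vehicle icon for a sequence of
    preview frames (e.g. FIGS. 12(a)-(d)): a smoothstep ease from the
    overtaking-lane position to the traveling-lane position, so the
    preview plays out faster than the real manoeuvre."""
    positions = []
    for i in range(frames):
        t = i / (frames - 1)          # normalized time 0..1
        s = t * t * (3.0 - 2.0 * t)   # smoothstep easing (gentle start/end)
        positions.append(start_x + (end_x - start_x) * s)
    return positions
```

For repeated playback, as described above, the same sequence can simply be looped; with `frames=4` and a 3.5 m lane offset the icon moves 0 m, about 0.9 m, about 2.6 m, then 3.5 m to the side.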
  • The image display system for an automobile according to the present invention can provide the driver, by display, with the information necessary for safe driving both when the driving operation is performed directly by the driver and in automatic driving at any stage from level 1 to level 4 realized by the driving support system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Traffic Control Systems (AREA)
  • Instrument Panels (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Navigation (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to an image display system for a vehicle by means of which provided information can be displayed on a front windshield in a manner that is easily understandable to the driver. A detection unit (2) detects the situation around the vehicle using a stereo camera and sensors. A display control unit (3) then uses a HUD device (4) to project onto a front windshield (15) of the vehicle a right-eye image and a left-eye image that form a stereoscopic image of the information detected by the detection unit (2). If the detected information is an image captured by the stereo camera, the captured right-eye and left-eye images are projected; if the detected information comes from a sensor, a right-eye image and a left-eye image of an image rendered on the basis of that information are created and projected. A vehicle occupant can then confirm the information from the stereoscopic image by means of binocular parallax.
PCT/JP2016/081866 2015-11-09 2016-10-27 Système d'affichage d'image pour véhicule WO2017082067A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015219546A JP6744064B2 (ja) 2015-11-09 2015-11-09 自動車用画像表示システム
JP2015-219546 2015-11-09

Publications (1)

Publication Number Publication Date
WO2017082067A1 true WO2017082067A1 (fr) 2017-05-18

Family

ID=58695212

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/081866 WO2017082067A1 (fr) 2015-11-09 2016-10-27 Système d'affichage d'image pour véhicule

Country Status (2)

Country Link
JP (1) JP6744064B2 (fr)
WO (1) WO2017082067A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018198422A (ja) * 2017-05-23 2018-12-13 トヨタ自動車株式会社 自律型車両向けの交通状況認知
CN111521414A (zh) * 2020-06-19 2020-08-11 上海机动车检测认证技术研究中心有限公司 一种投影测量系统
CN112740007A (zh) * 2018-09-21 2021-04-30 本田技研工业株式会社 车辆检查系统
EP4046846A1 (fr) * 2021-02-18 2022-08-24 Toyota Jidosha Kabushiki Kaisha Afficheur de véhicule

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6958252B2 (ja) 2017-11-07 2021-11-02 トヨタ自動車株式会社 遠隔監視システム及び自律走行車両並びに遠隔監視方法
JP2019177726A (ja) * 2018-03-30 2019-10-17 コニカミノルタ株式会社 仮想リアビューミラー装置
JP7064688B2 (ja) * 2018-04-20 2022-05-11 日本精機株式会社 車両用表示装置、車両用表示システム、及び車両用表示プログラム
JPWO2021039257A1 (fr) * 2019-08-28 2021-03-04
DE102020214843A1 (de) * 2020-11-26 2022-06-02 Volkswagen Aktiengesellschaft Verfahren zur Darstellung eines virtuellen Elements
JP7447039B2 (ja) * 2021-03-10 2024-03-11 矢崎総業株式会社 車両用表示装置
JP2022148856A (ja) * 2021-03-24 2022-10-06 本田技研工業株式会社 車両用表示装置、表示制御方法、及びプログラム

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3194024B2 (ja) * 1993-10-04 2001-07-30 本田技研工業株式会社 車両用情報表示装置
JP2005067515A (ja) * 2003-08-27 2005-03-17 Denso Corp 車両用表示装置
JP2006350934A (ja) * 2005-06-20 2006-12-28 Denso Corp 情報表示装置
JP2009276994A (ja) * 2008-05-14 2009-11-26 Nissan Motor Co Ltd 車両情報取得装置及びその方法、並びに車両走行支援システム
JP2011081141A (ja) * 2009-10-06 2011-04-21 Seiko Epson Corp レンズアレイユニット、電気光学装置及び電子機器
JP2014089510A (ja) * 2012-10-29 2014-05-15 Toyota Motor Corp 先行車両表示装置

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004284417A (ja) * 2003-03-19 2004-10-14 Sumitomo Electric Ind Ltd 画像表示方法及び画像表示システム
JP2007168670A (ja) * 2005-12-22 2007-07-05 Fujitsu Ten Ltd 車載用表示装置
JP2009132259A (ja) * 2007-11-30 2009-06-18 Denso It Laboratory Inc 車両周辺監視装置
JP2010030524A (ja) * 2008-07-30 2010-02-12 Toyota Motor Corp 死角表示装置
JP2010137697A (ja) * 2008-12-11 2010-06-24 Honda Motor Co Ltd 車両の運転支援装置
JP2014010418A (ja) * 2012-07-03 2014-01-20 Yazaki Corp 立体表示装置及び立体表示方法
JP6337418B2 (ja) * 2013-03-26 2018-06-06 セイコーエプソン株式会社 頭部装着型表示装置および頭部装着型表示装置の制御方法

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018198422A (ja) * 2017-05-23 2018-12-13 トヨタ自動車株式会社 自律型車両向けの交通状況認知
CN112740007A (zh) * 2018-09-21 2021-04-30 本田技研工业株式会社 车辆检查系统
CN112740007B (zh) * 2018-09-21 2023-06-30 本田技研工业株式会社 车辆检查系统
CN111521414A (zh) * 2020-06-19 2020-08-11 上海机动车检测认证技术研究中心有限公司 一种投影测量系统
EP4046846A1 (fr) * 2021-02-18 2022-08-24 Toyota Jidosha Kabushiki Kaisha Afficheur de véhicule
US11705007B2 (en) 2021-02-18 2023-07-18 Toyota Jidosha Kabushiki Kaisha Vehicle display device

Also Published As

Publication number Publication date
JP2017092678A (ja) 2017-05-25
JP6744064B2 (ja) 2020-08-19

Similar Documents

Publication Publication Date Title
WO2017082067A1 (fr) Système d'affichage d'image pour véhicule
JP6466899B2 (ja) 車両用表示装置
US10436600B2 (en) Vehicle image display system and method
JP5160564B2 (ja) 車両情報表示装置
JP6699646B2 (ja) 車両用表示制御装置
JP6252365B2 (ja) 安全確認支援システム、安全確認支援方法
JP6273976B2 (ja) 車両用表示制御装置
WO2016186039A1 (fr) Système d'affichage d'informations périphériques d'automobile
CN110730740B (zh) 车辆控制系统、车辆控制方法及存储介质
JP2018203007A (ja) 車両制御システム、車両制御方法、および車両制御プログラム
WO2015163205A1 (fr) Système d'affichage pour véhicule
JP6827378B2 (ja) 車両制御システム、車両制御方法、およびプログラム
JP2006284458A (ja) 運転支援情報表示システム
CN110603166A (zh) 车辆控制系统、车辆控制方法及车辆控制程序
CN110786004A (zh) 显示控制装置、显示控制方法及程序
EP3775780A1 (fr) Appareil de commande, appareil d'affichage, corps mobile et procédé d'affichage d'image
JP2016107947A (ja) 情報提供装置、情報提供方法及び情報提供用制御プログラム
JP6840035B2 (ja) 車両制御システム
WO2019189515A1 (fr) Appareil de commande, appareil d'affichage, corps mobile et procédé d'affichage d'image
US20210197863A1 (en) Vehicle control device, method, and program
JP6589991B2 (ja) ヒューマンインターフェース
JP2021020519A (ja) 車両用表示制御装置および車両用表示制御方法
US11338824B2 (en) Surrounding situation display method and surrounding situation display device
JP6365409B2 (ja) 車両の運転者のための画像表示装置
WO2017138343A1 (fr) Dispositif d'apprentissage de la conduite

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16864031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16864031

Country of ref document: EP

Kind code of ref document: A1