WO2018070193A1 - Head-up display device - Google Patents

Head-up display device

Info

Publication number
WO2018070193A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
vehicle
information
display device
display area
Application number
PCT/JP2017/033683
Other languages
English (en)
Japanese (ja)
Inventor
望 下田
昭央 三沢
壮太 佐藤
健 荒川
Original Assignee
Maxell, Ltd. (マクセル株式会社)
Application filed by Maxell, Ltd. (マクセル株式会社)
Publication of WO2018070193A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement or adaptations of instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions
    • G08G1/0962 Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968 Systems involving transmission of navigation instructions to the vehicle

Definitions

  • the present invention relates to a technology for a head-up display device, and more particularly to a technology effective when applied to a head-up display device using AR (Augmented Reality).
  • A head-up display (HUD) device projects driving information, such as vehicle speed and engine speed, or information such as car navigation, onto the windshield.
  • This allows the driver to check information without moving the line of sight to the instrument panel incorporated in the dashboard, reducing the amount of eye movement.
  • Some HUD devices display information for supporting safe driving, such as detection of pedestrians and obstacles, in addition to the traveling information and car navigation information described above. To display, for example, a road sign on a side road or the presence of a pedestrian, the display area of the HUD device needs to be enlarged.
  • Related patent documents: JP 2016-91084 A and JP 2016-60303 A (Japanese Unexamined Patent Publications).
  • An object of the present invention is to provide a technology that allows a driver to easily recognize important information while increasing the display area of a head-up display device.
  • a typical head-up display device projects a video on a windshield of the vehicle, and displays a virtual image superimposed on the scenery in front of the vehicle to the driver.
  • This head-up display device has a vehicle information acquisition unit, a control unit, a video display device, a mirror, a mirror driving unit, and a display distance adjustment mechanism.
  • the vehicle information acquisition unit acquires various types of vehicle information that can be detected by the vehicle.
  • the control unit controls the display of the video displayed in the display area viewed through the windshield from the driver's seat of the vehicle based on the vehicle information acquired by the vehicle information acquisition unit.
  • the video display device generates a video based on an instruction from the control unit.
  • The mirror reflects the video generated by the video display device and projects it onto the windshield.
  • the mirror driving unit changes the angle of the mirror based on an instruction from the control unit.
  • the display distance adjustment mechanism adjusts the display distance of the virtual image for the driver.
  • the display area controlled by the control unit includes a first display area and a second display area that is an area above the first display area.
  • the first display area is an area for displaying augmented reality.
  • the second display area is an area that does not display augmented reality.
  • the first display area displays the first information
  • the second display area displays the second information having a lower priority than the first information.
  • the first information is safe driving support information that is information that supports safe driving
  • the second information is driving support information that is information that supports driving behavior.
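The priority rule above, routing safe-driving support information to the first (AR) display area and lower-priority driving support information to the second (upper, non-AR) area, can be sketched as follows. This is an illustrative sketch only; the item names and the two-level priority encoding are assumptions, not the patent's implementation.

```python
# Hypothetical sketch: routing content items to the two display areas by
# priority. Priority 0 (safe-driving support) goes to the first/AR area;
# anything lower-priority goes to the second (upper) area.
from dataclasses import dataclass

@dataclass
class ContentItem:
    name: str
    priority: int  # 0 = safe-driving support (highest), >0 = driving support

def route_to_areas(items):
    """Split items: priority 0 -> first (AR) area, the rest -> second area."""
    first_area = [i for i in items if i.priority == 0]
    second_area = [i for i in items if i.priority > 0]
    return first_area, second_area

items = [
    ContentItem("pedestrian_alert", 0),  # safe-driving support -> AR area
    ContentItem("lane_departure", 0),
    ContentItem("vehicle_speed", 1),     # driving support -> upper area
    ContentItem("navigation_hint", 1),
]
first, second = route_to_areas(items)
```

Keeping the split a pure function of priority makes the area assignment easy to test independently of any rendering code.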
  • FIG. 1 is an explanatory diagram showing an outline of an example of an operation concept of the AR-HUD according to Embodiment 1.
  • FIG. 2 is a functional block diagram showing an overview of an overall configuration example of the AR-HUD according to Embodiment 1.
  • FIG. 3 is an explanatory diagram showing an outline of an example of a hardware configuration related to acquisition of vehicle information in the AR-HUD of FIG. 2.
  • FIG. 4 is a functional block diagram showing details of a configuration example of the AR-HUD of FIG. 2.
  • FIG. 5 is an explanatory diagram showing details of a configuration example of the control unit and the display distance adjustment mechanism of FIG. 4.
  • FIG. 6 is a flowchart showing an outline of an example of the initial operation of the AR-HUD of FIG. 2.
  • FIG. 7 is a flowchart showing an outline of an example of the normal operation of the AR-HUD of FIG. 2.
  • FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process of step S22 of FIG. 7.
  • FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display contents and display method of a virtual image in the AR-HUD of FIG. 2.
  • FIG. 10 is a flowchart showing an overview of an example of the flow of the display position adjustment process of step S233 in the display video determination/change process of FIG. 9.
  • FIG. 11 is an explanatory diagram illustrating an example of a display area of the AR-HUD of FIG. 2.
  • FIG. 12 is an explanatory diagram illustrating an example of display on the display screen using the enlarged display area of FIG. 11.
  • FIG. 13 is an explanatory diagram showing an example of the region when the display region of the AR-HUD according to the second embodiment is enlarged in the horizontal direction and the vertical direction.
  • An explanatory diagram showing an example of the display in the display area enlarged in the horizontal direction and the vertical direction.
  • An explanatory diagram showing an example of the display in the display area, continued from the preceding figure.
  • An explanatory diagram showing an example of the display in the second display area.
  • An explanatory diagram showing an example of the display, continued from the preceding figure.
  • FIG. 20 is an explanatory diagram illustrating another display example of FIG. 19.
  • An explanatory diagram illustrating an example of a cooperative display operation in the display area by the AR-HUD of FIG. 17.
  • An explanatory diagram illustrating an example of a display that reduces the increase in viewpoint movement in the upper part of the display area by the AR-HUD of FIG. 17.
  • An explanatory diagram illustrating an example of a display that reduces oversight of road signs in the display area by the AR-HUD of FIG. 17.
  • An explanatory diagram illustrating an example of display according to the road-condition risk level in the display area by the AR-HUD of FIG. 17.
  • An explanatory diagram showing an example of navigation guidance display at an intersection in the display area by the AR-HUD of FIG. 17.
  • An explanatory diagram illustrating an example of a menu for customizing the display in the display area by the AR-HUD of FIG. 17.
  • FIG. 1 is an explanatory diagram showing an outline of an example of an operation concept in a HUD device (hereinafter sometimes referred to as “AR-HUD”) that realizes an AR function according to the first embodiment.
  • The AR-HUD 1, which is a head-up display device, reflects an image displayed on a video display device 30, such as a projector or an LCD (Liquid Crystal Display), by a mirror 51 and a mirror 52, and projects the light onto the windshield 3 of the vehicle 2.
  • the mirror 51 and the mirror 52 are, for example, a free-form surface mirror or a mirror having an optical axis asymmetric shape.
  • By viewing the image projected on the windshield 3, the driver 5 sees it as a virtual image located beyond the transparent windshield 3.
  • By adjusting the position of the image projected on the windshield 3, the display position of the virtual image viewed by the driver 5 can be adjusted in the vertical direction.
  • It is also possible to adjust the display distance, such as displaying the virtual image nearby (for example, 2 to 3 m away) or far away (for example, 30 to 40 m away).
  • the AR function is realized by adjusting the display position and display distance so that the virtual image is superimposed on the scenery outside the vehicle (roads, buildings, people, etc.).
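The superimposition described above can be illustrated with a simple projection: an object's 3D position in front of the vehicle is mapped onto the virtual-image plane at the chosen display distance. This is a minimal sketch under a pinhole-camera assumption with coordinates in the driver's eye frame; the real optical design and calibration are not described at this level in the text.

```python
# Illustrative sketch only: place a marker over an object ahead by projecting
# its position (x: right, y: up, z: forward, metres, eye frame) onto the
# virtual-image plane at z = display_distance.
def project_to_virtual_image(obj_x, obj_y, obj_z, display_distance):
    """Project the object onto the plane z = display_distance by similar triangles."""
    if obj_z <= 0:
        raise ValueError("object must be in front of the vehicle")
    scale = display_distance / obj_z
    return obj_x * scale, obj_y * scale  # marker position on the virtual-image plane

# A pedestrian 30 m ahead and 2 m to the right, marked on a virtual image
# displayed 3 m away:
u, v = project_to_virtual_image(2.0, 0.0, 30.0, 3.0)
```

Because the scale factor shrinks with object distance, a distant object maps close to the centre of the virtual image, which is consistent with displaying far-away AR content at the larger display distances mentioned above.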
  • In addition, the AR-HUD 1 of the present embodiment has an enlarged display area for the image projected on the windshield 3, that is, the display area 6 shown in FIGS. 11 and 12 described later, and can display a larger image on the windshield 3. This can be realized, for example, by increasing the area of the mirror 52.
  • the enlargement of the display area 6 is not limited to this, and may be realized by other techniques.
  • FIG. 2 is a functional block diagram showing an overview of an overall configuration example of the AR-HUD according to the first embodiment.
  • the AR-HUD 1 mounted on the vehicle 2 includes a vehicle information acquisition unit 10, a control unit 20, a video display device 30, a display distance adjustment mechanism 40, a mirror driving unit 50, a mirror 52, and a speaker 60.
  • In FIG. 2, the vehicle 2 is depicted as a passenger car; however, the invention is not limited to this and can be applied to vehicles in general as appropriate.
  • The vehicle information acquisition unit 10 includes information acquisition devices, such as the various sensors described later, installed in each part of the vehicle 2. It detects various events that occur in the vehicle 2 and, by detecting and acquiring the values of various parameters relating to the traveling state at predetermined intervals, acquires and outputs the vehicle information 4.
  • As shown in the figure, the vehicle information 4 may include, for example, speed information and gear information of the vehicle 2, steering wheel steering angle information, lamp lighting information, outside light information, distance information, infrared information, engine ON/OFF information, camera image information from inside and outside the vehicle, acceleration gyro information, GPS (Global Positioning System) information, navigation information, vehicle-to-vehicle communication information, and road-to-vehicle communication information.
  • The control unit 20 has a function of controlling the operation of the AR-HUD 1 and is implemented by, for example, a CPU (Central Processing Unit) and software executed by it. It may also be implemented by hardware such as a microcomputer or an FPGA (Field Programmable Gate Array).
  • The control unit 20 drives the video display device 30 based on the vehicle information 4 acquired from the vehicle information acquisition unit 10 and the like to generate a video to be displayed as a virtual image, and projects it onto the windshield 3 by reflecting it appropriately with the mirror 52. It also performs control such as adjusting the position of the virtual image display area 6 and adjusting the display distance of the virtual image.
  • The video display device 30 is a device configured by, for example, a projector or an LCD; it generates the video for displaying the virtual image based on instructions from the control unit 20 and projects or displays it.
  • the display distance adjustment mechanism 40 is a mechanism for adjusting the distance of the displayed virtual image from the driver 5 based on an instruction from the control unit 20.
  • the mirror driving unit 50 adjusts the angle of the mirror 52 based on an instruction from the control unit 20 and adjusts the position of the virtual image display area 6 in the vertical direction.
  • The speaker 60 performs audio output related to the AR-HUD 1. For example, it can provide voice guidance for the navigation system or audio output when notifying the driver 5 by the AR function.
  • FIG. 3 is an explanatory diagram showing an outline of an example of a hardware configuration relating to acquisition of the vehicle information 4 in the AR-HUD of FIG.
  • the acquisition of the vehicle information 4 is performed by an information acquisition device such as various sensors connected to the ECU 21 under the control of an ECU (Electronic Control Unit) 21, for example.
  • These information acquisition devices include, for example, a vehicle speed sensor 101, a shift position sensor 102, a steering wheel angle sensor 103, a headlight sensor 104, an illuminance sensor 105, a chromaticity sensor 106, a distance measuring sensor 107, an infrared sensor 108, an engine start sensor 109, an acceleration sensor 110, a gyro sensor 111, a temperature sensor 112, a road-to-vehicle communication wireless receiver 113, a vehicle-to-vehicle communication wireless receiver 114, a camera (inside the vehicle) 115, a camera (outside the vehicle) 116, a GPS receiver 117, and a VICS (Vehicle Information and Communication System, a road traffic information communication system; registered trademark, hereinafter the same) receiver 118.
  • Not all of these devices are necessarily required; vehicle information 4 that can be acquired by whichever devices are provided may be used as appropriate.
  • the vehicle speed sensor 101 acquires speed information of the vehicle 2 in FIG.
  • the shift position sensor 102 acquires current gear information of the vehicle 2.
  • the steering wheel angle sensor 103 acquires steering wheel angle information.
  • the headlight sensor 104 acquires lamp lighting information related to ON / OFF of the headlight.
  • the illuminance sensor 105 and the chromaticity sensor 106 acquire external light information.
  • the distance measuring sensor 107 acquires distance information between the vehicle 2 and an external object.
  • the infrared sensor 108 acquires infrared information related to the presence / absence and distance of an object at a short distance of the vehicle 2.
  • the engine start sensor 109 detects engine ON / OFF information.
  • the acceleration sensor 110 and the gyro sensor 111 acquire acceleration gyro information including acceleration and angular velocity as information on the posture and behavior of the vehicle 2.
  • the temperature sensor 112 acquires temperature information inside and outside the vehicle.
  • The road-to-vehicle communication wireless receiver 113 acquires road-to-vehicle communication information received via communication between the vehicle 2 and roads, signs, traffic signals, and the like, and the vehicle-to-vehicle communication wireless receiver 114 acquires vehicle-to-vehicle communication information received via communication with other vehicles in the vicinity.
  • the camera (inside the vehicle) 115 and the camera (outside the vehicle) 116 capture the moving image of the situation inside and outside the vehicle, and acquire the camera image information inside the vehicle and the camera image information outside the vehicle, respectively.
  • The camera (inside the vehicle) 115 photographs, for example, the posture of the driver 5. By analyzing the obtained moving image, it is possible to grasp, for example, the fatigue state of the driver 5 and the position of the line of sight.
  • the camera (outside the vehicle) 116 photographs the surrounding situation such as the front and rear of the vehicle 2.
  • This makes it possible to grasp, for example, the presence or absence of moving objects such as other vehicles and people in the vicinity, buildings and terrain, road surface conditions such as rain, snow, freezing, and unevenness, and road signs.
  • The GPS receiver 117 and the VICS receiver 118 acquire GPS information obtained by receiving GPS signals and VICS information obtained by receiving VICS signals, respectively. They may be implemented as part of a car navigation system that acquires and uses these pieces of information.
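The sensor readings above are gathered into the vehicle information 4 at predetermined intervals. A minimal sketch of such an aggregation record is shown below; the field names and the callback-per-sensor polling interface are assumptions for illustration, not the patent's API.

```python
# Sketch: aggregating readings from a subset of the sensors described above
# into one vehicle-information record, polled at a fixed interval.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleInfo:
    speed_kmh: Optional[float] = None           # vehicle speed sensor 101
    gear: Optional[str] = None                  # shift position sensor 102
    steering_angle_deg: Optional[float] = None  # steering wheel angle sensor 103
    headlights_on: Optional[bool] = None        # headlight sensor 104
    ambient_lux: Optional[float] = None         # illuminance sensor 105
    engine_on: Optional[bool] = None            # engine start sensor 109

def acquire_vehicle_info(sensors) -> VehicleInfo:
    """Poll each sensor callback that is present; absent sensors stay None."""
    info = VehicleInfo()
    for name, read in sensors.items():
        setattr(info, name, read())
    return info

# Only some sensors fitted, matching the note that not all devices are required:
sensors = {"speed_kmh": lambda: 42.0, "ambient_lux": lambda: 15000.0}
info = acquire_vehicle_info(sensors)
```

Leaving missing sensors as `None` mirrors the text's point that whichever devices are provided can be used as appropriate.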
  • FIG. 4 is a functional block diagram showing details of the configuration example of the AR-HUD in FIG.
  • In FIG. 4, the video display device 30 is a projector and includes, for example, units such as a light source 31, an illumination optical system 32, and a display element 33.
  • the light source 31 is a member that generates projection illumination light.
  • a high-pressure mercury lamp, a xenon lamp, an LED (Light Emitting Diode) light source, a laser light source, or the like can be used.
  • the illumination optical system 32 is an optical system that collects the illumination light generated by the light source 31 and irradiates the display element 33 with more uniform illumination light.
  • the display element 33 is an element that generates an image to be projected.
  • a transmissive liquid crystal panel, a reflective liquid crystal panel, a DMD (Digital Micromirror Device) (registered trademark) panel, or the like can be used.
  • The control unit 20 includes units such as the ECU 21, an audio output unit 22, a nonvolatile memory 23, a memory 24, a light source adjustment unit 25, a distortion correction unit 26, a display element drive unit 27, a display distance adjustment unit 28, and a mirror adjustment unit 29.
  • The ECU 21 acquires the vehicle information 4 via the vehicle information acquisition unit 10 and records, stores, and reads the acquired information in the nonvolatile memory 23 and the memory 24 as necessary.
  • the nonvolatile memory 23 may store setting information such as setting values and parameters for various controls. Further, the ECU 21 generates video data relating to a virtual image to be displayed as the AR-HUD 1 by executing a dedicated program.
  • the audio output unit 22 outputs audio information via the speaker 60 as necessary.
  • the light source adjustment unit 25 adjusts the light emission amount of the light source 31 of the video display device 30. When there are a plurality of light sources 31, they may be controlled individually.
  • the distortion correction unit 26 corrects image distortion caused by the curvature of the windshield 3 by image processing when the image generated by the ECU 21 is projected onto the windshield 3 of the vehicle 2 by the image display device 30.
  • the display element drive unit 27 sends a drive signal corresponding to the video data corrected by the distortion correction unit 26 to the display element 33 to generate an image to be projected.
  • the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to adjust the display distance of the image projected from the image display device 30.
  • the mirror adjustment section 29 changes the angle of the mirror 52 via the mirror driving section 50 and moves the virtual image display area 6 up and down.
  • FIG. 5 is an explanatory diagram showing details of an example of the configuration of the control unit and the display distance adjusting mechanism of FIG.
  • The display distance adjustment unit 28 of the control unit 20 further includes, as units individually controlled by the ECU 21, for example, a functional liquid crystal film ON/OFF control unit 281, a lens movable unit 282, a dimming mirror ON/OFF control unit 283, a diffusion plate movable unit 284, and an optical filter movable unit 285.
  • Correspondingly, the display distance adjustment mechanism 40 includes a functional liquid crystal film 401, a lens movable mechanism 402, a light control mirror 403, a diffusion plate movable mechanism 404, and an optical filter movable mechanism 405.
  • the AR-HUD 1 does not have to include all of these units and devices, and may include various units necessary for mounting a device to which a virtual image display distance adjustment technique is applied.
  • FIG. 6 is a flowchart showing an outline of an example of the initial operation in the AR-HUD of FIG.
  • When the power of the AR-HUD 1 is turned on by turning on the ignition switch of the stopped vehicle 2 (S01), the AR-HUD 1 first acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S02).
  • The control unit 20 calculates a suitable brightness level based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like among the vehicle information 4 (S03), and the light source adjustment unit 25 controls the light source 31 so that the calculated brightness level is obtained (S04). For example, when the outside light is bright, the brightness level is set high, and when the outside light is dark, the brightness level is set low.
  • Next, the ECU 21 determines and generates the video to be displayed as a virtual image, for example an initial image (S05), and the distortion correction unit 26 performs distortion correction processing on the generated video (S06).
  • Then, the display element drive unit 27 drives and controls the display element 33 to form the image to be projected (S07).
  • the image is projected onto the windshield 3 and the driver 5 can visually recognize the virtual image.
  • The ECU 21 or the display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S08), and the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S09).
  • When the HUD display is to be started, a HUD-ON signal is output, and the control unit 20 determines whether or not this signal has been received (S10). If it has not been received, the control unit waits a predetermined time for the HUD-ON signal (S11), and this waiting process (S11) is repeated until it is determined in step S10 that the HUD-ON signal has been received.
  • If it is determined in step S10 that the HUD-ON signal has been received, the normal operation of the AR-HUD 1 described later is started (S12), and the series of initial operations ends.
  • FIG. 7 is a flowchart showing an outline of an example of normal operation in the AR-HUD of FIG.
  • During normal operation, the AR-HUD 1 acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S21). The control unit 20 then performs a brightness level adjustment process based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like among the vehicle information 4 (S22).
  • FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process which is the process of step S22 of FIG.
  • In the brightness level adjustment process, a suitable brightness level is first calculated based on the acquired outside light information (S221). Then, by comparing it with the currently set brightness level, it is determined whether or not the brightness level needs to be changed (S222). If no change is necessary, the brightness level adjustment process ends.
  • If a change is necessary, the light source adjustment unit 25 controls the light emission amount of the light source 31 to set the new brightness level (S223), and the brightness level adjustment process ends.
  • In step S222, it may be determined that the brightness level needs to be changed only when the difference between the suitable brightness level calculated in step S221 and the currently set brightness level is equal to or greater than a predetermined threshold value.
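The brightness level adjustment of steps S221 to S223, including the optional change threshold just described, can be sketched as follows. The illuminance-to-level mapping and the numeric values are placeholders, not taken from the patent.

```python
# Sketch of steps S221-S223: calculate a suitable level from outside light,
# then change the current level only if the difference reaches `threshold`.
def calc_brightness_level(ambient_lux: float) -> int:
    """Placeholder mapping: brighter outside light -> higher display brightness."""
    if ambient_lux > 10000:
        return 10
    if ambient_lux > 1000:
        return 6
    return 2

def adjust_brightness(current_level: int, ambient_lux: float, threshold: int = 2) -> int:
    new_level = calc_brightness_level(ambient_lux)   # S221: suitable level
    if abs(new_level - current_level) < threshold:   # S222: change needed?
        return current_level                         # no change; process ends
    return new_level                                 # S223: set the light source

level = adjust_brightness(current_level=6, ambient_lux=20000.0)  # bright day
```

The threshold acts as hysteresis, preventing the light source from being re-adjusted on every cycle when the measured illuminance fluctuates slightly.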
  • Next, the ECU 21 changes the video to be displayed as a virtual image from the current one as necessary, based on the latest vehicle information 4 acquired in the process of step S21, and determines and generates the changed video (S23).
  • adjustment / correction processing is performed to maintain the visibility and appropriateness of display contents according to the traveling state of the vehicle 2.
  • Then, the angle of the mirror 52 is changed via the mirror driving unit 50, and a mirror adjustment process is performed to move the virtual image display area 6 up and down (S24). Thereafter, a vibration correction process is performed to correct the display position of the image in the display area 6 against the vibration of the vehicle 2 (S25).
  • Subsequently, the distortion correction unit 26 performs distortion correction processing on the adjusted and corrected video (S26), and then the display element drive unit 27 drives and controls the display element 33 to form the image to be projected (S27).
  • The display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S28) and drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S29).
  • When the HUD display is to be stopped, a HUD-OFF signal is output to the AR-HUD 1, and it is determined whether or not this signal has been received (S30).
  • If the HUD-OFF signal has not been received, the process returns to step S21, and the series of normal operations is repeated until the HUD-OFF signal is received. If it is determined that the HUD-OFF signal has been received, the series of normal operations ends.
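The normal-operation cycle of steps S21 to S30 can be sketched as a loop that acquires vehicle information, runs the per-cycle adjustment steps, and repeats until a HUD-OFF signal arrives. This is a structural sketch only; the hardware-facing steps are stubbed out and the function names are illustrative.

```python
# Minimal sketch of the normal-operation loop (S21-S30): acquire vehicle
# information, run the adjustment steps, repeat until HUD-OFF is received.
def normal_operation(acquire_info, hud_off_received, steps):
    """Run the per-cycle steps until hud_off_received() is True; return cycle count."""
    cycles = 0
    while not hud_off_received():       # S30: stop when HUD-OFF is received
        vehicle_info = acquire_info()   # S21: latest vehicle information 4
        for step in steps:              # S22-S29: brightness, video change,
            step(vehicle_info)          #   mirror, vibration, distortion,
        cycles += 1                     #   display distance, ...
    return cycles

# Dummy run: the stand-in HUD-OFF check fires after three cycles.
state = {"n": 0}
def off():
    state["n"] += 1
    return state["n"] > 3

count = normal_operation(lambda: {}, off, steps=[lambda info: None])
```

Modelling S22 to S29 as an ordered list of steps keeps the loop body identical to the flowchart: each box becomes one callable executed per cycle.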
  • AR-HUD 1 adjusts the display contents and display method of the virtual image itself to be displayed according to the situation such as the scenery in front of the vehicle 2 in addition to the adjustment of the virtual image display area 6 and the adjustment of the display distance. Thereby, it is possible to superimpose a more appropriate virtual image on the front landscape in a more appropriate position and manner.
  • FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display content and display method of the virtual image in the AR-HUD of FIG.
  • The generation and display of the display contents (contents) of the virtual image in the AR-HUD 1 are performed in the process of step S05 of the initial operation of FIG. 6 and the process of step S23 of the normal operation of FIG. 7.
  • Here, the processing is described taking as an example the display video change/determination processing of step S23 of the normal operation of FIG. 7.
  • the ECU 21 performs standard content generation processing (S231) and event content generation processing (S232).
  • the standard content basically refers to content such as a vehicle speed display that is always displayed in the display area 6 while the vehicle 2 is traveling.
  • the event content refers to content such as an alert display that is displayed as necessary based on the driving situation including the scenery situation in front of the vehicle 2. In any case, a plurality of contents may be generated.
  • Next, for the generated content, the ECU 21 performs a display position adjustment process (S233) and a display color adjustment process (S234) in relation to the front scenery grasped from the camera image information.
  • Then, display video data for each adjusted content is generated (S235), and the process ends. The video data generated here is projected via the display element drive unit 27 in the subsequent process of step S27 of FIG. 7.
  • FIG. 10 is a flowchart showing an outline of an example of the flow of the display position adjustment process which is the process of step S233 in the display video determination / change process of FIG.
  • First, the camera image information from outside the vehicle in the vehicle information 4 obtained in the process of step S21 of FIG. 7 is analyzed to determine whether the scenery in front contains any object that should not be hidden (S3100). Applicable objects may include, for example, traffic signals, pedestrians, two-wheeled vehicles, and preceding vehicles, in addition to curve mirrors and road signs.
  • Next, the current display position of each standard content and each event content generated in the processes of steps S231 and S232 of FIG. 9 is collated with the coordinates of each object (S3200). It is then determined whether the display position of each content needs to be adjusted, that is, whether the display of each content obstructs an object (S3300). For example, the display position of an arrow graphic or alert display is compared with the position of a curve mirror or road sign by image processing or the like to determine whether the curve mirror or road sign is hidden.
  • If it is determined in step S3300 that adjustment of the display position is unnecessary, the process ends. If it is determined that the display position needs to be adjusted, the display position of the target content is adjusted and moved, and a new display position is set (S3400).
  • The method for determining the position to which the content is moved is not particularly limited. Instead of, or in addition to, adjusting the position of the content, the display size of the content may be reduced so that the object is not hidden (S3500).
  • After making these adjustments, the process returns to step S3200, and the series of processes described above is repeated until it is no longer necessary to adjust the display position of any content. Since another object may exist at the destination to which the content is moved, adjustment must continue until the display position is appropriate.
  • In this way, virtual image content can be displayed while avoiding these objects, and the visibility of the objects can be improved.
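The iterative loop of FIG. 10 (S3100 to S3500) can be sketched with axis-aligned rectangles: content that overlaps a protected object is moved and re-checked, since the new position may collide with another object. The coordinates and the simple "move downward" policy are illustrative assumptions; the patent deliberately leaves the move policy open.

```python
# Sketch of the display position adjustment of FIG. 10. Rectangles are
# (x, y, w, h); objects are the things that must not be hidden (signs,
# signals, pedestrians, ...).
def overlaps(a, b):
    """True if rectangles a and b intersect."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def adjust_position(content, objects, step=10, max_iter=100):
    """Move `content` until it hides no object (the S3200-S3400 loop)."""
    x, y, w, h = content
    for _ in range(max_iter):
        if not any(overlaps((x, y, w, h), o) for o in objects):  # S3300
            return (x, y, w, h)   # no object obstructed; adjustment done
        y += step                 # S3400: move content (here: downward)
    return (x, y, w, h)           # give up after max_iter moves

# A sign at (0, 0)-(40, 20) and a signal at (0, 25)-(40, 45): content starting
# at (10, 5) must be moved past both before the loop terminates.
objects = [(0, 0, 40, 20), (0, 25, 40, 20)]
pos = adjust_position((10, 5, 30, 10), objects)
```

The example shows why the text insists on re-checking after each move: the first move clears the sign but lands on the signal, so a second round of adjustment is needed.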
  • the display example in the AR-HUD 1 described below is realized by the AR-HUD 1 executing the processes shown in FIGS. 7 to 10 described above.
  • FIG. 11 is an explanatory diagram showing an example of the display area of the AR-HUD 1.
  • FIG. 11 schematically shows an example of the display area 6 that the driver 5 of the vehicle 2 visually recognizes from the driver's seat through the windshield 3.
  • FIG. 11(a) shows an example in which the display area 6 is not enlarged in the horizontal direction, and FIG. 11(b) shows an example in which the display area 6 is enlarged in the direction horizontal to the road surface (hereinafter, the horizontal direction).
  • By enlarging the display area 6 in the horizontal direction with respect to the road surface, a virtual image can be superimposed on a wider area outside the vehicle.
  • a virtual image can be superimposed and displayed on a vehicle traveling on the opposite lane or a pedestrian on the sidewalk.
  • FIG. 12 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area of FIG. 11 is enlarged.
  • FIG. 12 schematically illustrates an example of the front landscape viewed by the driver 5 of the vehicle 2 from the driver's seat through the windshield 3 and the state of the virtual image in the display area 6 projected on the windshield 3.
  • the dotted line in the display area 6 indicates the display area when the display area 6 is not enlarged in the horizontal direction.
  • FIG. 12 shows a state in which an arrow 300, a vehicle display 301, a pedestrian display 302, and the like are superimposed on the scenery outside the vehicle in the display area 6.
  • The instrument icon for displaying the vehicle speed and the like ("30 km/h" in the figure) may be referred to as the "vehicle speed display" in the following.
  • the arrow 300 is an arrow for instructing and navigating the traveling direction of the vehicle 2 and is displayed superimposed on the road on which the vehicle 2 is traveling.
  • the vehicle display 301 is a display indicating that there is a running vehicle. In the example of FIG. 12, the vehicle display 301 is circular and is displayed so as to surround the vehicle 200 traveling in the oncoming lane.
  • the shape of the vehicle display 301 is not particularly limited, and may be any shape other than a circle, such as a triangle or a quadrangle.
  • the pedestrian display 302 indicates that there is a pedestrian 201 walking on a sidewalk and the traveling direction of the pedestrian 201.
  • a circular pedestrian display 302 is displayed superimposed on the feet of the pedestrian 201, and the traveling direction of the pedestrian 201 is indicated by an arrow.
  • the shape of the pedestrian display 302 is not particularly limited as in the vehicle display 301.
  • the arrow 300 is generated based on the navigation information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • the vehicle display 301 and the pedestrian display 302 are generated based on camera video information and infrared information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • the vehicle speed display is generated based on the speed information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • Since the display area 6 is expanded in the horizontal direction, even when the vehicle 2 is traveling straight, the driver 5 can perceive, for example, the pedestrian display 302 indicating the pedestrian 201 and the vehicle display 301 in the oncoming lane, which contributes to safe driving.
  • Without the enlargement, the pedestrian display 302 would be displayed only when the vehicle 2 turns to the left and the pedestrian 201 enters the display area indicated by the dotted line, so the timing at which the driver 5 detects the pedestrian 201 would end up delayed.
  • FIG. 13 is an explanatory diagram showing an example when there are a plurality of roads that can turn left when the vehicle turns left by the navigation guidance.
  • FIG. 14 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area in the road configuration of FIG. 13 is enlarged.
  • FIG. 13 shows a case where there are two roads onto which a left turn is possible within a short distance, and the vehicle 2 makes a left turn not onto the near road but onto the far road.
  • In the display area before enlargement, the road onto which to turn left lies outside the display area and is not included in it.
  • In the display area 6 expanded in the horizontal direction, indicated by a solid line, not only the road onto which the vehicle 2 turns left but also the nearer road onto which a left turn is possible is included in the display area.
  • Thereby, the arrow 300, a guidance display that instructs and navigates the traveling direction, can be displayed superimposed on the far road onto which to turn left.
  • In addition, the display area 6 in FIG. 14 displays as virtual images a vehicle speed display (the characters "30 km/h" in the figure), the distance to the road onto which to turn left (the characters "20 m" in the figure), and a vehicle display 301.
  • Here, a left turn has been described as an example, but for a right turn the arrow 300 or the like can similarly be displayed superimposed on the road onto which to turn right.
  • When the road on the turning side is not included in the display area 6, the display area 6 may be further expanded in the horizontal direction so that the road is included.
  • When the display area is not enlarged, only the dotted-line range shown in FIG. 14 is the display area, and the arrow is therefore displayed within that dotted-line area.
  • In that case, the road onto which to turn left is not included in the display area, so it is not clear which road is the one to turn onto, and the driver 5 may be confused.
  • With the enlarged display area, the driver 5 can accurately grasp that the left turn is onto the far road, not the near one. This is particularly effective when there are a plurality of roads onto which a left turn is possible within a short distance, or when the roads have complicated shapes.
  • As a result, the driver 5 can accurately grasp the road on which the vehicle 2 should travel and can concentrate on driving without being confused, thereby contributing to safe driving.
  • FIG. 15 is an explanatory view showing another display example of the guidance screen based on navigation information using the enlarged display area 6.
  • FIG. 15 also schematically shows an example of the front landscape viewed from the driver's seat by the driver 5 of the vehicle 2 and the state of the virtual image in the display area 6 projected on the windshield.
  • FIG. 15 shows a display example when the vehicle 2 approaches before the intersection where entry is prohibited except in the designated direction.
  • a sign 202 is provided on the front left side of the vehicle 2.
  • This sign 202 is a road sign indicating entry prohibition other than the designated direction. Therefore, at the intersection, the vehicle 2 can only travel straight ahead and cannot enter either the left-turn road or the right-turn road.
  • In the display area 6, an entry prohibition display 303, indicating that the road cannot be entered, is displayed superimposed on the left-turn road at the intersection.
  • The entry prohibition display 303, which is a regulation instruction display, is generated by recognizing the meaning of the sign 202 and the like from the camera video information outside the vehicle, acquired by the ECU 21 from the vehicle information acquisition unit 10, based on the recognition result.
  • Because the display area 6 is enlarged in the horizontal direction, the road on the left-turn side of the intersection is included in the display area 6, and the entry prohibition display 303 can be displayed there.
  • Here, the sign 202 is a road sign indicating entry prohibition except in the designated direction, but for other road signs, such as regulation signs and instruction signs, a regulation instruction display corresponding to the recognized road sign is likewise performed in the display area 6.
  • The display such as the entry prohibition display 303 shown in FIG. 15 is particularly effective when the driver 5 has missed a road sign.
  • Although an icon simulating the road sign (for example, entry prohibition except in the designated direction) could be displayed, it is more appropriate for the driver 5 to display the entry prohibition display 303 as shown in FIG. 15, because the driver can then recognize quickly that the road should not be entered. That is, the entry prohibition display 303 is displayed superimposed on the actual road and looks as if a pseudo wall were there, so the driver can be intuitively notified that the road should not be entered.
  • FIG. 16 is an explanatory view showing another display example using the enlarged display area 6.
  • FIG. 16 shows a state in which the vehicle 200 is traveling in the opposite lane of the lane in which the vehicle 2 is traveling, and the vehicle 203 is traveling in front of the vehicle 2 in the lane in which the vehicle 2 is traveling.
  • a vehicle display 301 and a vehicle display 305 are displayed.
  • In addition, the traveling speed of the oncoming vehicle (the characters "30 km/h" in the figure) and the inter-vehicle distance to the vehicle 203 traveling in front of the vehicle 2 (the characters "10 m" in the figure) are displayed as virtual images.
  • the inter-vehicle distance is generated based on the distance information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • the vehicle display 301 is a display indicating that there is a vehicle 200 traveling in the oncoming lane
  • the vehicle display 305 is a display indicating that there is a vehicle 203 traveling in front of the vehicle 2.
  • the vehicle display 301 is formed, for example, in a circular shape so as to be superimposed so as to surround the vehicle 200.
  • the vehicle display 305 also has a circular shape, for example, and is displayed so as to surround the vehicle 203.
  • The vehicle displays 301 and 305 are, for example, color-coded so that it can be distinguished whether a vehicle is the one traveling ahead in the traveling lane or a vehicle traveling in the oncoming lane, that is, an oncoming vehicle.
  • In the figure, the color coding is represented by the presence or absence of hatching.
  • Besides color coding, the vehicle displays 301 and 305 may be any displays that can distinguish whether a vehicle is running ahead or in the oncoming lane, for example by differing in shape.
  • With the AR-HUD 1 in which the display area 6 is expanded in the horizontal direction, virtual images such as the arrow 300, the pedestrian display 302, and the vehicle display 305 can be displayed even on the oncoming lane, the sidewalk, and intersecting roads. It therefore becomes easy to accurately grasp the course and to recognize the vehicles 200 and 203 traveling ahead or oncoming and the pedestrian 201, thereby contributing to safe driving.
  • In the second embodiment, the display area 6 of the AR-HUD 1 extends not only in the horizontal direction but also in the vertical direction, thereby increasing the amount of information that can be displayed vertically. Traffic assistance information, the second information, is displayed in the upper part of the display area 6, and priority information, the first information, is displayed in the lower part of the display area 6.
  • Traffic assistance information is information that assists driving operations such as traffic jam information, road information, or intersection information.
  • the traffic jam information is information indicating the traffic jam status on the road.
  • the road information is information indicating a lane change, for example.
  • the intersection information is information such as the name of the intersection.
  • The priority information is information related to the driving operation that is prioritized over the traffic assistance information, for example the situation of other vehicles, pedestrians, road alignment, and the like, and is displayed as a virtual image superimposed on the real scene.
  • FIG. 17 is an explanatory diagram showing an example of a region when the display region 6 by the AR-HUD 1 according to the second embodiment is expanded in the horizontal direction and the vertical direction.
  • As shown in FIG. 17(a), when the display area 6 is expanded in the horizontal and vertical directions, new room for information display is created in the upper and lower parts, in addition to the display area 6 expanded in the horizontal direction in the first embodiment.
  • The hatched area in FIG. 17(a) is the new display area created by further expanding, in the vertical direction, the display area 6 that was expanded in the horizontal direction.
  • In the display area 6 expanded not only horizontally but also vertically, the area close to the upper end corresponds, from the viewpoint of the driver 5, to scenery such as the sky and buildings that has little relation to the driving operation.
  • The area below the upper part of the display area 6 is the area from which the driver 5 obtains real-scene information directly connected to the driving operation, such as roads, traffic conditions, and road signs.
  • Accordingly, the upper part of the display area 6 is made a second display area 6a, indicated by hatching in FIG. 17(b), and the lower part of the display area 6 is made a first display area 6b, indicated by dots in FIG. 17(b).
  • The second display area 6a displays traffic assistance information consisting of auxiliary information for assisting the driving operation, information notifying the operation status of airplanes, railways, and the like, information notifying incoming telephone calls, mails, and so on.
  • priority information that is important information directly related to the driving operation is displayed in the first display area 6b.
  • The second display area 6a is a region in which virtual images that are mainly not AR (Augmented Reality) are displayed, whereas the first display area 6b is a region in which display is mainly performed by AR.
  • Accordingly, the traffic assistance information displayed in the second display area 6a is a non-AR virtual image, and the priority information displayed in the first display area 6b is a virtual image display by AR.
  • The third display area 6c is the area surrounded by a dotted line in FIG. 17(b), in which virtual images that are not AR are mainly displayed.
  • The second display area 6a for displaying the traffic assistance information and the like may be any area that does not block the real-scene information directly connected to the driving operation of the driver 5, and is not limited to an area close to the upper end of the display area 6.
  • The second display area 6a and/or the third display area 6c may be enlarged and the first display area 6b reduced; conversely, the second display area 6a and/or the third display area 6c may be reduced, or omitted entirely so that the whole becomes the first display area 6b.
  • When the areas are enlarged or reduced, this may be done, for example, based on information obtained by imaging the situation ahead with a camera. That is, when the presence of a vehicle or a pedestrian is detected ahead, an AR virtual image must be displayed for that vehicle or pedestrian, so control is performed such that the second display area 6a and/or the third display area 6c is not enlarged and the first display area 6b is secured widely.
  • Conversely, when no vehicle or pedestrian is detected ahead, control may be performed to enlarge the second display area 6a and/or the third display area 6c and reduce the first display area 6b.
  • The areas may also be enlarged or reduced based on the vehicle speed information.
  • For example, the second display area 6a and/or the third display area 6c may be enlarged when traveling at low speed or when stopped, while when traveling at high speed, control may be performed to reduce or eliminate the second display area 6a and/or the third display area 6c so as to secure the first display area 6b as widely as possible.
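The area-resizing control described above can be sketched as follows. The 20% auxiliary-area ratio and the 40 km/h speed boundary are illustrative assumptions; the patent only states that the areas may be resized based on forward detection and vehicle speed.

```python
def plan_display_areas(total_height, hazard_ahead, speed_kmh,
                       aux_ratio=0.2, high_speed_kmh=40.0):
    # Decide the vertical split between the second display area 6a (top,
    # non-AR traffic assistance) and the first display area 6b (AR display).
    if hazard_ahead or speed_kmh >= high_speed_kmh:
        aux_height = 0                               # secure 6b as widely as possible
    else:
        aux_height = int(total_height * aux_ratio)   # low speed / stopped: enlarge 6a
    return {"6a": aux_height, "6b": total_height - aux_height}
```

A detected vehicle or pedestrian ahead, or high-speed travel, collapses the auxiliary area so that the AR area is maximized.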
  • FIG. 18 is an explanatory diagram showing an example of display in the display area enlarged in the horizontal direction and the vertical direction in FIG.
  • FIG. 19 is an explanatory diagram showing an example of display in the display area following FIG.
  • FIG. 18(a) shows the vehicle 2 traveling without a traffic jam, and FIG. 18(b) shows a display example of the display area 6 during the traveling of FIG. 18(a).
  • FIG. 19(a) shows a traffic jam occurring ahead of the traveling vehicle 2, and FIG. 19(b) shows a display example of the display area 6 that informs the driver that a traffic jam has occurred; road traffic jam information is displayed when a traffic jam occurs.
  • While the vehicle 2 is traveling without a traffic jam, as shown in FIG. 18(b), sign information 310 indicating the vehicle traffic classification is displayed as a virtual image in the second display area 6a as traffic assistance information.
  • When it is detected that a traffic jam has occurred ahead of the vehicle 2, as in FIG. 19(a), the sign information 310 and traffic jam information 311 are displayed as traffic assistance information in the second display area 6a, as shown in FIG. 19(b).
  • the sign information 310 is generated based on the camera video information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • the traffic jam information 311 is generated based on GPS information and VICS information.
  • Here, traffic assistance information such as the sign information 310 and the traffic jam information 311 is merely information that assists driving and is not prioritized over the real scene directly connected to the driving operation.
  • If the sign information 310 and the traffic jam information 311 were displayed in an area through which the driver 5 sees the real scene directly connected to the driving operation, the line of sight of the driver 5 might be drawn to that information and stray from the road.
  • By displaying them in the second display area 6a, the information can be read without obstructing the field of view directly connected to the driving operation, and with relatively little viewpoint movement from the part of the real scene on which the viewpoint is most concentrated during driving. Therefore, traffic assistance information such as traffic jam information can be displayed while reducing the distraction of the driver's attention and without diverting the viewpoint for long from the real scene that should be watched while driving. Further, while the vehicle 2 is traveling, as shown in FIG. 19(b), displaying only simple traffic jam information can reduce the decline in the driver's attention.
  • FIG. 20 is an explanatory diagram showing an example of display in the second display area, and FIG. 21 is an explanatory diagram showing an example of the display that follows.
  • FIG. 20 and FIG. 21 show display examples of guidance screens based on navigation information. Here, a case where the vehicle 2 is traveling straight and turns left at the next intersection according to navigation guidance is shown.
  • While the vehicle 2 is traveling straight and the distance to the intersection where it will turn left is larger than a threshold value, only the sign information 310 indicating the vehicle traffic classification is displayed in the second display area 6a as traffic assistance information, as shown in FIG. 20.
  • The threshold value is a distance that determines the timing for displaying the left-turn guidance screen.
  • When the distance becomes equal to or less than the threshold value, the left-turn guidance screen is displayed: guidance information 312 indicating the left turn at the intersection is displayed as AR, superimposed on the traveling road as priority information.
  • the threshold value determination process and the guidance information 312 generation process are performed based on GPS information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • FIGS. 20 and 21 show an example in which the guidance screen is switched in two steps according to a single threshold value, but a plurality of threshold values may be used. For example, when the distance to the intersection to turn left at is larger than threshold value 1, only the traffic assistance information is displayed; when the distance is equal to or less than threshold value 1 and greater than threshold value 2, guidance information may be displayed in addition to the traffic assistance information; and when the distance is equal to or less than threshold value 2, the guidance information 312 indicating the left turn at the intersection is superimposed on the road and displayed as AR.
  • FIG. 22 is a diagram showing another display example.
  • In FIG. 22, detailed information 313, including detailed traffic jam information and a detour proposal, is displayed in the second display area 6a as traffic assistance information. When the amount of information is large and the detailed information 313 does not fit in the second display area 6a, it may be displayed so as to extend over the first display area 6b.
  • FIG. 23 is an explanatory diagram showing an example of a cooperative display operation in the display area by the AR-HUD.
  • FIG. 23 shows a display example from the state in which the sign information 310 indicating the vehicle traffic classification is displayed as traffic assistance information in the second display area 6a while the vehicle 2 is traveling, to the state in which the guidance information 312 is displayed as AR in the first display area 6b as priority information.
  • The guidance information 312 in FIG. 23 is a lane-change guidance display that prompts the driver to move from the lane in which the vehicle 2 is traveling to the leftmost lane.
  • In "phase 2", when the distance from the vehicle 2 to the intersection where it will turn left is 250 m, the display of the sign information 310 in the second display area 6a is gradually faded out, while in the first display area 6b the guidance information 312 prompting the lane change to move the vehicle 2 to the left lane is gradually faded in, superimposed on the road as an AR display.
  • The display shown in "phase 2" is executed when the distance to the intersection to turn left at is, for example, about 250 m, but the distance is not limited to this.
  • In "phase 3", when the vehicle 2 reaches a distance of 200 m from the intersection where it will turn left, the display of the sign information 310 in the second display area 6a is erased, and in the first display area 6b the guidance information 312 prompting the lane change to the left lane is displayed completely.
  • Similarly, the display of "phase 3" is executed when the distance to the left-turn intersection is, for example, about 200 m, but the distance is not limited to this.
  • In the reverse case, the display of the guidance information 312 is gradually faded out while the display of the sign information 310 is gradually faded in, and then the display of the guidance information 312 is turned off and the sign information 310 is displayed completely. That is, the display proceeds in the order "phase 3", "phase 2", "phase 1", the reverse of the above. Also in this case, the display timings from "phase 3" to "phase 2" and from "phase 2" to "phase 1" may be changed according to the distance.
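The gradual crossfade between the sign information 310 and the guidance information 312 can be sketched as a linear interpolation over distance. The linear ramp and the 250 m / 200 m bounds follow the example distances given in the text; they are not mandated by the patent.

```python
def crossfade_alphas(distance_m, start_m=250.0, end_m=200.0):
    # Opacity of sign information 310 (fading out) and guidance information 312
    # (fading in) between "phase 1" (far) and "phase 3" (near).
    if distance_m >= start_m:        # phase 1: sign information fully shown
        t = 0.0
    elif distance_m <= end_m:        # phase 3: guidance information fully shown
        t = 1.0
    else:                            # phase 2: linear crossfade in between
        t = (start_m - distance_m) / (start_m - end_m)
    return {"sign_310": 1.0 - t, "guidance_312": t}
```

Running the same function with distance increasing instead of decreasing reproduces the reverse "phase 3" to "phase 1" sequence.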
  • FIG. 24 is an explanatory diagram showing an example of the case where the viewpoint position of the virtual image display in the second display area is near.
  • FIG. 25 is an explanatory diagram showing an example of a display that reduces the increase in viewpoint movement in the second display area by the AR-HUD.
  • As described above, the second display area 6a corresponds to scenery such as the sky or buildings that is unrelated to the driving operation, and much of the traffic assistance information and the like displayed in the second display area 6a need not be superimposed on the real scene.
  • However, if the viewpoint position of the virtual image is near, the viewpoint movement of the driver 5 becomes large, which may hinder driving.
  • For example, as shown in FIG. 24, when the viewpoint position DD of the driver 5 is about 30 m while the viewpoint position VD of the sign information 310, the virtual-image traffic assistance information displayed in the second display area 6a, is about 2 m, a large viewpoint movement from 30 m far to 2 m near ends up being necessary.
  • As a result, eyestrain may be promoted, or an accident may be induced by overlooking information that must be seen while driving.
  • On the other hand, when the viewpoint position DD of the driver 5 is about 30 m and the viewpoint position VD of the sign information 310, the virtual-image traffic assistance information displayed in the second display area 6a, is likewise about 30 m, as shown in the lower part of FIG. 25, the viewpoint position DD and the viewpoint position VD are substantially the same, so the driver 5 can recognize the traffic assistance information without moving the viewpoint back and forth.
  • Thereby, the back-and-forth movement of the viewpoint can be greatly reduced, so that the information transmission speed can be improved, and the eyestrain, distraction of attention, and the like of the driver 5 can be reduced.
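The viewpoint-matching idea of FIGS. 24 and 25 can be expressed numerically. The 2 m / 30 m clamp range echoes the example distances in the text; treating the virtual-image distance as freely adjustable within that range is an assumption of this sketch.

```python
def viewpoint_shift_m(driver_viewpoint_m, image_distance_m):
    # Back-and-forth viewpoint movement the driver must make (FIG. 24 vs. FIG. 25).
    return abs(driver_viewpoint_m - image_distance_m)

def matched_image_distance(driver_viewpoint_m, min_m=2.0, max_m=30.0):
    # Place the virtual image of the second display area at the driver's current
    # viewpoint distance, clamped to the assumed range of the optical unit.
    return max(min_m, min(max_m, driver_viewpoint_m))
```

With a near image (2 m) the shift is about 28 m, as in FIG. 24; matching the image distance to the driver's viewpoint reduces it to zero, as in FIG. 25.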
  • FIG. 26 is an explanatory view showing an example of a display that reduces the oversight of road signs in the display area by the AR-HUD.
  • In FIG. 26, at the point where the viewpoint distance of the driver 5 and the distance from the vehicle 2 to the sign 202, a road sign, are substantially the same, a virtual image of a sign display 315 is displayed superimposed on the sign 202.
  • The sign 202 is a guide sign installed in a portion corresponding to scenery such as the sky; therefore, the virtual image of the sign display 315 is superimposed on the sign 202 at the moment the sign 202 naturally enters the eyes of the driver 5. For example, when the viewpoint distance of the driver 5 is about 30 m, the sign display 315 is displayed when the distance from the vehicle 2 to the sign approaches about 30 m.
  • the sign display 315 is a virtual image display indicating that there is a road sign.
  • FIG. 26 shows an example in which the sign display 315 is a circle surrounding the sign 202, and a sign display 316, an arrow drawing attention to the sign display 315, is also displayed as a virtual image.
  • The shapes of the virtual images of the sign displays 315 and 316 are not particularly limited and may be any shapes.
  • a traffic sign such as the sign 202 is recognized by the camera image information acquired by the ECU 21 from the vehicle information acquisition unit 10. Further, the distance to the sign 202 is recognized by infrared information acquired by the ECU 21 from the vehicle information acquisition unit 10 or the like.
  • If the sign display is performed only when the sign is nearby, the driver 5 has to change the focus position considerably, and it takes time to recognize the sign. Further, when the distance to the sign 202 is as short as about 5 m, the vehicle may pass the sign 202 before it is recognized. That is, the virtual image shown in FIG. 26 is more effective when displayed while the sign is far away, such as at 30 m; when the distance to the sign becomes short, such as 5 m, displaying a virtual image superimposed on the sign should be avoided, because the virtual image makes the sign itself difficult to see.
  • In this way, the driver 5 can recognize the sign 202 without moving the viewpoint back and forth, and the oversight of the sign 202 can be reduced.
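The distance gating described above (display around the driver's viewpoint distance, suppress at close range) can be sketched as a predicate. The 5 m tolerance band is an assumption; only the roughly 30 m display distance and the roughly 5 m suppression distance come from the text.

```python
def show_sign_overlay(dist_to_sign_m, driver_viewpoint_m=30.0,
                      tolerance_m=5.0, suppress_below_m=5.0):
    # Display the sign display 315/316 when the distance to the sign roughly
    # matches the driver's viewpoint distance; suppress it at short range so
    # the virtual image does not hide the sign itself.
    if dist_to_sign_m <= suppress_below_m:
        return False
    return abs(dist_to_sign_m - driver_viewpoint_m) <= tolerance_m
```

The distance input would come from the infrared ranging information mentioned in the text; here it is simply a parameter.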
  • FIG. 27 is an explanatory view showing an example of display according to the risk level of the road condition in the display area by the AR-HUD.
  • FIG. 27 shows an example of the scenery in front of the driver's seat viewed through the windshield 3 and the state of the virtual image in the display area 6 projected on the windshield 3.
  • The state shown on the left side of FIG. 27 is the state with the lowest risk level (hereinafter, low risk), and the state shown on the right side of FIG. 27 is the state with the highest risk level (hereinafter, high risk). The state shown in the central part of FIG. 27 has a risk level higher than low risk and lower than high risk (hereinafter, medium risk).
  • In the low-risk state, guidance is provided by navigation: guidance information 312 indicating that the next intersection is to be turned left at is displayed as a virtual image, and the pedestrian 201 is walking on the sidewalk on the right side of the opposite lane, so the risk level is set low.
  • In this state, the guidance information 312 indicating the left turn at the next intersection is displayed superimposed on the road of the traveling lane, and a pedestrian warning display 320, indicating the presence of the pedestrian 201 on the sidewalk, is displayed superimposed near the pedestrian 201.
  • The pedestrian warning display 320 has, for example, a shape such as an arrow and is displayed so as to point at the pedestrian 201. It is displayed when a pedestrian is detected and the distance from the vehicle 2 to the pedestrian 201 is greater than or equal to a preset distance, or when the pedestrian 201 is on the opposite-lane side. When the vehicle 2 passes the pedestrian 201, the pedestrian warning display 320 is erased.
  • the shape of the pedestrian warning display 320 is not particularly limited, and may be other than the arrow shape.
  • the pedestrian 201 has moved near the pedestrian crossing at the intersection where the vehicle 2 turns left.
  • In this case, a pedestrian display 302 is displayed in the display area 6, superimposed on the pedestrian 201.
  • In the medium-risk state, the distance from the vehicle 2 to the pedestrian 201 is shorter than the preset distance, or the pedestrian 201 is in the vicinity of the pedestrian crossing.
  • When the risk level is low, the guidance information 312 is displayed superimposed on the road; when the risk level is medium, however, the guidance information 312 is not superimposed along the road but is displayed in the second display area 6a. In other words, the guidance information 312 is moved upward, and the presence of the pedestrian 201 is shown more prominently.
  • the pedestrian 201 has moved to the roadway on which the vehicle 2 travels instead of the sidewalk.
  • the risk level is the highest.
  • In this state, the guidance information 312 that was displayed in the medium-risk state is deleted from the display area 6.
  • The pedestrian display 302 is superimposed on the pedestrian 201, and warning information 321, a virtual image warning that the pedestrian 201 is nearby and the risk is high, is displayed superimposed near the pedestrian 201. Further, in order to appeal more strongly to the driver 5 that the risk is high, the guidance by navigation is stopped and the guidance information 312 is deleted.
  • the pedestrian display 302 and the warning information 321 are deleted, and guidance by navigation is resumed.
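The risk distinctions above (pedestrian far away or on the opposite-lane side, pedestrian near the crossing, pedestrian on the roadway) can be sketched as a small classifier. This is only an illustrative Python sketch: the patent names the input cues but defines no API, and the 30 m preset distance is an assumed value, since the text only says "a preset distance".

```python
def classify_pedestrian_risk(distance_m, location, preset_distance_m=30.0):
    """Illustrative risk classification from the cues described above.

    location is one of "roadway", "near_crossing", "sidewalk",
    "opposite_lane"; all names and the threshold are assumptions.
    """
    if location == "roadway":
        # pedestrian on the roadway the vehicle travels: highest risk
        return "high"
    if distance_m < preset_distance_m or location == "near_crossing":
        # pedestrian close to the vehicle or near the crossing: medium risk
        return "medium"
    # pedestrian far away or on the opposite-lane side: low risk
    return "low"
```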
  • the driver 5 is encouraged to pay attention to the warning event (in this case, a pedestrian). Further, when the degree of danger is not high, guidance by navigation can be continued in parallel with the warning, so that the driver 5 can be prevented from getting lost. Furthermore, since the navigation guidance moves seamlessly between the lower part of the display area (the first display area) and the upper part (the second display area) according to the degree of danger, the amount of change in the display is smaller than when guidance by navigation abruptly appears or disappears, and the driver 5 can be prevented from being surprised by the change in the display and having his or her attention distracted.
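The display behavior at each risk level — guidance information 312 AR-superimposed on the road at low risk, guidance moved to the second display area 6a at medium risk, guidance erased and warning information 321 shown at high risk — could be summarized as a mapping like the following. All function, key, and value names here are illustrative; the patent describes the behavior but specifies no programming interface.

```python
def update_guidance_display(risk_level):
    """Map a risk level to the display state described above (illustrative)."""
    if risk_level == "low":
        return {
            "guidance_312": "first_display_area",   # AR-superimposed on road
            "pedestrian_display_302": True,
            "warning_info_321": False,
        }
    if risk_level == "medium":
        return {
            "guidance_312": "second_display_area",  # moved upward, off road
            "pedestrian_display_302": True,
            "warning_info_321": False,
        }
    # high risk: navigation guidance is suspended entirely
    return {
        "guidance_312": None,                       # guidance erased
        "pedestrian_display_302": True,
        "warning_info_321": True,                   # shown near the pedestrian
    }
```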
  • FIG. 28 is an explanatory diagram showing an example of navigation guidance display at an intersection in the display area by the AR-HUD of FIG.
  • FIG. 28 shows the traveling state of the vehicle 2; the left side shows the vehicle 2 traveling before the intersection, that is, before entering the intersection.
  • the center portion shows a state in which the vehicle 2 is turning left at the intersection, i.e., traveling in the intersection.
  • the right side shows the traveling state of the vehicle 2 after completing the left turn at the intersection, that is, after passing through the intersection.
  • guidance information 312 indicating that the vehicle is to turn left at the next intersection is displayed AR-superimposed on the road. Then, as shown in the center of FIG. 28, when the vehicle 2 enters the intersection, the guidance information 312 displayed in the first display area 6b is erased, and new guidance information 312 corresponding to the driving action is displayed in the second display area 6a.
  • the guidance information 312 may not be displayed in the second display area 6a.
  • whether or not the vehicle 2 has entered the intersection is determined based on, for example, the steering wheel angle information, the navigation information, and the image information from the camera outside the vehicle that the ECU 21 acquires from the vehicle information acquisition unit 10.
  • the guidance information 312 is displayed again in the first display area 6b.
  • the guidance information 312 here is information indicating that the vehicle travels straight on the road after the left turn.
  • when turning at the intersection, the driver 5 can travel safely because objects requiring attention while traveling through the intersection are not obscured by the virtual image display. Furthermore, since the navigation guidance moves seamlessly between the lower part of the display area (the first display area) and the upper part (the second display area) according to the driving state at the intersection, the amount of change in the display is smaller than when guidance by navigation abruptly appears or disappears, and the driver 5 can be prevented from being surprised by the change in the display and having his or her attention distracted.
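The FIG. 28 sequence — guidance in the first display area 6b before and after the intersection, and in the second display area 6a (or hidden) while traversing it — can be sketched as follows. The intersection-entry test uses two of the signals the text names (steering wheel angle and navigation distance to the turn), but the thresholds and all names are assumed values, not taken from the patent.

```python
def guidance_display_area(in_intersection):
    """Return the area holding navigation guidance, per the FIG. 28 sequence."""
    # inside the intersection, the road view is kept clear of AR guidance
    return "second_display_area" if in_intersection else "first_display_area"


def has_entered_intersection(steering_angle_deg, distance_to_turn_m,
                             turn_threshold_deg=15.0,
                             approach_threshold_m=30.0):
    """Illustrative intersection-entry test (assumed thresholds)."""
    return (abs(steering_angle_deg) >= turn_threshold_deg
            and distance_to_turn_m <= approach_threshold_m)
```

In practice the determination would also fold in the exterior camera image mentioned in the text; this sketch keeps only the two numeric signals for clarity.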
  • FIG. 29 is an explanatory view showing an example of a menu for customizing the display in the display area by the AR-HUD of FIG.
  • menus of “upper and lower two-level display”, “upper display”, and “lower display” are provided as examples of display selection in the display area 6.
  • virtual images are displayed in both the second display area 6a and the first display area 6b shown in FIG.
  • “Upper display” displays only the second display area 6a, that is, only the traffic auxiliary information including auxiliary information for assisting the driving operation.
  • the “lower display” displays only the first display area 6b, that is, only priority information that is important information directly related to the driving operation.
  • the driver 5 can arbitrarily select the content to be displayed by choosing the display method of the virtual image displayed in the display area 6 from the menu of FIG. 29. Although FIG. 29 shows a menu that only selects which display area within the display area 6 is used, various other menus may be prepared, such as for changing the transmittance of the information displayed in the second display area 6a or changing the display size of the information. Further, the selection menu is not limited to the above selection examples and may include other items.
  • the driver 5 can easily distinguish and recognize the display of the traffic assistance information and of the priority information, which is more important than the traffic assistance information. In addition, the driver 5 can be made to recognize the traffic assistance information and the priority information without hindrance to driving. This can contribute to safe driving.
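The three menu choices above can be sketched as a mapping from a selected mode to the set of display areas that are rendered. The mode names are paraphrases of the FIG. 29 menu labels, not patent terminology, and the function is an illustrative sketch rather than a specified interface.

```python
# Illustrative mapping of the FIG. 29 menu items to enabled display areas.
DISPLAY_MODES = {
    "upper_and_lower": {"first_display_area", "second_display_area"},
    "upper_only": {"second_display_area"},   # traffic auxiliary information
    "lower_only": {"first_display_area"},    # priority information
}


def areas_to_render(mode):
    """Return the set of display areas enabled by the driver's menu choice."""
    if mode not in DISPLAY_MODES:
        raise ValueError(f"unknown display mode: {mode!r}")
    return set(DISPLAY_MODES[mode])
```

Additional customization options mentioned in the text (transmittance, display size) would extend this mapping with per-area rendering parameters rather than just an on/off set.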
  • a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.

Abstract

The problem addressed by the present invention is to increase the display size of a head-up display device and to make it easier for a driver to recognize important information. The solution according to the invention relates to an AR-HUD 1 in which a vehicle information acquisition unit (10) acquires various types of vehicle information (4) that can be detected by a vehicle (2). A control unit (20) controls the display of an image to be shown in a display area visible through a windshield (3) from the driver's seat of the vehicle (2), on the basis of the vehicle information (4) acquired by the vehicle information acquisition unit (10). An image display device (30) generates an image on the basis of the instruction from the control unit (20). A display distance adjustment mechanism (40) adjusts the display distance of a virtual image with respect to the driver. The display area controlled by the control unit (20) has a first display area and a second display area located above the first display area. The first display area is an area that displays augmented reality, and the second display area is an area that does not display augmented reality.
PCT/JP2017/033683 2016-10-13 2017-09-19 Dispositif d'affichage tête haute WO2018070193A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-201687 2016-10-13
JP2016201687A JP2019217790A (ja) 2016-10-13 2016-10-13 ヘッドアップディスプレイ装置

Publications (1)

Publication Number Publication Date
WO2018070193A1 true WO2018070193A1 (fr) 2018-04-19

Family

ID=61905299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/033683 WO2018070193A1 (fr) 2016-10-13 2017-09-19 Dispositif d'affichage tête haute

Country Status (2)

Country Link
JP (1) JP2019217790A (fr)
WO (1) WO2018070193A1 (fr)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019206256A (ja) * 2018-05-29 2019-12-05 株式会社デンソー 表示制御装置、及び表示制御プログラム
JP2020056887A (ja) * 2018-10-01 2020-04-09 本田技研工業株式会社 表示装置、表示制御方法、およびプログラム
WO2020085159A1 (fr) * 2018-10-23 2020-04-30 日本精機株式会社 Dispositif d'affichage
CN111476104A (zh) * 2020-03-17 2020-07-31 重庆邮电大学 动态眼位下ar-hud图像畸变矫正方法、装置、系统
WO2020166252A1 (fr) * 2019-02-14 2020-08-20 株式会社デンソー Dispositif de commande d'affichage, programme de commande d'affichage et support tangible non transitoire lisible par ordinateur
JP2020132137A (ja) * 2019-02-14 2020-08-31 株式会社デンソー 表示制御装置及び表示制御プログラム
WO2020249367A1 (fr) * 2019-06-13 2020-12-17 Volkswagen Aktiengesellschaft Commande d'un affichage d'un dispositif d'affichage tête haute à réalité augmentée pour un véhicule automobile
WO2021002081A1 (fr) * 2019-07-02 2021-01-07 株式会社デンソー Dispositif et programme de commande d'affichage
JP2021009133A (ja) * 2019-07-02 2021-01-28 株式会社デンソー 表示制御装置及び表示制御プログラム
US20210215499A1 (en) * 2018-06-01 2021-07-15 Volkswagen Aktiengesellschaft Method for Calculating an Augmented Reality Overlay for Displaying a Navigation Route on an AR Display Unit, Device for Carrying Out the Method, Motor Vehicle and Computer Program
CN113677553A (zh) * 2019-04-11 2021-11-19 三菱电机株式会社 显示控制装置以及显示控制方法
CN113784861A (zh) * 2019-05-15 2021-12-10 日产自动车株式会社 显示控制方法及显示控制装置
WO2022137558A1 (fr) 2020-12-25 2022-06-30 日産自動車株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations
FR3118728A1 (fr) * 2021-01-12 2022-07-15 Psa Automobiles Sa Véhicule automobile comportant un système ADAS couplés à un système d’affichage en réalité augmentée dudit véhicule.
EP4057251A1 (fr) * 2021-03-10 2022-09-14 Yazaki Corporation Appareil d'affichage pour véhicule
CN115431764A (zh) * 2022-10-10 2022-12-06 江苏泽景汽车电子股份有限公司 一种ar标尺展示方法、装置、电子设备及存储介质

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
JP7061938B2 (ja) * 2018-07-17 2022-05-02 三菱電機株式会社 運転支援装置および運転支援方法
JP7200970B2 (ja) * 2020-04-17 2023-01-10 トヨタ自動車株式会社 車両制御装置
CN111561938A (zh) * 2020-05-28 2020-08-21 北京百度网讯科技有限公司 Ar导航方法和装置
WO2022123922A1 (fr) * 2020-12-11 2022-06-16 株式会社Nttドコモ Système de traitement de l'information
JP2023076002A (ja) 2021-11-22 2023-06-01 トヨタ自動車株式会社 画像表示システム
WO2023145851A1 (fr) * 2022-01-28 2023-08-03 日本精機株式会社 Dispositif d'affichage
WO2024034053A1 (fr) * 2022-08-10 2024-02-15 マクセル株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations

Citations (6)

Publication number Priority date Publication date Assignee Title
JP2005199992A (ja) * 2003-12-17 2005-07-28 Denso Corp 車両情報表示システム
JP2005292031A (ja) * 2004-04-02 2005-10-20 Denso Corp 車両用表示装置、車両用表示システム、及びプログラム
JP2007288657A (ja) * 2006-04-19 2007-11-01 Toyota Motor Corp 車両用表示装置、車両用表示装置の表示方法
WO2014129017A1 (fr) * 2013-02-22 2014-08-28 クラリオン株式会社 Appareil d'affichage tête haute pour véhicule
JP2015134521A (ja) * 2014-01-16 2015-07-27 三菱電機株式会社 車両情報表示制御装置
JP2016107731A (ja) * 2014-12-04 2016-06-20 日本精機株式会社 ヘッドアップディスプレイ装置


Cited By (28)

Publication number Priority date Publication date Assignee Title
WO2019230271A1 (fr) * 2018-05-29 2019-12-05 株式会社デンソー Dispositif de commande d'affichage, programme de commande d'affichage et support d'enregistrement lisible par ordinateur tangible persistant associé
US11803053B2 (en) 2018-05-29 2023-10-31 Denso Corporation Display control device and non-transitory tangible computer-readable medium therefor
JP2019206256A (ja) * 2018-05-29 2019-12-05 株式会社デンソー 表示制御装置、及び表示制御プログラム
US20210215499A1 (en) * 2018-06-01 2021-07-15 Volkswagen Aktiengesellschaft Method for Calculating an Augmented Reality Overlay for Displaying a Navigation Route on an AR Display Unit, Device for Carrying Out the Method, Motor Vehicle and Computer Program
US11629972B2 (en) * 2018-06-01 2023-04-18 Volkswagen Aktiengesellschaft Method for calculating an augmented reality overlay for displaying a navigation route on an AR display unit, device for carrying out the method, motor vehicle and computer program
US10996479B2 (en) 2018-10-01 2021-05-04 Honda Motor Co., Ltd. Display device, display control method, and storage medium
JP2020056887A (ja) * 2018-10-01 2020-04-09 本田技研工業株式会社 表示装置、表示制御方法、およびプログラム
WO2020085159A1 (fr) * 2018-10-23 2020-04-30 日本精機株式会社 Dispositif d'affichage
JP2020132137A (ja) * 2019-02-14 2020-08-31 株式会社デンソー 表示制御装置及び表示制御プログラム
JP7063316B2 (ja) 2019-02-14 2022-05-09 株式会社デンソー 表示制御装置及び表示制御プログラム
WO2020166252A1 (fr) * 2019-02-14 2020-08-20 株式会社デンソー Dispositif de commande d'affichage, programme de commande d'affichage et support tangible non transitoire lisible par ordinateur
CN113677553A (zh) * 2019-04-11 2021-11-19 三菱电机株式会社 显示控制装置以及显示控制方法
CN113784861B (zh) * 2019-05-15 2023-01-17 日产自动车株式会社 显示控制方法及显示控制装置
CN113784861A (zh) * 2019-05-15 2021-12-10 日产自动车株式会社 显示控制方法及显示控制装置
CN113924518A (zh) * 2019-06-13 2022-01-11 大众汽车股份公司 控制机动车的增强现实平视显示器装置的显示内容
WO2020249367A1 (fr) * 2019-06-13 2020-12-17 Volkswagen Aktiengesellschaft Commande d'un affichage d'un dispositif d'affichage tête haute à réalité augmentée pour un véhicule automobile
JP7173078B2 (ja) 2019-07-02 2022-11-16 株式会社デンソー 表示制御装置及び表示制御プログラム
JP2021009133A (ja) * 2019-07-02 2021-01-28 株式会社デンソー 表示制御装置及び表示制御プログラム
WO2021002081A1 (fr) * 2019-07-02 2021-01-07 株式会社デンソー Dispositif et programme de commande d'affichage
CN111476104B (zh) * 2020-03-17 2022-07-01 重庆邮电大学 动态眼位下ar-hud图像畸变矫正方法、装置、系统
CN111476104A (zh) * 2020-03-17 2020-07-31 重庆邮电大学 动态眼位下ar-hud图像畸变矫正方法、装置、系统
WO2022137558A1 (fr) 2020-12-25 2022-06-30 日産自動車株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations
FR3118728A1 (fr) * 2021-01-12 2022-07-15 Psa Automobiles Sa Véhicule automobile comportant un système ADAS couplés à un système d’affichage en réalité augmentée dudit véhicule.
WO2022152981A1 (fr) * 2021-01-12 2022-07-21 Psa Automobiles Sa Véhicule automobile comportant un système adas couplés à un système d'affichage en réalité augmentée dudit véhicule.
US20220292838A1 (en) * 2021-03-10 2022-09-15 Yazaki Corporation Vehicular display apparatus
EP4057251A1 (fr) * 2021-03-10 2022-09-14 Yazaki Corporation Appareil d'affichage pour véhicule
CN115431764A (zh) * 2022-10-10 2022-12-06 江苏泽景汽车电子股份有限公司 一种ar标尺展示方法、装置、电子设备及存储介质
CN115431764B (zh) * 2022-10-10 2023-11-17 江苏泽景汽车电子股份有限公司 一种ar标尺展示方法、装置、电子设备及存储介质

Also Published As

Publication number Publication date
JP2019217790A (ja) 2019-12-26

Similar Documents

Publication Publication Date Title
WO2018070193A1 (fr) Dispositif d'affichage tête haute
JP7437449B2 (ja) 画像投射装置および画像投射方法
JP6818100B2 (ja) 投射型表示装置
JP6629889B2 (ja) ヘッドアップディスプレイ装置
US10866415B2 (en) Head-up display apparatus
US20170161009A1 (en) Vehicular display device
JP2019059248A (ja) ヘッドアップディスプレイ装置
JP2016055691A (ja) 車両用表示システム
WO2017134861A1 (fr) Dispositif d'affichage tête haute
JP2019113809A (ja) ヘッドアップディスプレイ装置
JP3931343B2 (ja) 経路誘導装置
JP4692595B2 (ja) 車両用情報表示システム
JP2019059247A (ja) ヘッドアップディスプレイ装置
JP2005127995A (ja) 経路誘導装置、経路誘導方法及び経路誘導プログラム
JP7111582B2 (ja) ヘッドアップディスプレイシステム
JP6801508B2 (ja) ヘッドアップディスプレイ装置
JP2015074391A (ja) ヘッドアップディスプレイ装置
JP2018159738A (ja) 虚像表示装置
JP6872441B2 (ja) ヘッドアップディスプレイ装置
JP7344635B2 (ja) ヘッドアップディスプレイ装置
JP6384529B2 (ja) 視界制御装置
JP2019202641A (ja) 表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17859589

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17859589

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP