WO2018070193A1 - Head-up display device - Google Patents

Head-up display device

Info

Publication number
WO2018070193A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
vehicle
information
display device
display area
Application number
PCT/JP2017/033683
Other languages
French (fr)
Japanese (ja)
Inventor
望 下田
昭央 三沢
壮太 佐藤
健 荒川
Original Assignee
マクセル株式会社
Application filed by マクセル株式会社
Publication of WO2018070193A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60K: ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00: Arrangement or adaptations of instruments
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G1/00: Traffic control systems for road vehicles
    • G08G1/09: Arrangements for giving variable traffic instructions
    • G08G1/0962: Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0968: Systems involving transmission of navigation instructions to the vehicle

Definitions

  • the present invention relates to a technology for a head-up display device, and more particularly to a technology effective when applied to a head-up display device using AR (Augmented Reality).
  • This HUD device projects driving information such as the vehicle speed and engine speed, and information such as car navigation guidance, onto the windshield as described above.
  • Accordingly, the driver can check this information without moving the line of sight to the so-called instrument panel built into the dashboard, so the amount of line-of-sight movement can be reduced.
  • In addition to the above-described traveling information and car navigation information, some HUD devices display information for supporting safe driving, such as detection of pedestrians and obstacles. For example, to display a road sign on a side road or the presence of a pedestrian, the display area of the HUD device needs to be enlarged.
  • JP 2016-91084 A and Japanese Unexamined Patent Publication No. 2016-60303
  • An object of the present invention is to provide a technology that allows a driver to easily recognize important information while increasing the display area of a head-up display device.
  • A representative head-up display device projects video onto the windshield of a vehicle and presents the driver with a virtual image superimposed on the scenery in front of the vehicle.
  • This head-up display device has a vehicle information acquisition unit, a control unit, a video display device, a mirror, a mirror driving unit, and a display distance adjustment mechanism.
  • the vehicle information acquisition unit acquires various types of vehicle information that can be detected by the vehicle.
  • the control unit controls the display of the video displayed in the display area viewed through the windshield from the driver's seat of the vehicle based on the vehicle information acquired by the vehicle information acquisition unit.
  • the video display device generates a video based on an instruction from the control unit.
  • The mirror reflects the image generated by the video display device and projects it onto the windshield.
  • the mirror driving unit changes the angle of the mirror based on an instruction from the control unit.
  • the display distance adjustment mechanism adjusts the display distance of the virtual image for the driver.
  • the display area controlled by the control unit includes a first display area and a second display area that is an area above the first display area.
  • the first display area is an area for displaying augmented reality.
  • the second display area is an area that does not display augmented reality.
  • the first display area displays the first information
  • the second display area displays the second information having a lower priority than the first information.
  • the first information is safe driving support information that is information that supports safe driving
  • the second information is driving support information that is information that supports driving behavior.
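  • As a rough, non-authoritative illustration of the display-area split summarized above, the short Python sketch below routes content to the first (AR) or second (non-AR) display area by priority; the class and field names are hypothetical and are not taken from the publication.

```python
from dataclasses import dataclass
from enum import Enum

class Area(Enum):
    FIRST = "first"    # lower area: AR display, safe-driving support information
    SECOND = "second"  # upper area: non-AR display, lower-priority driving support information

@dataclass
class ContentItem:
    name: str
    is_safety_critical: bool  # e.g. pedestrian/vehicle alerts, lane guidance

def route(item: ContentItem) -> Area:
    # Higher-priority safe-driving support info goes to the AR (first) area;
    # lower-priority assistance info (congestion, sign summaries) goes above it.
    return Area.FIRST if item.is_safety_critical else Area.SECOND

if __name__ == "__main__":
    print(route(ContentItem("pedestrian_alert", True)))   # Area.FIRST
    print(route(ContentItem("traffic_jam_info", False)))  # Area.SECOND
```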
  • FIG. 1 is an explanatory diagram showing an outline of an example of an operation concept of the AR-HUD according to Embodiment 1.
  • FIG. 2 is a functional block diagram showing an overview of an overall configuration example of the AR-HUD according to Embodiment 1.
  • FIG. 3 is an explanatory diagram showing an outline of an example of a hardware configuration related to acquisition of vehicle information in the AR-HUD of FIG. 2.
  • FIG. 4 is a functional block diagram illustrating details of a configuration example of the AR-HUD in FIG. 2.
  • FIG. 5 is an explanatory diagram showing details of a configuration example of the control unit and the display distance adjustment mechanism of FIG. 4.
  • FIG. 6 is a flowchart showing an outline of an example of an initial operation in the AR-HUD of FIG. 2.
  • FIG. 7 is a flowchart showing an outline of an example of normal operation in the AR-HUD of FIG. 2.
  • FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process, which is the process of step S22 of FIG. 7.
  • FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display contents and display method of a virtual image in the AR-HUD of FIG. 2.
  • FIG. 10 is a flowchart showing an overview of an example of the flow of the display position adjustment process, which is the process of step S233 in the display video determination/change processing of FIG. 9.
  • FIG. 11 is an explanatory diagram illustrating an example of a display area by the AR-HUD in FIG. 2.
  • FIG. 12 is an explanatory diagram illustrating another example of display on the display screen using the enlarged display area of FIG. 11.
  • FIG. 17 is an explanatory diagram showing an example of the areas when the display area by the AR-HUD according to the second embodiment is enlarged in the horizontal direction and the vertical direction.
  • FIG. 18 is an explanatory diagram showing an example of display in the display area enlarged in the horizontal direction and the vertical direction of FIG. 17.
  • FIG. 19 is an explanatory diagram showing an example of display in the display area following FIG. 18.
  • FIG. 20 is an explanatory diagram showing an example of display in a second display area.
  • FIG. 21 is an explanatory diagram showing an example of display following FIG. 20.
  • FIG. 22 is an explanatory diagram illustrating another display example of FIG. 21.
  • FIG. 23 is an explanatory diagram illustrating an example of a cooperative display operation in the display area by the AR-HUD of FIG. 17.
  • FIG. 25 is an explanatory diagram illustrating an example of a display that reduces an increase in viewpoint movement in the upper part of the display area by the AR-HUD of FIG. 17.
  • FIG. 26 is an explanatory diagram illustrating an example of a display for reducing oversight of a road sign in the display area by the AR-HUD of FIG. 17.
  • FIG. 27 is an explanatory diagram showing an example of display according to a road condition risk level in the display area by the AR-HUD of FIG. 17.
  • FIG. 18 is an explanatory diagram showing an example of navigation guidance display at an intersection in the display area by the AR-HUD of FIG. 17.
  • FIG. 18 is an explanatory diagram illustrating an example of a menu for customizing display in a display area by AR-HUD in FIG. 17.
  • FIG. 1 is an explanatory diagram showing an outline of an example of an operation concept in a HUD device (hereinafter sometimes referred to as “AR-HUD”) that realizes an AR function according to the first embodiment.
  • The AR-HUD 1, which is a head-up display device, reflects an image displayed on a video display device 30 such as a projector or an LCD (Liquid Crystal Display) with a mirror 51 and a mirror 52, and projects it onto the windshield 3 of the vehicle 2.
  • The mirror 51 and the mirror 52 are, for example, free-form surface mirrors or mirrors whose shape is asymmetric with respect to the optical axis.
  • By viewing the image projected onto the windshield 3, the driver 5 perceives it as a virtual image located in front of the transparent windshield 3.
  • By adjusting the position of the image projected onto the windshield 3, the display position of the virtual image viewed by the driver 5 can be adjusted in the vertical direction.
  • It is also possible to adjust the display distance, for example displaying the virtual image nearby (for example, 2 to 3 m away) or far away (for example, 30 to 40 m away).
  • the AR function is realized by adjusting the display position and display distance so that the virtual image is superimposed on the scenery outside the vehicle (roads, buildings, people, etc.).
  • The AR-HUD 1 of the present embodiment enlarges the display area of the image projected onto the windshield 3, that is, the display area 6 shown in FIGS. 11 and 12 described later, so that a larger image can be displayed on the windshield 3. This can be realized, for example, by increasing the area of the mirror 52.
  • the enlargement of the display area 6 is not limited to this, and may be realized by other techniques.
  • FIG. 2 is a functional block diagram showing an overview of an overall configuration example of the AR-HUD according to the first embodiment.
  • the AR-HUD 1 mounted on the vehicle 2 includes a vehicle information acquisition unit 10, a control unit 20, a video display device 30, a display distance adjustment mechanism 40, a mirror driving unit 50, a mirror 52, and a speaker 60.
  • In the figure, the vehicle 2 is depicted as a passenger car; however, it is not limited to this, and the invention can be applied to vehicles in general as appropriate.
  • The vehicle information acquisition unit 10 includes information acquisition devices, such as the various sensors described later, installed in each part of the vehicle 2. It detects events that occur in the vehicle 2 and detects and acquires the values of various parameters related to the traveling state at predetermined intervals, thereby acquiring and outputting the vehicle information 4.
  • As shown in the figure, the vehicle information 4 may include, for example, speed information and gear information of the vehicle 2, steering wheel angle information, lamp lighting information, outside light information, distance information, infrared information, engine ON/OFF information, camera video information of the inside and outside of the vehicle, acceleration/gyro information, GPS (Global Positioning System) information, navigation information, vehicle-to-vehicle communication information, road-to-vehicle communication information, and the like.
  • The control unit 20 has a function of controlling the operation of the AR-HUD 1 and is implemented by, for example, a CPU (Central Processing Unit) and software executed by it. It may also be implemented by hardware such as a microcomputer or an FPGA (Field Programmable Gate Array).
  • The control unit 20 drives the video display device 30 based on the vehicle information 4 acquired from the vehicle information acquisition unit 10 to generate the video to be displayed as a virtual image, reflects it appropriately with the mirror 52, and projects it onto the windshield 3. It also performs control such as adjusting the position of the virtual image display area 6 and adjusting the display distance of the virtual image.
  • The video display device 30 is a device configured by, for example, a projector or an LCD; it generates the video for displaying the virtual image based on an instruction from the control unit 20 and projects or displays it.
  • the display distance adjustment mechanism 40 is a mechanism for adjusting the distance of the displayed virtual image from the driver 5 based on an instruction from the control unit 20.
  • the mirror driving unit 50 adjusts the angle of the mirror 52 based on an instruction from the control unit 20 and adjusts the position of the virtual image display area 6 in the vertical direction.
  • The speaker 60 performs audio output related to the AR-HUD 1, for example, voice guidance of the navigation system or audio output when notifying the driver 5 by the AR function.
  • FIG. 3 is an explanatory diagram showing an outline of an example of a hardware configuration relating to acquisition of the vehicle information 4 in the AR-HUD of FIG. 2.
  • The vehicle information 4 is acquired, for example, by information acquisition devices such as the various sensors connected to an ECU (Electronic Control Unit) 21 and operating under its control.
  • These information acquisition devices include, for example, a vehicle speed sensor 101, a shift position sensor 102, a steering wheel angle sensor 103, a headlight sensor 104, an illuminance sensor 105, a chromaticity sensor 106, a distance measuring sensor 107, an infrared sensor 108, an engine start sensor 109, an acceleration sensor 110, a gyro sensor 111, a temperature sensor 112, a road-to-vehicle communication wireless receiver 113, a vehicle-to-vehicle communication wireless receiver 114, a camera (inside the vehicle) 115, a camera (outside the vehicle) 116, a GPS receiver 117, and a VICS (Vehicle Information and Communication System: road traffic information communication system, registered trademark (hereinafter the same)) receiver 118.
  • Not all of these devices need to be provided; the vehicle information 4 that can be acquired by the devices that are provided is used as appropriate.
  • the vehicle speed sensor 101 acquires speed information of the vehicle 2 in FIG.
  • the shift position sensor 102 acquires current gear information of the vehicle 2.
  • the steering wheel angle sensor 103 acquires steering wheel angle information.
  • the headlight sensor 104 acquires lamp lighting information related to ON / OFF of the headlight.
  • the illuminance sensor 105 and the chromaticity sensor 106 acquire external light information.
  • the distance measuring sensor 107 acquires distance information between the vehicle 2 and an external object.
  • the infrared sensor 108 acquires infrared information related to the presence / absence and distance of an object at a short distance of the vehicle 2.
  • the engine start sensor 109 detects engine ON / OFF information.
  • the acceleration sensor 110 and the gyro sensor 111 acquire acceleration gyro information including acceleration and angular velocity as information on the posture and behavior of the vehicle 2.
  • the temperature sensor 112 acquires temperature information inside and outside the vehicle.
  • The road-to-vehicle communication wireless receiver 113 and the vehicle-to-vehicle communication wireless receiver 114 respectively acquire road-to-vehicle communication information received by road-to-vehicle communication between the vehicle 2 and roads, signs, traffic signals, and the like, and vehicle-to-vehicle communication information received by vehicle-to-vehicle communication with other vehicles around the vehicle 2.
  • the camera (inside the vehicle) 115 and the camera (outside the vehicle) 116 capture the moving image of the situation inside and outside the vehicle, and acquire the camera image information inside the vehicle and the camera image information outside the vehicle, respectively.
  • The camera (inside the vehicle) 115 photographs, for example, the posture of the driver 5 shown in FIG. 1. By analyzing the obtained moving image, it is possible to grasp, for example, the degree of fatigue of the driver 5 and the position of the line of sight.
  • the camera (outside the vehicle) 116 photographs the surrounding situation such as the front and rear of the vehicle 2.
  • By analyzing the obtained moving images, it is possible to grasp, for example, the presence or absence of moving objects such as other vehicles and people in the vicinity, buildings and terrain, road surface conditions such as rain, snow, freezing, and unevenness, and road signs.
  • The GPS receiver 117 and the VICS receiver 118 respectively acquire GPS information obtained by receiving GPS signals and VICS information obtained by receiving VICS signals. They may be implemented as part of a car navigation system that acquires and uses these pieces of information.
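  • For orientation only, the following Python sketch shows one way the vehicle information 4 gathered from these sensors might be bundled into a single record; the field names are assumptions made for illustration, not the data format of the publication.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

@dataclass
class VehicleInfo:
    # Values polled at a fixed interval from the devices connected to the ECU.
    speed_kmh: Optional[float] = None             # vehicle speed sensor 101
    gear: Optional[str] = None                    # shift position sensor 102
    steering_angle_deg: Optional[float] = None    # steering wheel angle sensor 103
    headlight_on: Optional[bool] = None           # headlight sensor 104
    ambient_lux: Optional[float] = None           # illuminance sensor 105
    ambient_chromaticity: Optional[Tuple[float, float]] = None  # chromaticity sensor 106
    distance_to_object_m: Optional[float] = None  # distance measuring sensor 107
    engine_on: Optional[bool] = None              # engine start sensor 109
    accel_gyro: Optional[Tuple[float, ...]] = None  # acceleration sensor 110 / gyro sensor 111
    gps_position: Optional[Tuple[float, float]] = None  # GPS receiver 117
    camera_outside_frame: Optional[bytes] = None  # camera (outside the vehicle) 116
    vics_events: list = field(default_factory=list)  # VICS receiver 118
```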
  • FIG. 4 is a functional block diagram showing details of the configuration example of the AR-HUD in FIG. 2.
  • In this example, the video display device 30 is a projector and includes, for example, a light source 31, an illumination optical system 32, and a display element 33.
  • the light source 31 is a member that generates projection illumination light.
  • a high-pressure mercury lamp, a xenon lamp, an LED (Light Emitting Diode) light source, a laser light source, or the like can be used.
  • the illumination optical system 32 is an optical system that collects the illumination light generated by the light source 31 and irradiates the display element 33 with more uniform illumination light.
  • the display element 33 is an element that generates an image to be projected.
  • a transmissive liquid crystal panel, a reflective liquid crystal panel, a DMD (Digital Micromirror Device) (registered trademark) panel, or the like can be used.
  • The control unit 20 includes an ECU 21, an audio output unit 22, a nonvolatile memory 23, a memory 24, a light source adjustment unit 25, a distortion correction unit 26, a display element driving unit 27, a display distance adjustment unit 28, a mirror adjustment unit 29, and the like.
  • The ECU 21 acquires the vehicle information 4 via the vehicle information acquisition unit 10 and records, stores, and reads the acquired information in and from the nonvolatile memory 23 and the memory 24 as necessary.
  • the nonvolatile memory 23 may store setting information such as setting values and parameters for various controls. Further, the ECU 21 generates video data relating to a virtual image to be displayed as the AR-HUD 1 by executing a dedicated program.
  • the audio output unit 22 outputs audio information via the speaker 60 as necessary.
  • the light source adjustment unit 25 adjusts the light emission amount of the light source 31 of the video display device 30. When there are a plurality of light sources 31, they may be controlled individually.
  • the distortion correction unit 26 corrects image distortion caused by the curvature of the windshield 3 by image processing when the image generated by the ECU 21 is projected onto the windshield 3 of the vehicle 2 by the image display device 30.
  • the display element drive unit 27 sends a drive signal corresponding to the video data corrected by the distortion correction unit 26 to the display element 33 to generate an image to be projected.
  • the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to adjust the display distance of the image projected from the image display device 30.
  • The mirror adjustment unit 29 changes the angle of the mirror 52 via the mirror driving unit 50 and moves the virtual image display area 6 up and down.
  • FIG. 5 is an explanatory diagram showing details of an example of the configuration of the control unit and the display distance adjustment mechanism of FIG. 4.
  • The display distance adjustment unit 28 of the control unit 20 further includes, as units individually controlled by the ECU 21, for example, a functional liquid crystal film ON/OFF control unit 281, a lens movable unit 282, a dimming mirror ON/OFF control unit 283, a diffusion plate movable unit 284, an optical filter movable unit 285, and the like.
  • Correspondingly, the display distance adjustment mechanism 40 further includes, for example, a functional liquid crystal film 401, a lens movable mechanism 402, a dimming mirror 403, a diffusion plate movable mechanism 404, an optical filter movable mechanism 405, and the like.
  • The AR-HUD 1 does not have to include all of these units and devices; it only needs to include the units required for the virtual-image display distance adjustment techniques that are actually applied.
  • FIG. 6 is a flowchart showing an outline of an example of the initial operation in the AR-HUD of FIG. 2.
  • When the power of the AR-HUD 1 is turned on by turning on the ignition switch of the stopped vehicle 2 (S01), the AR-HUD 1 acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S02).
  • The control unit 20 then calculates a suitable brightness level based on the external light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (S03), and the light source adjustment unit 25 sets the light emission amount of the light source 31 so that the calculated brightness level is obtained (S04). For example, when the outside light is bright, the brightness level is set high, and when the outside light is dark, the brightness level is set low.
  • the ECU 21 determines and generates a video to be displayed as a virtual image, for example, an initial image (S05), and performs a process of correcting the distortion by the distortion correction unit 26 for the generated video (S06).
  • Then, the display element 33 is driven and controlled by the display element driving unit 27 to generate the projected image (S07).
  • As a result, the image is projected onto the windshield 3, and the driver 5 can visually recognize the virtual image.
  • The ECU 21 or the display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S08), and the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S09).
  • Thereafter, a HUD-ON signal is output, and the control unit 20 determines whether or not this signal has been received (S10). If it has not been received, the control unit 20 waits a further predetermined time for the HUD-ON signal (S11), and this waiting process (S11) is repeated until it is determined in step S10 that the HUD-ON signal has been received.
  • If it is determined in step S10 that the HUD-ON signal has been received, the normal operation of the AR-HUD 1 described later is started (S12), and the series of initial operations ends.
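  • The initial-operation flow of FIG. 6 (steps S01 to S12) can be paraphrased as the following Python sketch; method names such as acquire_vehicle_info are placeholders, and this is only a reading of the flowchart, not the actual implementation.

```python
import time

def initial_operation(hud) -> None:
    """Rough paraphrase of steps S01-S12: runs once after ignition-on powers the HUD."""
    info = hud.acquire_vehicle_info()                    # S02
    level = hud.calc_brightness_level(info.ambient_lux)  # S03: brighter outside -> higher level
    hud.set_light_source_level(level)                    # S04
    video = hud.generate_initial_image(info)             # S05
    video = hud.correct_distortion(video)                # S06
    hud.drive_display_element(video)                     # S07
    distance = hud.decide_display_distance(info)         # S08
    hud.apply_display_distance(distance)                 # S09
    while not hud.hud_on_signal_received():              # S10
        time.sleep(hud.poll_interval_s)                  # S11: wait and re-check
    hud.start_normal_operation()                         # S12
```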
  • FIG. 7 is a flowchart showing an outline of an example of the normal operation in the AR-HUD of FIG. 2.
  • During normal operation, the AR-HUD 1 acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S21). The control unit 20 then performs a brightness level adjustment process based on the external light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (S22).
  • FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process, which is the process of step S22 of FIG. 7.
  • First, a suitable brightness level is calculated based on the acquired external light information (S221). Then, by comparing it with the currently set brightness level, it is determined whether or not the brightness level needs to be changed (S222). If no change is necessary, the brightness level adjustment process ends.
  • If a change is necessary, the light source adjustment unit 25 controls the light emission amount of the light source 31 to set the changed brightness level (S223), and the brightness level adjustment process ends.
  • In step S222, even when there is a difference between the suitable brightness level calculated in step S221 and the currently set brightness level, it may be determined that the brightness level needs to be changed only when the difference is equal to or greater than a predetermined threshold value.
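  • A minimal sketch of the brightness level adjustment of FIG. 8 (steps S221 to S223), including the optional change threshold mentioned above; the mapping from illuminance to brightness level is invented purely for illustration.

```python
def calc_brightness_level(ambient_lux: float) -> int:
    # Hypothetical mapping: brighter outside light -> higher brightness level.
    if ambient_lux > 10000:
        return 10
    if ambient_lux > 1000:
        return 7
    return 3

def adjust_brightness(current_level: int, ambient_lux: float, threshold: int = 2) -> int:
    """S221-S223: recompute the preferred level and change it only if the
    difference from the current setting is at least `threshold` steps."""
    preferred = calc_brightness_level(ambient_lux)   # S221
    if abs(preferred - current_level) < threshold:   # S222: no change needed
        return current_level
    return preferred                                 # S223: light source re-set to the new level
```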
  • Next, based on the latest vehicle information 4 acquired in step S21, the ECU 21 changes the video to be displayed as a virtual image from the current one as necessary, and determines and generates the changed video (S23).
  • In this process, adjustment and correction are performed to maintain the visibility and appropriateness of the display contents according to the traveling state of the vehicle 2.
  • Then, a mirror adjustment process is performed in which the angle of the mirror 52 is changed via the mirror driving unit 50 to move the virtual image display area 6 up and down (S24). Thereafter, a vibration correction process is performed to correct the display position of the video in the display area 6 against the vibration of the vehicle 2 (S25).
  • The distortion correction unit 26 then performs distortion correction processing on the adjusted and corrected video (S26), after which the display element driving unit 27 drives and controls the display element 33 to generate the projected image (S27). Subsequently, the display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S28) and drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S29).
  • When the display is to be ended, a HUD-OFF signal is output to the AR-HUD 1, and it is determined whether or not this signal has been received (S30).
  • If the HUD-OFF signal has not been received, the process returns to step S21, and the series of normal operations is repeated until the HUD-OFF signal is received. If it is determined that the HUD-OFF signal has been received, the series of normal operations ends.
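  • For reference, the normal operation of FIG. 7 (steps S21 to S30) has roughly the loop structure sketched below; as before, the method names are assumptions rather than the publication's terms.

```python
def normal_operation(hud) -> None:
    """Rough paraphrase of steps S21-S30, repeated until the HUD-OFF signal arrives."""
    while not hud.hud_off_signal_received():           # S30
        info = hud.acquire_vehicle_info()               # S21
        hud.adjust_brightness(info.ambient_lux)         # S22 (see FIG. 8)
        video = hud.determine_display_video(info)       # S23 (see FIG. 9)
        hud.adjust_mirror(info)                         # S24: move display area 6 up/down
        video = hud.correct_for_vibration(video, info)  # S25
        video = hud.correct_distortion(video)           # S26
        hud.drive_display_element(video)                # S27
        distance = hud.decide_display_distance(info)    # S28
        hud.apply_display_distance(distance)            # S29
```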
  • AR-HUD 1 adjusts the display contents and display method of the virtual image itself to be displayed according to the situation such as the scenery in front of the vehicle 2 in addition to the adjustment of the virtual image display area 6 and the adjustment of the display distance. Thereby, it is possible to superimpose a more appropriate virtual image on the front landscape in a more appropriate position and manner.
  • FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display contents and display method of the virtual image in the AR-HUD of FIG. 2.
  • The generation and display of the display contents of the virtual image in the AR-HUD 1 are performed by the process of step S05 in the initial operation of FIG. 6 and the process of step S23 in the normal operation of FIG. 7.
  • Here, the processing content is shown by taking the display video determination/change processing of step S23 in the normal operation of FIG. 7 as an example.
  • the ECU 21 performs standard content generation processing (S231) and event content generation processing (S232).
  • the standard content basically refers to content such as a vehicle speed display that is always displayed in the display area 6 while the vehicle 2 is traveling.
  • the event content refers to content such as an alert display that is displayed as necessary based on the driving situation including the scenery situation in front of the vehicle 2. In any case, a plurality of contents may be generated.
  • Next, the ECU 21 adjusts the generated content in relation to the front scenery grasped from the camera video information by performing a display position adjustment process (S233) and a display color adjustment process (S234).
  • Then, display video data for each adjusted content is generated (S235), and the process ends. The video data generated here is projected by the display element driving unit 27 in the subsequent processing of step S27 in FIG. 7.
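  • The display video determination/change processing of FIG. 9 (steps S231 to S235) could be sketched as follows; the split into standard content and event content mirrors the description above, while the helper names are hypothetical.

```python
def determine_display_video(hud, info):
    """S231-S235: build the content list, then adjust position/color against the front scene."""
    contents = []
    contents += hud.generate_standard_contents(info)   # S231: e.g. vehicle speed, always shown
    contents += hud.generate_event_contents(info)      # S232: e.g. alerts, shown as needed
    contents = hud.adjust_display_positions(contents, info.camera_outside_frame)  # S233 (FIG. 10)
    contents = hud.adjust_display_colors(contents, info.camera_outside_frame)     # S234
    return hud.render_video(contents)                  # S235: video data projected at step S27
```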
  • FIG. 10 is a flowchart showing an outline of an example of the flow of the display position adjustment process, which is the process of step S233 in the display video determination/change processing of FIG. 9.
  • First, the camera video information of the outside of the vehicle in the vehicle information 4 obtained in step S21 of FIG. 7 is analyzed to determine whether or not the scenery ahead contains an object that should not be hidden, and the coordinates of such objects are extracted (S3100). Applicable objects include, for example, a traffic signal, a pedestrian, a two-wheeled vehicle, and a preceding vehicle, in addition to a curve mirror and a road sign.
  • Next, the current display position of each standard content and each event content generated in steps S231 and S232 of FIG. 9 is collated with the coordinates of each object (S3200). Then, it is determined whether or not the display position of each content needs to be adjusted, that is, whether or not the display of each content obstructs the object (S3300). For example, the display position of an arrow graphic or alert display is compared with the position of a curve mirror or road sign by image processing or the like to determine whether or not the curve mirror or road sign is hidden.
  • If it is determined in step S3300 that adjustment of the display position is unnecessary, the process ends. If it is determined that the display position needs to be adjusted, the display position of the target content is adjusted and moved, and a new display position is set (S3400).
  • the method for determining the position to move the content is not particularly limited. Instead of or in addition to the adjustment of the position of the content, the display size of the content may be reduced so that the object is not hidden (S3500).
  • After making these adjustments, the process returns to step S3200, and the series of processing described above is repeated until no further adjustment of the display position of each content is necessary. Since another object may exist at the position to which the content is moved, the adjustment must be repeated until the display position is appropriate.
  • In this way, the virtual image content can be displayed while avoiding the objects, and the visibility of the objects can be improved.
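  • A hedged sketch of the display position adjustment of FIG. 10 (steps S3100 to S3500): content that would hide an object such as a traffic signal, pedestrian, or road sign is moved or shrunk, and the check is repeated until no conflict remains. The rectangle overlap test, the shrink factor, and the iteration cap are illustrative assumptions.

```python
from typing import Dict, List, Tuple

Box = Tuple[float, float, float, float]  # (x, y, width, height) in display coordinates

def overlaps(a: Box, b: Box) -> bool:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def adjust_positions(contents: List[Dict], objects: List[Box]) -> List[Dict]:
    """Each content dict carries a 'box'; objects are detected items that must stay visible."""
    for _ in range(20):  # safety cap; the flowchart simply repeats until no conflict remains
        conflict = False
        for content in contents:
            for obj in objects:
                if overlaps(content["box"], obj):
                    x, y, w, h = content["box"]
                    # S3400: move the content below the object (and, as in S3500,
                    # optionally shrink it) so that the object is no longer hidden.
                    content["box"] = (x, obj[1] + obj[3] + 0.05, 0.8 * w, 0.8 * h)
                    conflict = True
        if not conflict:
            break  # S3300: no content obstructs any object any more
    return contents
```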
  • The display examples of the AR-HUD 1 described below are realized by the AR-HUD 1 executing the processes shown in FIGS. 7 to 10 described above.
  • FIG. 11 is an explanatory diagram showing an example of the display area by the AR-HUD 1 of FIG. 2.
  • FIG. 11 schematically shows an example of the display area 6 that the driver 5 of the vehicle 2 visually recognizes from the driver's seat through the windshield 3.
  • FIG. 11(a) shows an example in which the display area 6 is not enlarged in the horizontal direction, and FIG. 11(b) shows an example in which it is enlarged in the direction horizontal to the road surface (hereinafter, the horizontal direction).
  • By enlarging the display area 6 in the horizontal direction with respect to the road surface, a virtual image can be superimposed on a wider area outside the vehicle.
  • a virtual image can be superimposed and displayed on a vehicle traveling on the opposite lane or a pedestrian on the sidewalk.
  • FIG. 12 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area of FIG. 11 is enlarged.
  • FIG. 12 schematically illustrates an example of a front landscape viewed by the driver 5 of the vehicle 2 from the driver's seat through the windshield 3 and a state of a virtual image in the display area 6 projected on the windshield 3. It is shown.
  • the dotted line in the display area 6 indicates the display area when the display area 6 is not enlarged in the horizontal direction.
  • FIG. 12 shows a state in which an arrow 300, a vehicle display 301, a pedestrian display 302, and the like are superimposed on the scenery outside the vehicle in the display area 6.
  • The instrument icon showing the vehicle speed and the like ("30 km/h" in the figure) may be referred to as the "vehicle speed display" below.
  • the arrow 300 is an arrow for instructing and navigating the traveling direction of the vehicle 2 and is displayed superimposed on the road on which the vehicle 2 is traveling.
  • the vehicle display 301 is a display indicating that there is a running vehicle. In the example of FIG. 12, the vehicle display 301 is circular and is displayed so as to surround the vehicle 200 traveling in the oncoming lane.
  • the shape of the vehicle display 301 is not particularly limited, and may be any shape other than a circle, such as a triangle or a quadrangle.
  • the pedestrian display 302 indicates that there is a pedestrian 201 walking on a sidewalk and the traveling direction of the pedestrian 201.
  • a circular pedestrian display 302 is displayed superimposed on the feet of the pedestrian 201, and the traveling direction of the pedestrian 201 is indicated by an arrow.
  • the shape of the pedestrian display 302 is not particularly limited as in the vehicle display 301.
  • the arrow 300 is generated based on the navigation information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • the vehicle display 301 and the pedestrian display 302 are generated based on camera video information and infrared information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • the vehicle speed display is generated based on the speed information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • Since the display area 6 is expanded in the horizontal direction, the driver 5 can perceive, for example, the pedestrian display 302 indicating the pedestrian 201 and the vehicle display 301 in the oncoming lane even while the vehicle 2 is traveling straight, which contributes to safe driving.
  • If the display area were not enlarged, the pedestrian display 302 would appear only when the vehicle 2 turns left and the pedestrian 201 enters the display area indicated by the dotted line, so the timing at which the driver 5 is made aware of the pedestrian 201 would be delayed.
  • FIG. 13 is an explanatory diagram showing an example when there are a plurality of roads that can turn left when the vehicle turns left by the navigation guidance.
  • FIG. 14 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area in the road configuration of FIG. 13 is enlarged.
  • FIG. 13 shows a case where there are two roads that can be turned onto to the left within a short distance, and the vehicle 2 turns left not onto the nearer road but onto the farther road.
  • When the display area is not enlarged, the road onto which the vehicle turns left lies outside the display area and is not included in it.
  • In contrast, with the display area 6 expanded in the horizontal direction as indicated by the solid line, not only the road onto which the vehicle 2 turns left but also the left-turnable road before it are included in the display area.
  • an arrow 300 which is a guidance display for instructing / navigating the traveling direction and the like can be displayed in a superimposed manner on the road on the far side to turn left.
  • In the display area 6 of FIG. 14, a vehicle speed display ("30 km/h" in the figure), the distance to the road onto which the vehicle turns left ("20 m" in the figure), and a vehicle display 301 are also displayed as virtual images.
  • Here, a left turn has been described as an example, but for a right turn the arrow 300 and the like can likewise be displayed superimposed on the road onto which the vehicle turns right. If the road on the right-turn side is not included in the display area 6, the display area 6 may be further expanded in the horizontal direction so that it is included.
  • Without the enlargement, the dotted-line range shown in FIG. 14 would be the display area, and the arrow would have to be displayed within that dotted-line area.
  • In that case, since the road onto which the vehicle should turn left is not included in the display area, it is not clear which of the roads is the one to turn onto, and the driver 5 may be confused.
  • With the enlarged display area, the driver 5 can accurately grasp that the left turn is onto the farther road, not the nearer one. This is particularly effective when there are several left-turnable roads within a short distance or when the roads have complicated shapes.
  • As a result, the driver 5 can accurately grasp the road on which the vehicle 2 should travel and can concentrate on driving without being confused, which contributes to safe driving.
  • FIG. 15 is an explanatory view showing another display example of a guidance screen based on navigation information in the enlarged display area 6 of FIG. 11.
  • FIG. 15 also schematically shows an example of the front landscape viewed from the driver's seat by the driver 5 of the vehicle 2 and the state of the virtual image in the display area 6 projected on the windshield.
  • FIG. 15 shows a display example when the vehicle 2 approaches an intersection where entry is prohibited except in the designated directions.
  • a sign 202 is provided on the front left side of the vehicle 2.
  • This sign 202 is a road sign indicating entry prohibition other than the designated direction. Therefore, at the intersection, the vehicle 2 can only travel straight ahead and cannot enter either the left-turn road or the right-turn road.
  • Therefore, an entry prohibition display 303 indicating that the road cannot be entered is displayed superimposed on the left-turn road at the intersection.
  • The entry prohibition display 303, which is a regulation/instruction display, is generated by recognizing the meaning of the sign 202 and the like from the outside camera video information that the ECU 21 acquires from the vehicle information acquisition unit 10, based on the recognition result.
  • Because the display area 6 is enlarged, the road on the left-turn side of the intersection is included in the display area 6, and the entry prohibition display 303 can be displayed there.
  • Here, the sign 202 indicating entry prohibition other than the designated direction has been described as an example, but for other road signs, such as regulation signs and instruction signs, a corresponding regulation/instruction display is likewise performed in the display area 6 according to the recognized road sign.
  • A display such as the entry prohibition display 303 shown in FIG. 15 is particularly effective when the driver 5 misses a road sign.
  • Compared with displaying an icon simulating a road sign (for example, entry prohibition other than the designated direction), displaying the entry prohibition display 303 as shown in FIG. 15 allows the driver 5 to recognize more quickly that the road should not be entered. That is, since the entry prohibition display 303 is superimposed on the actual road and looks as if a pseudo wall were standing there, it intuitively notifies the driver that the road should not be entered.
  • FIG. 16 is an explanatory view showing another display example in the enlarged display area 6 of FIG. 11.
  • FIG. 16 shows a state in which the vehicle 200 is traveling in the opposite lane of the lane in which the vehicle 2 is traveling, and the vehicle 203 is traveling in front of the vehicle 2 in the lane in which the vehicle 2 is traveling.
  • a vehicle display 301 and a vehicle display 305 are displayed.
  • In addition, the traveling speed of the oncoming vehicle ("30 km/h" in the figure) and the inter-vehicle distance to the vehicle 203 traveling ahead of the vehicle 2 ("10 m" in the figure) are displayed as virtual images.
  • the inter-vehicle distance is generated based on the distance information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • the vehicle display 301 is a display indicating that there is a vehicle 200 traveling in the oncoming lane
  • the vehicle display 305 is a display indicating that there is a vehicle 203 traveling in front of the vehicle 2.
  • the vehicle display 301 is formed, for example, in a circular shape so as to be superimposed so as to surround the vehicle 200.
  • the vehicle display 305 also has a circular shape, for example, and is displayed so as to surround the vehicle 203.
  • The vehicle displays 301 and 305 are, for example, color-coded so that a vehicle traveling ahead in the own lane can be distinguished from a vehicle traveling in the oncoming lane, that is, an oncoming vehicle.
  • In the figure, this color coding is represented by the presence or absence of hatching.
  • the vehicle displays 301 and 305 may be any display that can distinguish whether the vehicle is running ahead or on the opposite lane, for example, by changing the shape, in addition to the color coding.
  • With the AR-HUD 1 in which the display area 6 is expanded in the horizontal direction, virtual images such as the arrow 300, the pedestrian display 302, and the vehicle display 305 can be displayed even on the oncoming lane, the sidewalk, and intersecting roads. This makes it easier to grasp the course accurately and to recognize the vehicles 200 and 203 traveling ahead or oncoming and the pedestrian 201, which contributes to safe driving.
  • In the second embodiment, the display area 6 of the AR-HUD 1 is expanded not only in the horizontal direction but also in the vertical direction, thereby increasing the amount of information that can be displayed vertically. Traffic assistance information, which is the second information, is displayed in the upper part of the display area 6, and priority information, which is the first information, is displayed in the lower part of the display area 6.
  • Traffic assistance information is information that assists driving operations such as traffic jam information, road information, or intersection information.
  • the traffic jam information is information indicating the traffic jam status on the road.
  • the road information is information indicating a lane change, for example.
  • the intersection information is information such as the name of the intersection.
  • the priority information is information related to the driving operation prioritized over the traffic assistance information, for example, the situation of other vehicles, pedestrians, road alignment, etc., which is displayed as a virtual image superimposed on the real scene.
  • FIG. 17 is an explanatory diagram showing an example of a region when the display region 6 by the AR-HUD 1 according to the second embodiment is expanded in the horizontal direction and the vertical direction.
  • As shown in FIG. 17(a), when the display area 6 is expanded in both the horizontal and vertical directions, new room for information display is created in the upper and lower parts, in addition to the display area 6 expanded in the horizontal direction in the first embodiment.
  • the area indicated by hatching in FIG. 17A is a new display area created by further expanding the display area 6 expanded in the horizontal direction in the vertical direction.
  • In the display area 6 expanded both horizontally and vertically, the area close to its upper end corresponds, from the viewpoint of the driver 5, to scenery such as the sky and buildings that has little to do with the driving operation.
  • the area below the upper part of the display area 6 is an area where the driver 5 obtains real scene information directly connected to the driving operation such as roads, traffic conditions, or road signs.
  • Therefore, the upper part of the display area 6 is set as a second display area 6a, indicated by hatching in FIG. 17(b), and the lower part of the display area 6 is set as a first display area 6b, indicated by dots in FIG. 17(b).
  • the second display area 6a displays traffic assistance information consisting of auxiliary information for assisting driving operation, information notifying the operation status of airplanes, railways, etc., information notifying incoming calls such as telephone calls and mails, and the like.
  • priority information that is important information directly related to the driving operation is displayed in the first display area 6b.
  • In other words, the second display area 6a is an area in which mainly non-AR virtual images are displayed, and the first display area 6b is an area in which display is performed mainly by AR (Augmented Reality). In the following, the traffic assistance information displayed in the second display area 6a is assumed to be a non-AR virtual image, and the priority information displayed in the first display area 6b is assumed to be a virtual image display by AR.
  • The third display area 6c is the area surrounded by the dotted line in FIG. 17(b); here, too, mainly non-AR virtual images are displayed.
  • The second display area 6a for displaying the traffic assistance information and the like only needs to be an area that does not block the real-scene information directly connected to the driving operation of the driver 5, and it is not limited to an area close to the upper end of the display area 6.
  • The second display area 6a and/or the third display area 6c may be enlarged and the first display area 6b reduced; conversely, the second display area 6a and/or the third display area 6c may be reduced or eliminated so that the entire area becomes the first display area 6b.
  • When enlarging or reducing these areas, the enlargement or reduction may be based, for example, on information obtained by imaging the situation ahead with the camera. That is, when the presence of a vehicle or a pedestrian is detected ahead, an AR virtual image must be displayed for that vehicle or pedestrian, so control is performed to keep the first display area 6b wide without enlarging the second display area 6a and/or the third display area 6c.
  • Conversely, when this is not the case, control may be performed to enlarge the second display area 6a and/or the third display area 6c and reduce the first display area 6b.
  • Alternatively, the areas may be enlarged or reduced based on the vehicle speed information.
  • For example, the second display area 6a and/or the third display area 6c may be enlarged and the first display area 6b reduced when traveling at low speed or when stopped, while when traveling at high speed, control may be performed to eliminate the second display area 6a and/or the third display area 6c or to reduce them as much as possible.
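  • Under stated assumptions, the resizing policy described above (keep the first display area wide when something is detected ahead or the vehicle is fast, allow the second and third areas to grow when nothing is detected or the vehicle is slow or stopped) might look like the sketch below; the speed limits and area fractions are invented values.

```python
def split_display_area(obstacle_ahead: bool, speed_kmh: float) -> dict:
    """Return the vertical share of the display given to each sub-area (fractions sum to 1)."""
    if obstacle_ahead or speed_kmh >= 60:
        # Keep the AR (first) area as large as possible; shrink or drop the non-AR areas.
        return {"second": 0.05, "first": 0.90, "third": 0.05}
    if speed_kmh <= 15:
        # Low speed or stopped: more room for assistance information above and below.
        return {"second": 0.25, "first": 0.50, "third": 0.25}
    return {"second": 0.15, "first": 0.70, "third": 0.15}
```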
  • FIG. 18 is an explanatory diagram showing an example of display in the display area enlarged in the horizontal direction and the vertical direction as in FIG. 17.
  • FIG. 19 is an explanatory diagram showing an example of display in the display area following FIG. 18.
  • FIG. 18(a) shows the vehicle 2 traveling with no traffic jam, and FIG. 18(b) shows a display example of the display area 6 during the traveling of FIG. 18(a).
  • FIG. 19(a) shows that a traffic jam has occurred ahead of the traveling vehicle 2, and FIG. 19(b) shows a display example of the display area 6 informing the driver that a traffic jam has occurred; traffic jam information on the road is displayed when a traffic jam occurs.
  • In FIG. 18(b), sign information 310 indicating the vehicle traffic classification is displayed as a virtual image in the second display area 6a as traffic assistance information.
  • As in FIG. 19(a), when it is detected that a traffic jam has occurred ahead of the vehicle 2, the sign information 310 and traffic jam information 311 are displayed in the second display area 6a as traffic assistance information, as shown in FIG. 19(b).
  • the sign information 310 is generated based on the camera video information acquired by the ECU 21 from the vehicle information acquisition unit 10.
  • the traffic jam information 311 is generated based on GPS information and VICS information.
  • Traffic assistance information such as the sign information 310 and the traffic jam information 311 merely assists driving and does not take priority over the real scene directly connected to the driving operation.
  • If the sign information 310 and the traffic jam information 311 were displayed in an area where the real scene directly connected to the driving operation of the driver 5 is visible, the line of sight of the driver 5 might concentrate on that information and be diverted from where it should be.
  • By displaying the information in the second display area 6a, it can be read without obstructing the field of view directly connected to the driving operation, and with relatively little viewpoint movement from the part of the real scene on which the viewpoint is most concentrated during driving. Therefore, traffic assistance information such as traffic jam information can be displayed while reducing the distraction of the driver's attention and without drawing the viewpoint away for a long time from the real scene that should be watched while driving. Further, while the vehicle 2 is traveling, displaying only simple traffic jam information, as shown in FIG. 19(b), can reduce the decrease in the driver's attention.
  • FIG. 20 is an explanatory diagram showing an example of display in the second display area
  • FIG. 21 is an explanatory diagram showing an example of display following FIG. 20.
  • FIG. 20 and FIG. 21 show display examples of guidance screens based on navigation information. Here, a case where the vehicle 2 is traveling straight and turns left at the next intersection according to navigation guidance is shown.
  • While the vehicle 2 is traveling straight and the distance to the intersection where it will turn left is greater than the threshold value, only the sign information 310 indicating the vehicle traffic classification is displayed as a virtual image in the second display area 6a as traffic assistance information, as shown in FIG. 20.
  • the threshold value is a distance for determining the timing for displaying the left turn guidance screen.
  • When the distance to the intersection becomes equal to or less than the threshold value, a left-turn guidance screen is displayed: guidance information 312 indicating the left turn at the intersection is displayed as priority information, superimposed as AR on the road being traveled.
  • the threshold value determination process and the guidance information 312 generation process are performed based on GPS information acquired from the vehicle information acquisition unit 10 by the ECU 21.
  • FIGS. 20 and 21 show an example in which the guidance screen is switched in two steps according to a single threshold value, but the switching may also be performed in more steps using a plurality of threshold values. For example, when the distance to the intersection to turn left is larger than threshold value 1, traffic assistance information is displayed; when it is equal to or less than threshold value 1 and larger than threshold value 2, traffic assistance information is likewise displayed; and when it is equal to or less than threshold value 2, the guidance information 312 indicating the left turn at the intersection is superimposed on the road and displayed as AR.
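  • The single-threshold switching of FIGS. 20 and 21 can be sketched as below; the threshold distance and the content labels are placeholders, and the multi-threshold variant described above would simply add further distance bands.

```python
def select_guidance_display(distance_to_turn_m: float, threshold_m: float = 100.0) -> dict:
    """Single-threshold case of FIGS. 20-21 (the threshold value here is illustrative).

    Far from the intersection: only traffic assistance info (e.g. sign information 310)
    in the second display area. At or below the threshold: also superimpose the AR
    guidance information 312 on the road in the first display area."""
    display = {"second_area": ["sign_info_310"], "first_area": []}
    if distance_to_turn_m <= threshold_m:
        display["first_area"].append("guidance_info_312")  # AR arrow over the travelled road
    return display
```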
  • FIG. 22 is a diagram showing another display example of FIG. 21.
  • In FIG. 22, detailed information 313, including detailed traffic jam information and a detour proposal, is displayed in the second display area 6a as traffic assistance information. When the detailed information 313 does not fit in the second display area 6a because of the large amount of information, it may be displayed so as to extend over the first display area 6b.
  • FIG. 23 is an explanatory diagram showing an example of the cooperative display operation in the display area by the AR-HUD of FIG. 17.
  • FIG. 23 shows a display example of the transition from the state in which the sign information 310 indicating the vehicle traffic classification is displayed as traffic assistance information in the second display area 6a while the vehicle 2 is traveling, to the state in which the guidance information 312 is displayed as AR priority information in the first display area 6b.
  • the guidance information 312 in FIG. 23 is a lane change guidance display that prompts the user to move from the lane in which the vehicle 2 is traveling to the leftmost lane.
  • In "phase2", when the distance to the intersection where the vehicle 2 turns left reaches 250 m, the display of the sign information 310 in the second display area 6a is gradually faded out, while in the first display area 6b the guidance information 312 prompting a lane change to the left lane is superimposed on the road and gradually faded in as an AR display.
  • The display shown in "phase2" is executed when the distance to the intersection to turn left is, for example, about 250 m, but the distance is not limited to this.
  • In "phase3", when the distance from the vehicle 2 to the intersection where it turns left reaches 200 m, the display of the sign information 310 in the second display area 6a is erased, and in the first display area 6b the guidance information 312 prompting the lane change to the left lane is displayed at full intensity.
  • The display of "phase3" is performed when the distance to the left-turn intersection is, for example, about 200 m, but the distance is not limited to this.
  • When the display returns to the original state, the guidance information 312 is gradually faded out while the sign information 310 is gradually faded back in; finally, the display of the guidance information 312 is turned off and the sign information 310 is fully displayed. That is, the display proceeds in the order "phase3", "phase2", "phase1", the reverse of FIG. 23. In this case as well, the display switching timings from "phase3" to "phase2" and from "phase2" to "phase1" may be set according to the distance as in FIG. 23, or they may be changed.
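  • The phased hand-over between the second and first display areas ("phase1" to "phase3", and the reverse order when returning) can be sketched as a distance-driven cross-fade; the 250 m and 200 m breakpoints are the example values from the text, and the linear fade is an assumption.

```python
def crossfade_opacity(distance_to_turn_m: float,
                      start_m: float = 250.0, end_m: float = 200.0) -> tuple:
    """Return (sign_info_opacity, guidance_info_opacity), each in [0, 1].

    phase1: farther than start_m   -> sign information 310 fully shown, guidance 312 hidden.
    phase2: between start_m/end_m  -> sign info fades out while guidance fades in.
    phase3: closer than end_m      -> guidance 312 fully shown, sign info hidden.
    Evaluating the same function for a distance that grows again yields the reverse
    sequence (phase3 -> phase2 -> phase1)."""
    if distance_to_turn_m >= start_m:
        return 1.0, 0.0
    if distance_to_turn_m <= end_m:
        return 0.0, 1.0
    t = (start_m - distance_to_turn_m) / (start_m - end_m)  # 0 at phase1 boundary, 1 at phase3
    return 1.0 - t, t
```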
  • FIG. 24 is an explanatory diagram showing an example of the case where the viewpoint position of the virtual image display in the second display area is near.
  • FIG. 25 is an explanatory diagram showing an example of a display that reduces an increase in viewpoint movement in the second display area by the AR-HUD of FIG. 17.
  • As described above, the second display area 6a corresponds to scenery such as the sky or buildings that is not related to the driving operation, and much of the traffic assistance information and the like displayed in the second display area 6a does not need to be superimposed on the real scene.
  • Depending on how it is displayed, however, the viewpoint movement of the driver 5 may become large and, as a result, driving may be hindered. For example, as shown in FIG. 24, when the viewpoint position DD of the driver 5 is about 30 m away but the viewpoint position VD of the sign information 310, the virtual-image traffic assistance information displayed in the second display area 6a, is only about 2 m away, a large viewpoint movement from 30 m in the distance to 2 m nearby becomes necessary.
  • eyestrain is promoted or accidents are induced by overlooking information that must be seen during driving.
  • the viewpoint position DD of the driver 5 is about 30 m
  • the viewpoint position VD of the sign information 310 that is the traffic auxiliary information of the virtual image displayed in the second display area 6a is Similarly, when the distance is about 30 m, as shown in the lower part of FIG. 25, the viewpoint position DD and the viewpoint position VD are substantially the same position, so the driver 5 recognizes the traffic assistance information without moving the viewpoint forward and backward. be able to.
  • the movement of the viewpoint before and after can be greatly reduced, so that the information transmission speed can be improved, and the driver's 5 eyestrain and distraction of attention, etc. Can be reduced.
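As a minimal sketch of this idea, the virtual image distance of the second display area could simply be set to the driver's current viewpoint distance, clamped to the range the display distance adjustment mechanism can cover; the function name, the 2 m / 40 m limits, and the interface are assumptions made only for illustration.

    def choose_display_distance(driver_viewpoint_m, near_limit_m=2.0, far_limit_m=40.0):
        """Place the virtual image of the second display area at roughly the distance
        the driver is already looking at, so that no forward/backward viewpoint
        movement is needed, within the adjustable range of the mechanism."""
        return max(near_limit_m, min(driver_viewpoint_m, far_limit_m))

    # Driver looking about 30 m ahead: show the sign information 310 at about 30 m
    # instead of 2 m, avoiding the 30 m -> 2 m refocusing described above.
    print(choose_display_distance(30.0))   # 30.0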
FIG. 26 is an explanatory diagram showing an example of a display for reducing oversight of a road sign in the display area by the AR-HUD of FIG. 17.
As shown in FIG. 26, at a point where the viewpoint distance of the driver 5 and the distance from the vehicle 2 to the sign 202, which is a road sign, are substantially the same, a virtual image of the sign display 315 is superimposed on the sign 202. The sign 202 is a guide sign installed in a part of the scenery such as the sky, so the virtual image of the sign display 315 is superimposed on the sign 202 at the moment the sign 202 naturally enters the field of view of the driver 5. For example, when the viewpoint distance of the driver 5 is about 30 m, the sign display 315 is displayed when the distance from the vehicle 2 to the sign approaches about 30 m.
The sign display 315 is a virtual image display indicating that there is a road sign. FIG. 26 shows an example in which the sign 202 is enclosed in a circle and an arrow drawing attention to the sign display 315 is displayed as a virtual image, as the sign display 316. The shapes of the virtual images of the sign displays 315 and 316 are not particularly limited and may be any shapes.
A traffic sign such as the sign 202 is recognized from the camera image information that the ECU 21 acquires from the vehicle information acquisition unit 10, and the distance to the sign 202 is recognized from the infrared information or the like that the ECU 21 acquires from the vehicle information acquisition unit 10.
If the sign display were superimposed only when the sign 202 is close, the driver 5 would have to change the focus position considerably, and recognition would take time. Moreover, when the distance to the sign 202 is as short as about 5 m, the vehicle may pass the sign 202 before it is recognized. That is, the virtual image shown in FIG. 26 is more effective when it is displayed while the sign is still far away, for example about 30 m away. When the distance to the sign becomes short, for example about 5 m, superimposing a virtual image on the sign should be avoided, because the virtual image makes the sign itself difficult to see.
In this way, the driver 5 can recognize the sign 202 without moving the viewpoint back and forth, and oversight of the sign 202 can be reduced.
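The timing rule for the sign display 315 described above can be sketched as follows; the 30 m and 5 m values are the examples given in the text, while the function name and the tolerance parameter are hypothetical.

    def should_show_sign_display(distance_to_sign_m, driver_viewpoint_m=30.0,
                                 tolerance_m=5.0, suppress_below_m=5.0):
        """Superimpose the sign display 315 only while the distance to the road sign
        roughly matches the driver's viewpoint distance, and never when the sign is
        already very close (the virtual image would hide the sign itself)."""
        if distance_to_sign_m <= suppress_below_m:
            return False
        return abs(distance_to_sign_m - driver_viewpoint_m) <= tolerance_m

    print(should_show_sign_display(30.0))  # True: sign roughly at the viewpoint distance
    print(should_show_sign_display(5.0))   # False: too close, do not cover the sign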
FIG. 27 is an explanatory diagram showing an example of display according to the road condition risk level in the display area by the AR-HUD of FIG. 17.
FIG. 27 shows an example of the scenery in front of the driver's seat viewed through the windshield 3 and the state of the virtual images in the display area 6 projected onto the windshield 3. The state shown on the left side of FIG. 27 is the state with the lowest risk level in the road conditions (hereinafter referred to as the low risk level), and the state shown on the right side of FIG. 27 is the state with the highest risk level (hereinafter referred to as the high risk level). The state shown in the central part of FIG. 27 is a state in which the risk is higher than the low risk level and lower than the high risk level (hereinafter referred to as the medium risk level).
On the left side of FIG. 27, guidance is being provided by navigation: the guidance information 312 indicating that the vehicle is to turn left at the next intersection is displayed as a virtual image, and the pedestrian 201 is walking on the sidewalk on the right side beyond the opposite lane. In this case, the risk level is set low.
At the low risk level, the guidance information 312 indicating that the vehicle is to turn left at the next intersection is displayed superimposed on the road of the traveling lane, and a pedestrian warning display 320 indicating that there is a pedestrian is displayed superimposed on the pedestrian 201 on the sidewalk.
The pedestrian warning display 320 has, for example, a shape such as an arrow and is displayed so that the arrow points to the pedestrian 201. The pedestrian warning display 320 is displayed when a pedestrian is detected and the distance from the vehicle 2 to the pedestrian 201 is equal to or greater than a preset distance, or when the pedestrian 201 is on the opposite-lane side. When the vehicle 2 passes the pedestrian 201, the pedestrian warning display 320 is erased. The shape of the pedestrian warning display 320 is not particularly limited and may be other than an arrow.
In the central part of FIG. 27, the pedestrian 201 has moved to the vicinity of the pedestrian crossing at the intersection where the vehicle 2 turns left. In this case, as in the figure described earlier, a pedestrian display 302 is superimposed on the pedestrian 201 in the display area 6. At the medium risk level, the distance from the vehicle 2 to the pedestrian 201 is shorter than the preset distance, or the pedestrian 201 is in the vicinity of the pedestrian crossing.
When the risk level is low, the guidance information 312 is displayed superimposed on the road, but when the risk level is medium, the guidance information 312 is not superimposed along the road; it is instead displayed in the second display area 6a. In other words, the guidance information 312 is moved upward, and the presence of the pedestrian 201 is emphasized more strongly.
On the right side of FIG. 27, the pedestrian 201 has moved onto the roadway on which the vehicle 2 travels rather than the sidewalk. In this case, the risk level is the highest. At the high risk level, the guidance information 312 that was displayed in the medium risk state is deleted from the display area 6.
Instead, a pedestrian display 302 is superimposed on the pedestrian 201, and warning information 321, a virtual image warning that the pedestrian 201 is nearby and the danger is high, is displayed superimposed near the pedestrian 201. Furthermore, in order to impress on the driver 5 even more strongly that the risk is high, the guidance by navigation is stopped and the guidance information 312 is deleted. When the high risk state is resolved, the pedestrian display 302 and the warning information 321 are deleted, and the guidance by navigation is resumed.
In this way, the driver 5 is encouraged to pay attention to the warning event (in this case, a pedestrian). When the risk is not high, guidance by navigation can be continued in parallel with the warning, so the driver 5 can be prevented from getting lost. Furthermore, since the navigation guidance moves seamlessly from the lower part of the display area (the first display area) to the upper part (the second display area), or from the upper part to the lower part, according to the risk level, the amount of change in the display is smaller than when the navigation guidance suddenly appears or disappears, and the driver 5 can be prevented from being startled by the change in the display and having his or her attention distracted.
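The three risk levels of FIG. 27 and the corresponding display elements can be summarized in a small decision sketch; the distance threshold, the near_crossing and on_roadway flags, and all names are hypothetical placeholders for whatever the ECU 21 actually derives from the camera image and distance information.

    def plan_risk_display(pedestrian_distance_m, near_crossing, on_roadway,
                          warning_distance_m=30.0):
        """Return (risk_level, display_elements) following the FIG. 27 example."""
        if on_roadway:
            # High risk: warn only; navigation guidance is stopped and erased.
            return "high", ["pedestrian_display_302", "warning_information_321"]
        if pedestrian_distance_m < warning_distance_m or near_crossing:
            # Medium risk: the guidance moves up into the second display area 6a.
            return "medium", ["pedestrian_display_302", "guidance_312_in_area_6a"]
        # Low risk: guidance stays superimposed on the road, an arrow points at the pedestrian.
        return "low", ["guidance_312_on_road_6b", "pedestrian_warning_display_320"]

    print(plan_risk_display(50.0, near_crossing=False, on_roadway=False)[0])  # low
    print(plan_risk_display(20.0, near_crossing=True,  on_roadway=False)[0])  # medium
    print(plan_risk_display(10.0, near_crossing=True,  on_roadway=True)[0])   # high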
FIG. 28 is an explanatory diagram showing an example of navigation guidance display at an intersection in the display area by the AR-HUD of FIG. 17.
FIG. 28 shows the traveling states of the vehicle 2: the left side shows the vehicle 2 traveling before the intersection, that is, before entering the intersection; the central part shows the vehicle 2 turning left at the intersection, that is, traveling within the intersection; and the right side shows the traveling state of the vehicle 2 after completing the left turn at the intersection, that is, after passing through the intersection.
Before the intersection, as shown on the left side of FIG. 28, the guidance information 312 indicating that the vehicle is to turn left at the next intersection is AR-displayed superimposed on the road. Then, as shown in the central part of FIG. 28, when the vehicle 2 enters the intersection, the guidance information 312 displayed in the first display area 6b is erased, and new guidance information 312 indicating the driving action is displayed in the second display area 6a. Alternatively, while the vehicle is within the intersection, the guidance information 312 may not be displayed in the second display area 6a at all.
Whether or not the vehicle 2 has entered the intersection is determined based on, for example, the steering wheel steering angle information, the navigation information, and the camera image information outside the vehicle that the ECU 21 acquires from the vehicle information acquisition unit 10.
After passing through the intersection, as shown on the right side of FIG. 28, the guidance information 312 is displayed again in the first display area 6b. Here, the guidance information 312 is information indicating that the vehicle travels straight along the road after the left turn.
In this way, when turning at the intersection, the objects to which attention should be paid while traveling through the intersection are not obstructed by the virtual image display, so the driver 5 can drive in a safe state. Furthermore, since the navigation guidance moves seamlessly from the lower part of the display area (the first display area) to the upper part (the second display area), or from the upper part to the lower part, according to the traveling state at the intersection, the amount of change in the display is smaller than when the navigation guidance suddenly appears or disappears, and the driver 5 can be prevented from being startled by the change in the display and having his or her attention distracted.
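A minimal sketch of moving the guidance between the two display areas around an intersection, in the manner of FIG. 28; the in_intersection flag stands in for the judgment based on steering angle, navigation, and camera information, and the names are assumptions made for illustration.

    def place_guidance(in_intersection, hide_in_intersection=False):
        """Decide where the guidance information 312 is shown around an intersection."""
        if in_intersection:
            # While turning, keep the AR area clear so that objects inside the
            # intersection are not obstructed: move the guidance up, or hide it.
            return None if hide_in_intersection else "second_display_area_6a"
        # Before entering and after passing through, the guidance is AR-displayed on the road.
        return "first_display_area_6b"

    print(place_guidance(in_intersection=False))  # first_display_area_6b
    print(place_guidance(in_intersection=True))   # second_display_area_6a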
FIG. 29 is an explanatory diagram showing an example of a menu for customizing the display in the display area by the AR-HUD of FIG. 17.
As shown in FIG. 29, menus of "upper and lower two-level display", "upper display", and "lower display" are provided as examples of display selection in the display area 6. In the "upper and lower two-level display", virtual images are displayed in both the second display area 6a and the first display area 6b described above. The "upper display" shows only the second display area 6a, that is, only the traffic auxiliary information, which includes auxiliary information for assisting the driving operation. The "lower display" shows only the first display area 6b, that is, only the priority information, which is important information directly related to the driving operation.
The driver 5 can arbitrarily select the content to be displayed by choosing the display method of the virtual images in the display area 6 from the menu of FIG. 29. Although the menu of FIG. 29 only selects which display areas within the display area 6 are used, various other menus may be prepared, such as changing the transmittance of the information displayed in the second display area 6a or changing the display size of the information. The selection menu is not limited to the above example and may include other items.
In this way, the driver 5 can easily distinguish and recognize the display of the traffic assistance information and of the priority information that is more important than the traffic assistance information, and both can be brought to the driver's attention without hindering driving. This can contribute to safe driving.
In addition, a part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.

Abstract

[Problem] To increase the size of the display of a head-up display device and make it easier for a driver to recognize important information. [Solution] An AR-HUD 1, wherein a vehicle information acquisition unit 10 acquires various types of vehicle information 4 that can be detected by a vehicle 2. A control unit 20 controls the display of an image to be displayed on a display region visible through a windshield 3 from the driver seat of the vehicle 2 on the basis of the vehicle information 4 acquired by the vehicle information acquisition unit 10. An image display device 30 generates an image on the basis of the instruction from the control unit 20. A display distance adjusting mechanism 40 adjusts the display distance of a virtual image with respect to a driver. The display region controlled by the control unit 20 has a first display region and a second display region located above the first display region. The first display region is a region displaying augmented reality, and the second display region is a region not displaying augmented reality.

Description

Head-up display device
 本発明は、ヘッドアップディスプレイ装置の技術に関し、特に、AR(Augmented Reality:拡張現実)を利用したヘッドアップディスプレイ装置に適用して有効な技術に関する。 The present invention relates to a technology for a head-up display device, and more particularly to a technology effective when applied to a head-up display device using AR (Augmented Reality).
 自動車などの車両においては、フロントガラス(ウィンドシールド)などに情報を投射して表示するヘッドアップディスプレイ(Head Up Display、以下では「HUD」と記載する場合がある)装置が用いられていることが知られている。 In vehicles such as automobiles, a head-up display (Head-Up Display, hereinafter sometimes referred to as “HUD”) device that projects and displays information on a windshield (windshield) or the like is used. Are known.
 このHUD装置は、車速やエンジン回転数などの走行情報、あるいはカーナビゲーションなどの情報などを上述したようにフロントガラスに投射するものである。運転者は、ダッシュボードに組み込まれる計器盤、いわゆるインパネなどに視線を移動することなく情報を確認することができ、視線の移動量を低減させることができる。 This HUD device projects driving information such as vehicle speed and engine speed or information such as car navigation onto the windshield as described above. The driver can check information without moving the line of sight to an instrument panel incorporated in the dashboard, such as a so-called instrument panel, and the amount of movement of the line of sight can be reduced.
 近年、HUD装置には、上述した走行情報やカーナビゲーションの情報に加えて、歩行者や障害物の検知などの安全運転を支援する情報などを表示するものがある。例えば、側道の道路標識や歩行者の存在などを表示する際には、HUD装置を大画面化することが求められる。 In recent years, some HUD devices display information for supporting safe driving such as detection of pedestrians and obstacles in addition to the above-described traveling information and car navigation information. For example, when displaying a road sign on a side road or the presence of a pedestrian, it is required to enlarge the HUD device.
 なお、この種のHUD装置による表示技術については、例えば前側方からの接近障害物を検出したときに不必要に警報を行ってしまうことを防止するもの(例えば特許文献1参照)、あるいは複数の表示装置間で統一感のある情報を観察者に対して提供し、情報を観察者に認識させやすくするもの(例えば特許文献2参照)などがある。 In addition, about the display technique by this kind of HUD apparatus, for example, it is possible to prevent an unnecessary alarm from occurring when an approaching obstacle from the front side is detected (for example, see Patent Document 1), or a plurality of display techniques. There is one that provides information with a sense of unity between display devices to an observer so that the information can be easily recognized by the observer (see, for example, Patent Document 2).
特開2016-91084号公報JP 2016-91084 A 特開2016-60303号公報Japanese Unexamined Patent Publication No. 2016-60303
If the display area of the HUD is enlarged in order to superimpose a large amount of information on the actual scenery outside the vehicle seen through the windshield, a large amount of information can indeed be displayed; on the other hand, the driver constantly sees a large amount of information and feels annoyed by it.
In addition, displaying a large amount of information on a large display screen causes problems such as distracting the driver's attention or preventing important safety-related information from being recognized.
An object of the present invention is to provide a technology that allows a driver to recognize important information more easily while enlarging the display area of a head-up display device.
The above and other objects and novel features of the present invention will become apparent from the description of this specification and the accompanying drawings.
Of the inventions disclosed in this application, representative ones are briefly outlined as follows.
That is, a representative head-up display device projects an image onto the windshield of a vehicle, thereby displaying to the driver a virtual image superimposed on the scenery in front of the vehicle.
This head-up display device has a vehicle information acquisition unit, a control unit, a video display device, a mirror, a mirror driving unit, and a display distance adjustment mechanism. The vehicle information acquisition unit acquires various types of vehicle information that the vehicle can detect.
The control unit controls, based on the vehicle information acquired by the vehicle information acquisition unit, the display of the image shown in the display area that is viewed from the driver's seat of the vehicle through the windshield. The video display device generates the image based on an instruction from the control unit.
The mirror reflects the image generated by the video display device and projects it onto the windshield. The mirror driving unit changes the angle of the mirror based on an instruction from the control unit. The display distance adjustment mechanism adjusts the display distance of the virtual image with respect to the driver.
The display area controlled by the control unit includes a first display area and a second display area that is above the first display area. The first display area is an area in which augmented reality is displayed. The second display area is an area in which augmented reality is not displayed.
In particular, the first display area displays first information, and the second display area displays second information having a lower priority than the first information. The first information is safe driving support information, that is, information that supports safe driving, and the second information is driving support information, that is, information that supports driving behavior.
Among the inventions disclosed in this application, the effects obtained by representative ones are briefly described as follows.
(1) Information necessary for safe driving can be displayed accurately in accordance with the traveling state of the vehicle.
(2) By virtue of (1) above, the invention can contribute to safe driving.
FIG. 1 is an explanatory diagram showing an outline of an example of the operation concept of the AR-HUD according to Embodiment 1.
FIG. 2 is a functional block diagram showing an overview of the overall configuration example of the AR-HUD according to Embodiment 1.
FIG. 3 is an explanatory diagram showing an outline of an example of the hardware configuration related to acquisition of vehicle information in the AR-HUD of FIG. 2.
FIG. 4 is a functional block diagram showing details of a configuration example of the AR-HUD of FIG. 2.
FIG. 5 is an explanatory diagram showing details of an example of the configuration of the control unit and the display distance adjustment mechanism of FIG. 4.
FIG. 6 is a flowchart showing an outline of an example of the initial operation in the AR-HUD of FIG. 2.
FIG. 7 is a flowchart showing an outline of an example of the normal operation in the AR-HUD of FIG. 2.
FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process of step S22 of FIG. 7.
FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display contents and display method of a virtual image in the AR-HUD of FIG. 2.
FIG. 10 is a flowchart showing an outline of an example of the flow of the display position adjustment process of step S233 in the display video determination/change processing of FIG. 9.
FIG. 11 is an explanatory diagram showing an example of the display area by the AR-HUD of FIG. 2.
FIG. 12 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area of FIG. 11 is enlarged.
FIG. 13 is an explanatory diagram showing an example of a case where there are a plurality of roads into which a left turn is possible when the vehicle turns left according to navigation guidance.
FIG. 14 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area is enlarged in the road configuration of FIG. 13.
FIG. 15 is an explanatory diagram showing another display example of a guidance screen based on navigation information using the enlarged display area of FIG. 11.
FIG. 16 is an explanatory diagram showing another example of display on the display screen using the enlarged display area of FIG. 11.
FIG. 17 is an explanatory diagram showing an example of the areas when the display area of the AR-HUD according to Embodiment 2 is enlarged in the horizontal and vertical directions.
FIG. 18 is an explanatory diagram showing an example of display in the display area enlarged in the horizontal and vertical directions of FIG. 17.
FIG. 19 is an explanatory diagram showing an example of display in the display area, continuing from FIG. 18.
FIG. 20 is an explanatory diagram showing an example of display in the second display area.
FIG. 21 is an explanatory diagram showing an example of display continuing from FIG. 20.
FIG. 22 is an explanatory diagram showing another display example of FIG. 19.
FIG. 23 is an explanatory diagram showing an example of the cooperative display operation in the display area by the AR-HUD of FIG. 17.
FIG. 24 is an explanatory diagram showing an example, based on the inventors' study, of a case where the viewpoint position of the virtual image display in the upper part of the display area is near.
FIG. 25 is an explanatory diagram showing an example of a display that reduces the increase in viewpoint movement in the upper part of the display area by the AR-HUD of FIG. 17.
FIG. 26 is an explanatory diagram showing an example of a display for reducing oversight of a road sign in the display area by the AR-HUD of FIG. 17.
FIG. 27 is an explanatory diagram showing an example of display according to the road condition risk level in the display area by the AR-HUD of FIG. 17.
FIG. 28 is an explanatory diagram showing an example of navigation guidance display at an intersection in the display area by the AR-HUD of FIG. 17.
FIG. 29 is an explanatory diagram showing an example of a menu for customizing the display in the display area by the AR-HUD of FIG. 17.
In all the drawings for explaining the embodiments, the same members are, in principle, denoted by the same reference numerals, and repeated descriptions thereof are omitted. In order to make the drawings easier to understand, hatching may be added even in plan views.
(Embodiment 1)
Hereinafter, the embodiments will be described in detail.

<AR-HUD operation concept>
FIG. 1 is an explanatory diagram showing an outline of an example of the operation concept of a HUD device that realizes an AR function (hereinafter sometimes referred to as "AR-HUD") according to the first embodiment.
As shown in FIG. 1, the AR-HUD 1, which is a head-up display device, reflects an image displayed on a video display device 30 such as a projector or an LCD (Liquid Crystal Display) with mirrors 51 and 52 and projects it onto the windshield 3 of the vehicle 2. The mirrors 51 and 52 are, for example, free-form surface mirrors or mirrors having a shape that is asymmetric with respect to the optical axis.
By viewing the image projected on the windshield 3, the driver 5 visually recognizes the image as a virtual image in front of the transparent windshield 3. In the present embodiment, for example, by adjusting the angle of the mirror 52, the position of the image projected on the windshield 3 is adjusted, and the display position of the virtual image viewed by the driver 5 can be adjusted in the vertical direction.
It is also possible to adjust the display distance, for example, displaying the virtual image nearby (for example, 2 to 3 m ahead) or far away (for example, 30 to 40 m ahead). The AR function is realized by adjusting the display position and the display distance so that the virtual image is superimposed on the scenery outside the vehicle (roads, buildings, people, and so on).
In addition, in the AR-HUD 1 of the present embodiment, the display area of the image projected on the windshield 3, that is, the display area 6 shown in FIGS. 11 and 12 described later, is enlarged, so that more information can be displayed on the windshield 3. This can be realized, for example, by increasing the area of the mirror 52 or the like. The enlargement of the display area 6 is not limited to this and may be realized by other techniques.

<AR-HUD configuration example>
FIG. 2 is a functional block diagram showing an overview of the overall configuration example of the AR-HUD according to the first embodiment.
As shown in FIG. 2, the AR-HUD 1 mounted on the vehicle 2 includes a vehicle information acquisition unit 10, a control unit 20, a video display device 30, a display distance adjustment mechanism 40, a mirror driving unit 50, a mirror 52, a speaker 60, and so on. In the example of FIG. 2, the shape of the vehicle 2 is depicted as a passenger car, but the invention is not limited to this and can be applied to vehicles in general as appropriate.
The vehicle information acquisition unit 10 consists of information acquisition devices, such as the various sensors described later, installed in each part of the vehicle 2. It acquires and outputs the vehicle information 4 by detecting various events that occur in the vehicle 2 and by detecting and acquiring the values of various parameters related to the traveling state at predetermined intervals.
As illustrated, the vehicle information 4 may include, for example, speed information and gear information of the vehicle 2, steering wheel steering angle information, lamp lighting information, outside light information, distance information, infrared information, engine ON/OFF information, camera image information from inside and outside the vehicle, acceleration gyro information, GPS (Global Positioning System) information, navigation information, vehicle-to-vehicle communication information, and road-to-vehicle communication information.
The control unit 20 has the function of controlling the operation of the AR-HUD 1 and is implemented by, for example, a CPU (Central Processing Unit) and the software executed by it. It may also be implemented by hardware such as a microcomputer or an FPGA (Field Programmable Gate Array).
As illustrated in FIG. 2, the control unit 20 drives the video display device 30 to generate a video to be displayed as a virtual image based on the vehicle information 4 acquired from the vehicle information acquisition unit 10 and the like, and projects it onto the windshield 3 by reflecting it appropriately with the mirror 52 and the like. The control unit 20 also performs control such as adjusting the display position of the virtual image display area 6 and adjusting the display distance of the virtual image.
As described above, the video display device 30 is a device composed of, for example, a projector or an LCD, and it generates and projects or displays a video for displaying a virtual image based on instructions from the control unit 20.
The display distance adjustment mechanism 40 is a mechanism for adjusting the distance of the displayed virtual image from the driver 5 based on an instruction from the control unit 20. The mirror driving unit 50 adjusts the angle of the mirror 52 based on an instruction from the control unit 20 and thereby adjusts the position of the virtual image display area 6 in the vertical direction.
The speaker 60 performs audio output related to the AR-HUD 1. For example, it can output the voice guidance of a navigation system or a sound for notifying the driver 5 of a warning or the like by the AR function.
FIG. 3 is an explanatory diagram showing an outline of an example of the hardware configuration related to acquisition of the vehicle information 4 in the AR-HUD of FIG. 2.
Here, the hardware configuration of the vehicle information acquisition unit 10 and a part of the control unit 20 is mainly shown. The vehicle information 4 is acquired, for example, by information acquisition devices such as various sensors connected to an ECU (Electronic Control Unit) 21, under the control of the ECU 21.
These information acquisition devices include, for example, a vehicle speed sensor 101, a shift position sensor 102, a steering wheel steering angle sensor 103, a headlight sensor 104, an illuminance sensor 105, a chromaticity sensor 106, a distance measuring sensor 107, an infrared sensor 108, an engine start sensor 109, an acceleration sensor 110, a gyro sensor 111, a temperature sensor 112, a wireless receiver 113 for road-to-vehicle communication, a wireless receiver 114 for vehicle-to-vehicle communication, a camera (inside the vehicle) 115, a camera (outside the vehicle) 116, a GPS receiver 117, and a VICS (Vehicle Information and Communication System, a registered trademark; the same applies hereinafter) receiver 118.
It is not always necessary to provide all of these devices, and other types of devices may be provided. The vehicle information 4 that can be acquired by the devices actually provided can be used as appropriate.
The vehicle speed sensor 101 acquires the speed information of the vehicle 2 in FIG. 2. The shift position sensor 102 acquires the current gear information of the vehicle 2. The steering wheel steering angle sensor 103 acquires the steering wheel steering angle information.
The headlight sensor 104 acquires lamp lighting information related to ON/OFF of the headlights. The illuminance sensor 105 and the chromaticity sensor 106 acquire outside light information. The distance measuring sensor 107 acquires distance information between the vehicle 2 and external objects.
The infrared sensor 108 acquires infrared information relating to the presence or absence of, and the distance to, objects in the short range around the vehicle 2. The engine start sensor 109 detects engine ON/OFF information.
The acceleration sensor 110 and the gyro sensor 111 acquire acceleration gyro information, consisting of acceleration and angular velocity, as information on the posture and behavior of the vehicle 2. The temperature sensor 112 acquires temperature information inside and outside the vehicle.
The wireless receiver 113 for road-to-vehicle communication acquires road-to-vehicle communication information received by road-to-vehicle communication between the vehicle 2 and roads, signs, traffic signals, and the like, and the wireless receiver 114 for vehicle-to-vehicle communication acquires vehicle-to-vehicle communication information received by vehicle-to-vehicle communication between the vehicle 2 and other vehicles in the vicinity.
The camera (inside the vehicle) 115 and the camera (outside the vehicle) 116 capture moving images of the situation inside and outside the vehicle and acquire camera image information of the inside and of the outside of the vehicle, respectively. The camera (inside the vehicle) 115 captures, for example, the posture of the driver 5 in FIG. 1 and the position and movement of the eyes. By analyzing the obtained moving images, it is possible to grasp, for example, the fatigue state of the driver 5 and the position of the line of sight.
The camera (outside the vehicle) 116 captures the surrounding situation, such as the areas in front of and behind the vehicle 2. By analyzing the obtained moving images, it is possible to grasp, for example, the presence or absence of moving objects such as other vehicles and people in the vicinity, buildings and terrain, road surface conditions such as rain, snow, freezing, and unevenness, and road signs.
The GPS receiver 117 and the VICS receiver 118 acquire GPS information obtained by receiving GPS signals and VICS information obtained by receiving VICS signals, respectively. They may be implemented as part of a car navigation system that acquires and uses these pieces of information.
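Purely for illustration, the kinds of vehicle information 4 listed above could be grouped into a simple container such as the following; the field names are hypothetical and do not correspond to any interface defined in this specification.

    from dataclasses import dataclass, field
    from typing import Optional, Tuple

    @dataclass
    class VehicleInfo:
        """A sketch of the vehicle information 4 gathered by the acquisition unit 10."""
        speed_kmh: Optional[float] = None               # vehicle speed sensor 101
        gear: Optional[str] = None                      # shift position sensor 102
        steering_angle_deg: Optional[float] = None      # steering angle sensor 103
        headlights_on: Optional[bool] = None            # headlight sensor 104
        outside_illuminance_lx: Optional[float] = None  # illuminance sensor 105
        distance_to_object_m: Optional[float] = None    # distance measuring sensor 107
        engine_on: Optional[bool] = None                # engine start sensor 109
        gps_position: Optional[Tuple[float, float]] = None  # GPS receiver 117
        navigation_route: list = field(default_factory=list)  # navigation information

    info = VehicleInfo(speed_kmh=48.0, gear="D", headlights_on=False)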
FIG. 4 is a functional block diagram showing details of a configuration example of the AR-HUD of FIG. 2.
The example of FIG. 4 shows a case where the video display device 30 is a projector; the video display device 30 includes units such as a light source 31, an illumination optical system 32, and a display element 33.
The light source 31 is a member that generates illumination light for projection; for example, a high-pressure mercury lamp, a xenon lamp, an LED (Light Emitting Diode) light source, or a laser light source can be used.
The illumination optical system 32 is an optical system that collects the illumination light generated by the light source 31, makes it more uniform, and irradiates the display element 33 with it. The display element 33 is an element that generates the image to be projected; for example, a transmissive liquid crystal panel, a reflective liquid crystal panel, or a DMD (Digital Micromirror Device) (registered trademark) panel can be used.
More specifically, the control unit 20 includes units such as the ECU 21, an audio output unit 22, a nonvolatile memory 23, a memory 24, a light source adjustment unit 25, a distortion correction unit 26, a display element driving unit 27, a display distance adjustment unit 28, and a mirror adjustment unit 29.
As shown in FIG. 3, the ECU 21 acquires the vehicle information 4 via the vehicle information acquisition unit 10, and records, stores in, and reads out from the nonvolatile memory 23 and the memory 24 the acquired information as necessary.
The nonvolatile memory 23 may store setting information such as setting values and parameters for various controls. The ECU 21 also generates the video data relating to the virtual image to be displayed by the AR-HUD 1, for example by executing a dedicated program.
The audio output unit 22 outputs audio information via the speaker 60 as necessary. The light source adjustment unit 25 adjusts the amount of light emitted by the light source 31 of the video display device 30. When there are a plurality of light sources 31, they may be controlled individually.
The distortion correction unit 26 corrects, by image processing, the distortion of the image caused by the curvature of the windshield 3 when the image generated by the ECU 21 is projected onto the windshield 3 of the vehicle 2 by the video display device 30. The display element driving unit 27 sends a drive signal corresponding to the video data corrected by the distortion correction unit 26 to the display element 33 and causes it to generate the image to be projected.
When the display distance of the virtual image needs to be adjusted, the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to adjust the display distance of the image projected from the video display device 30. When the position of the virtual image display area 6 itself needs to be adjusted, the mirror adjustment unit 29 changes the angle of the mirror 52 via the mirror driving unit 50 and moves the virtual image display area 6 up and down.
FIG. 5 is an explanatory diagram showing details of an example of the configuration of the control unit and the display distance adjustment mechanism of FIG. 4.
In FIG. 5, the display distance adjustment unit 28 of the control unit 20 further includes, as units individually controlled by the ECU 21, for example, a functional liquid crystal film ON/OFF control unit 281, a lens movable unit 282, a dimming mirror ON/OFF control unit 283, a diffusion plate movable unit 284, and an optical filter movable unit 285.
As the hardware and devices controlled and driven by these units, the display distance adjustment mechanism 40 further includes a functional liquid crystal film 401, a lens movable mechanism 402, a dimming mirror 403, a diffusion plate movable mechanism 404, an optical filter movable mechanism 405, and the like.
Note that the AR-HUD 1 does not have to include all of these units and devices; it suffices to provide, as appropriate, the units necessary for implementing the virtual image display distance adjustment technique that is actually applied.
<Processing content>
FIG. 6 is a flowchart showing an outline of an example of the initial operation in the AR-HUD of FIG. 2.
When the ignition switch of the stopped vehicle 2 is turned on and the AR-HUD 1 is thereby powered on (S01), the AR-HUD 1 first acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S02).
Then, the control unit 20 calculates a suitable brightness level based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (S03), and the light source adjustment unit 25 controls the amount of light emitted by the light source 31 so that the calculated brightness level is obtained (S04). For example, when the outside light is bright, the brightness level is set high, and when it is dark, the brightness level is set low.
Thereafter, the ECU 21 determines and generates a video to be displayed as a virtual image, for example an initial image (S05), the distortion correction unit 26 performs distortion correction on the generated video (S06), and the display element driving unit 27 drives and controls the display element 33 to generate the image to be projected (S07).
As a result, the image is projected onto the windshield 3, and the driver 5 can visually recognize the virtual image. Thereafter, the ECU 21 or the display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S08), and the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S09).
When the startup of every part of the AR-HUD 1, including the series of initial operations described above, is completed, a HUD-ON signal is output, and the control unit 20 determines whether or not this signal has been received (S10).
If it has not been received, the control unit 20 waits for the HUD-ON signal for a certain period of time (S11), and the waiting process (S11) is repeated until it is determined in step S10 that the HUD-ON signal has been received.
If it is determined in step S10 that the HUD-ON signal has been received, the normal operation of the AR-HUD 1 described later is started (S12), and the series of initial operations ends.
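The initial operation of FIG. 6 (steps S01 to S12) can be written out as straight-line pseudocode in Python; the hud object and its method names are hypothetical stand-ins for the units described above, so this is only a sketch of the control flow.

    def initial_operation(hud):
        """Rough sketch of the initial operation S01-S12 of FIG. 6."""
        hud.power_on()                                    # S01: ignition switch ON
        info = hud.vehicle_info_unit.acquire()            # S02: acquire vehicle information 4
        level = hud.calc_brightness(info.outside_light)   # S03: suitable brightness level
        hud.light_source_adjuster.set_level(level)        # S04: drive the light source 31
        image = hud.ecu.generate_initial_image(info)      # S05: initial image
        image = hud.distortion_corrector.correct(image)   # S06: correct windshield distortion
        hud.display_element_driver.project(image)         # S07: drive the display element 33
        distance = hud.decide_display_distance(info)      # S08
        hud.display_distance_mechanism.set(distance)      # S09
        while not hud.received_hud_on_signal():           # S10
            hud.wait_for_hud_on_signal(timeout_s=1.0)     # S11: wait and retry
        hud.start_normal_operation()                      # S12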
FIG. 7 is a flowchart showing an outline of an example of the normal operation in the AR-HUD of FIG. 2.
In the normal operation as well, the basic processing flow is almost the same as the initial operation shown in FIG. 6. First, the AR-HUD 1 acquires the vehicle information 4 with the vehicle information acquisition unit 10 based on an instruction from the control unit 20 (S21). Then, the control unit 20 performs a brightness level adjustment process based on the outside light information acquired by the illuminance sensor 105, the chromaticity sensor 106, and the like in the vehicle information 4 (S22).
FIG. 8 is a flowchart showing an outline of an example of the brightness level adjustment process of step S22 of FIG. 7.
When the brightness level adjustment process is started, a suitable brightness level is first calculated based on the acquired outside light information (S221). Then, by comparing it with the currently set brightness level, it is determined whether or not the brightness level needs to be changed (S222). If no change is necessary, the brightness level adjustment process ends as it is.
On the other hand, if a change is necessary, the light source adjustment unit 25 controls the amount of light emitted by the light source 31 so that the changed brightness level is obtained (S223), and the brightness level adjustment process ends.
In step S222, even when there is a difference between the suitable brightness level calculated in step S221 and the currently set brightness level, it may be determined that the brightness level needs to be changed only when the difference is equal to or greater than a predetermined threshold value.
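A minimal sketch of the brightness level adjustment S221 to S223, including the optional threshold mentioned above; the mapping from illuminance to a 0..1 level is a made-up example, not part of the disclosure.

    def calc_suitable_level(outside_illuminance_lx, max_lx=10000.0):
        """Hypothetical mapping from outside illuminance to a 0..1 brightness level:
        brighter surroundings call for a brighter virtual image."""
        return max(0.0, min(outside_illuminance_lx / max_lx, 1.0))

    def adjust_brightness(outside_illuminance_lx, current_level, set_level, threshold=0.05):
        """Sketch of the brightness level adjustment of FIG. 8 (S221-S223)."""
        suitable = calc_suitable_level(outside_illuminance_lx)   # S221
        if abs(suitable - current_level) <= threshold:           # S222: change not needed
            return current_level
        set_level(suitable)                                      # S223: drive the light source 31
        return suitable

    new_level = adjust_brightness(8000.0, current_level=0.5, set_level=print)  # prints 0.8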
Returning to FIG. 7, the ECU 21 then changes, as necessary, the video to be displayed as a virtual image from the current one based on the latest vehicle information 4 acquired in step S21, and determines and generates the changed video (S23).
There can be many patterns for changing the display contents based on the vehicle information 4, depending on the contents of the acquired vehicle information 4 and combinations thereof. For example, the numerical value of the constantly displayed speed indication may be changed when the speed information changes, or a guidance arrow figure may be displayed or erased, or its shape and display position changed, based on the navigation information.
After that, adjustment and correction processing is performed to maintain the visibility and the appropriateness of the display contents according to the traveling state of the vehicle 2.
First, when the position of the virtual image display area 6 itself needs to be adjusted, a mirror adjustment process is performed in which the angle of the mirror 52 is changed via the mirror driving unit 50 and the virtual image display area 6 is moved up and down (S24). After that, a vibration correction process is performed to correct the display position of the image within the display area 6 against the vibration of the vehicle 2 (S25).
Thereafter, the distortion correction unit 26 performs distortion correction on the adjusted and corrected image (S26), and the display element driving unit 27 drives and controls the display element 33 to generate the image to be projected (S27).
Then, the ECU 21 or the display distance adjustment unit 28 calculates and determines the display distance of the virtual image (S28), and the display distance adjustment unit 28 drives the display distance adjustment mechanism 40 to control the display distance of the image projected from the video display device 30 (S29).
If the power is turned off, for example when the vehicle 2 stops, while the series of normal operations described above is being executed, a HUD-OFF signal is output to the AR-HUD 1, and the control unit 20 determines whether or not this signal has been received (S30).
If the HUD-OFF signal has not been received, the process returns to step S21, and the series of normal operations is repeated until the HUD-OFF signal is received. If it is determined that the HUD-OFF signal has been received, the series of normal operations ends.
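The normal operation of FIG. 7 (S21 to S30) is essentially a loop that repeats until a HUD-OFF signal arrives. The following rough sketch again uses a hypothetical hud object standing in for the units described above.

    def normal_operation(hud):
        """Rough sketch of the normal operation loop S21-S30 of FIG. 7."""
        while True:
            info = hud.vehicle_info_unit.acquire()            # S21
            hud.adjust_brightness(info.outside_light)         # S22 (FIG. 8)
            image = hud.ecu.update_display_image(info)        # S23 (FIG. 9)
            hud.mirror_adjuster.adjust(info)                  # S24: move the display area 6
            image = hud.correct_for_vibration(image, info)    # S25
            image = hud.distortion_corrector.correct(image)   # S26
            hud.display_element_driver.project(image)         # S27
            distance = hud.decide_display_distance(info)      # S28
            hud.display_distance_mechanism.set(distance)      # S29
            if hud.received_hud_off_signal():                 # S30
                break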
Next, the processing for adjusting the display contents and display method of the virtual image itself will be described.
In addition to the adjustment of the virtual image display area 6 and of the display distance described above, the AR-HUD 1 adjusts the display contents and display method of the virtual image itself according to the situation, such as the scenery in front of the vehicle 2. This makes it possible to superimpose a more appropriate virtual image on the scenery ahead at a more appropriate position and in a more appropriate manner.
FIG. 9 is a flowchart showing an outline of an example of the flow of processing for adjusting the display contents and display method of a virtual image in the AR-HUD of FIG. 2.
The display contents of the virtual image in the AR-HUD 1 are generated and displayed by the process of step S05 in the initial operation of FIG. 6 and the process of step S23 in the normal operation of FIG. 7. Here, the processing is described using as an example the display video determination/change processing of step S23 in the normal operation of FIG. 7.
First, the ECU 21 performs a standard content generation process (S231) and an event content generation process (S232). Standard content basically refers to content, such as the vehicle speed display, that is always displayed in the display area 6 while the vehicle 2 is traveling.
Event content refers to content, such as an alert display, that is displayed as necessary based on the traveling situation, including the situation of the scenery in front of the vehicle 2. In either case, a plurality of contents may be generated.
After that, the ECU 21 performs a display position adjustment process (S233) and a display color adjustment process (S234), in which the display position, display color, and the like of each generated content are adjusted in relation to the scenery ahead, which is grasped from the camera image information.
Then, display video data for each adjusted content is generated (S235), and the process ends. The video data generated here is projected by the display element driving unit 27 in the subsequent process of step S27 in FIG. 7.
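Steps S231 to S235 amount to a short pipeline: generate the standard and event contents, then adjust their positions and colors against the forward scenery before rendering. The sketch below uses hypothetical method names on the ECU object and is not an interface defined in this specification.

    def build_display_video(ecu, vehicle_info):
        """Sketch of the display video determination/change processing of FIG. 9."""
        contents = []
        contents += ecu.generate_standard_contents(vehicle_info)  # S231: e.g. speed display
        contents += ecu.generate_event_contents(vehicle_info)     # S232: e.g. alert displays
        scene = ecu.analyze_forward_scene(vehicle_info.camera_outside)
        contents = ecu.adjust_positions(contents, scene)          # S233 (FIG. 10)
        contents = ecu.adjust_colors(contents, scene)             # S234
        return ecu.render(contents)                               # S235: video data for S27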
 FIG. 10 is a flowchart showing an outline of an example of the flow of the display position adjustment process, which is the process of step S233 in the display video determination/change process of FIG. 9.
 First, the camera video information of the exterior of the vehicle included in the vehicle information 4 acquired in the process of step S21 in FIG. 7 is analyzed, and it is determined whether the forward scenery contains any object that should not be hidden by the display (S3100). For example, in addition to curve mirrors and road signs, traffic lights, pedestrians, two-wheeled vehicles, preceding vehicles, and the like may be applicable.
 Thereafter, the current display position of each standard content item and each event content item generated in the processes of steps S231 and S232 of FIG. 9 is checked against the coordinates of each object (S3200). It is then determined whether the display position of each content item needs to be adjusted, that is, whether the display of the content obstructs the object (S3300). For example, the display position of an arrow graphic or an alert display is compared with the position of a curve mirror or road sign by image processing or the like to determine whether the curve mirror or road sign would be hidden.
 If it is determined in step S3300 that adjustment of the display position is unnecessary, the process ends. If it is determined that the display position needs to be adjusted, the display position of the content in question is adjusted and moved, and a new display position is set (S3400).
 As described above, the method for determining the position to which the content is moved is not particularly limited. Instead of, or in addition to, adjusting the position of the content, the display size of the content may be reduced so that the object is not hidden (S3500).
 After these adjustments have been made, the process returns to step S3200 and the above series of steps is repeated until the display position of each content item no longer needs to be adjusted. Since another object may be present at the destination to which the content has been moved, adjustment must be continued until an appropriate display position is obtained.
 Note that if this adjustment takes too long, the display can no longer be provided at a timing suitable for the driver. Therefore, for example, it is preferable to recognize the positions of curve mirrors and road signs at an early stage, before the vehicle reaches the intersection, and to adjust the display positions accordingly.
 Through the series of processes described above, the virtual image content can be displayed while avoiding the objects, and the visibility of the objects can be improved.
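 The loop of steps S3100 to S3500 could, purely as an illustrative sketch under assumed data structures, look like the following. The rectangle representation, the overlap test, the downward-shift strategy, and the iteration limit are all assumptions introduced here for illustration.

    from typing import List, Tuple

    Rect = Tuple[int, int, int, int]  # (x, y, width, height) in display coordinates

    def overlaps(a: Rect, b: Rect) -> bool:
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

    def adjust_positions(content_rects: List[Rect], object_rects: List[Rect],
                         max_iterations: int = 10) -> List[Rect]:
        # S3100 is assumed to have been done by the caller: object_rects holds the
        # regions of curve mirrors, road signs, traffic lights, pedestrians,
        # two-wheeled vehicles, preceding vehicles, etc. found in the camera video.
        adjusted = list(content_rects)
        for _ in range(max_iterations):           # guard so that adjustment does not take too long
            conflict = False
            for i, content in enumerate(adjusted):
                for obj in object_rects:          # S3200: compare display positions and object coordinates
                    if overlaps(content, obj):    # S3300: the content would hide the object
                        x, y, w, h = content
                        moved = (x, y + h, w, h)  # S3400: move the content (here, simply shift it downward)
                        if any(overlaps(moved, o) for o in object_rects):
                            # S3500: alternatively (or additionally) shrink the content
                            moved = (x, y, int(w * 0.7), int(h * 0.7))
                        adjusted[i] = moved
                        conflict = True
            if not conflict:                      # no content hides an object any more
                return adjusted
        return adjusted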
 <Display example of the AR-HUD>
 Next, specific display examples produced by the AR-HUD 1 will be described.
 The display examples of the AR-HUD 1 described below are realized by the AR-HUD 1 executing the processes shown in FIGS. 7 to 10 described above.
 FIG. 11 is an explanatory diagram showing an example of the display area produced by the AR-HUD 1 of FIG. 2.
 FIG. 11 schematically shows an example of the display area 6 as visually recognized by the driver 5 of the vehicle 2 from the driver's seat through the windshield 3.
 FIG. 11(a) shows an example in which the display area 6 is not enlarged in the horizontal direction, and FIG. 11(b) shows an example in which the area is enlarged in the direction horizontal to the road surface (hereinafter, the horizontal direction).
 In this case, as shown by the solid line in FIG. 11(b), the display area 6 is enlarged, for example, in the horizontal direction with respect to the road surface, so that virtual images can be superimposed on a wider range of the scenery outside the vehicle. As a result, when the display area 6 is enlarged in the horizontal direction, virtual images can be superimposed and displayed on, for example, a vehicle traveling in the oncoming lane or a pedestrian on the sidewalk.
 On the other hand, as shown in FIG. 11(a), when the display area 6 is narrow in the horizontal direction, virtual images cannot be superimposed and displayed on a vehicle traveling in the oncoming lane or a pedestrian on the sidewalk.
 FIG. 12 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area of FIG. 11 is enlarged.
 FIG. 12 schematically shows an example of the forward scenery as visually recognized by the driver 5 of the vehicle 2 from the driver's seat through the windshield 3, together with the state of the virtual images in the display area 6 projected onto the windshield 3. The dotted line within the display area 6 indicates the display area in the case where the display area 6 is not enlarged in the horizontal direction.
 FIG. 12 shows a state in which an arrow 300, a vehicle display 301, a pedestrian display 302, and the like are displayed superimposed on the scenery outside the vehicle in the display area 6. In addition, an instrument icon showing the vehicle speed and the like (the characters '30 km/h' in the figure; hereinafter sometimes referred to as the 'vehicle speed display') and the distance remaining until the vehicle 2 turns left according to the guidance of the arrow 300 (the characters '10 m' in the figure) are displayed as virtual images.
 The arrow 300 is an arrow that indicates and navigates the traveling direction of the vehicle 2 and is displayed superimposed on the road on which the vehicle 2 is traveling. The vehicle display 301 is a display indicating the presence of a traveling vehicle. In the example of FIG. 12, the vehicle display 301 is circular and is displayed superimposed so as to surround the vehicle 200 traveling in the oncoming lane. The shape of the vehicle display 301 is not particularly limited and may be any shape other than a circle, such as a triangle or a quadrangle.
 The pedestrian display 302 indicates the presence of a pedestrian 201 walking on the sidewalk or the like and the traveling direction of the pedestrian 201. In the example of FIG. 12, a circular pedestrian display 302 is displayed superimposed at the feet of the pedestrian 201, and the traveling direction of the pedestrian 201 is indicated by an arrow. As with the vehicle display 301, the shape of the pedestrian display 302 is not particularly limited.
 The arrow 300 is generated by the ECU 21 based on the navigation information acquired from the vehicle information acquisition unit 10. The vehicle display 301 and the pedestrian display 302 are generated based on the camera video information, infrared information, and the like acquired by the ECU 21 from the vehicle information acquisition unit 10. The vehicle speed display is generated based on the speed information acquired by the ECU 21 from the vehicle information acquisition unit 10.
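 For illustration only, the correspondence just described between the items of vehicle information and the generated virtual images could be written down as a small table in code; the key and field names below are hypothetical and simply mirror the text.

    # Hypothetical mapping from acquired vehicle information to the displays generated from it.
    CONTENT_SOURCES = {
        "arrow_300":              ["navigation_info"],
        "vehicle_display_301":    ["camera_video_info", "infrared_info"],
        "pedestrian_display_302": ["camera_video_info", "infrared_info"],
        "vehicle_speed_display":  ["speed_info"],
    }

    def required_sources(content_name: str) -> list:
        # Which items of the vehicle information 4 are needed before the ECU 21
        # can generate the given display, per the description above.
        return CONTENT_SOURCES.get(content_name, [])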
 In this way, by enlarging the display area 6 in the horizontal direction, the pedestrian display 302 indicating the pedestrian 201, the vehicle display 301 in the oncoming lane, and the like can be displayed and brought to the attention of the driver 5 even when, for example, the vehicle 2 is traveling straight ahead, which contributes to safe driving.
 On the other hand, when the display area 6 is not enlarged in the horizontal direction, as indicated by the dotted line in FIG. 12, the display area is narrow, and the pedestrian 201 and the like are not included in the display area while the vehicle 2 is traveling straight ahead. Consequently, the pedestrian display 302 would appear only when the vehicle 2 turns left and the pedestrian 201 enters the display area indicated by the dotted line, so the timing at which the driver 5 is made aware of the pedestrian 201 would be delayed.
 FIG. 13 is an explanatory diagram showing an example of a case where there are multiple roads onto which the vehicle can turn left when it turns left according to the navigation guidance. FIG. 14 is an explanatory diagram showing a display example of a guidance screen based on navigation information when the display area in the road configuration of FIG. 13 is enlarged.
 As indicated by the solid line in FIG. 13, since the virtual image display area 6 is enlarged in the horizontal direction, the guidance display that indicates and navigates the traveling direction of the vehicle 2 can be shown over a wider range. FIG. 13 shows a case where there are two roads within a short distance onto which a left turn is possible, and the vehicle 2 turns left not onto the nearer road but onto the farther one.
 When the display area 6 is not enlarged in the horizontal direction, as indicated by the dotted line, the road onto which the vehicle turns left lies outside the display area and is not included in it. In contrast, in the case of the display area 6 enlarged in the horizontal direction, as indicated by the solid line, not only the road onto which the vehicle 2 turns left but also the nearer road onto which a left turn is possible is included in the display area.
 Therefore, as shown in FIG. 14, the arrow 300, which is a guidance display indicating and navigating the traveling direction, can be displayed superimposed on the farther road onto which the vehicle turns left. In addition, the display area 6 of FIG. 14 also shows, as virtual images, the vehicle speed display (the characters '30 km/h' in the figure), the distance to the road onto which the vehicle turns left (the characters '20 m' in the figure), and the vehicle display 301.
 Although a left turn has been described here as an example, the arrow 300 and the like can likewise be displayed superimposed on a road onto which the vehicle turns right. In the example of FIG. 14, the road on the right-turn side is not included in the display area 6; however, when, for example, the road on which the vehicle 2 is traveling is narrow, the road on the right-turn side is also included in the display area 6. Alternatively, the display area 6 may be further enlarged in the horizontal direction so that the road on the right-turn side is included in the display area 6.
 When the display area 6 is not enlarged in the horizontal direction, the range indicated by the dotted line in FIG. 14 becomes the display area, and the arrow is therefore displayed within that dotted-line area. In this case, because the road onto which the vehicle should turn left is not included in the display area, it is not clear whether the nearer or the farther road is the one onto which to turn left, and the driver 5 may become confused.
 In contrast, with the AR-HUD 1 in which the display area 6 is enlarged in the horizontal direction, the driver 5 can accurately grasp that the left turn is to be made onto the farther road, not the nearer one. This is particularly effective when there are multiple roads onto which a left turn is possible within a short distance, or when the roads form an intricate, complicated layout.
 Thus, the driver 5 can accurately grasp the road on which the vehicle 2 is to travel and can concentrate on driving without being confused, which contributes to safe driving.
 FIG. 15 is an explanatory diagram showing another display example of a guidance screen based on navigation information using the enlarged display area 6 of FIG. 11.
 FIG. 15 also schematically shows an example of the forward scenery as viewed by the driver 5 of the vehicle 2 from the driver's seat, together with the state of the virtual images in the display area 6 projected onto the windshield.
 FIG. 15 shows a display example when the vehicle 2 approaches an intersection where entry is prohibited except in the designated direction.
 In FIG. 15, a sign 202 is provided on the front left side of the vehicle 2. The sign 202 is a road sign indicating that entry is prohibited except in the designated direction. Therefore, at the intersection, the vehicle 2 can only proceed straight ahead and cannot enter either the road on the left-turn side or the road on the right-turn side.
 In this case, in the display area 6, an entry prohibition display 303 indicating that the road cannot be entered is displayed superimposed on the road on the left side of the intersection, that is, on the road on the left-turn side. The entry prohibition display 303, which is a restriction instruction display, is generated by the ECU 21, which recognizes the meaning of the sign 202 and the like from the camera video information of the exterior of the vehicle acquired from the vehicle information acquisition unit 10 and generates the display based on the recognition result.
 In this case as well, because the display area 6 is enlarged in the horizontal direction, the road on the left-turn side of the intersection is included in the display area 6, and the entry prohibition display 303 can be displayed.
 As a result, the driver 5 can accurately grasp, in a short time, which roads are closed to entry, which contributes to safe driving. Although an example has been shown here in which the sign 202 is a road sign indicating that entry is prohibited except in the designated direction, for other road signs, such as regulatory signs and instruction signs, a restriction instruction display corresponding to the recognized road sign is likewise shown in the display area 6.
 For example, when the road on the left-turn side of the intersection is one-way and there is a one-way road sign as a regulatory sign, a display indicating that the road is one-way, or an entry prohibition display 303 similar to that of FIG. 15, may be displayed on that road as the restriction instruction display. A display such as the entry prohibition display 303 shown in FIG. 15 is particularly effective when the driver 5 has overlooked a road sign. An icon imitating the road sign (for example, 'no entry except in the designated direction') could also be displayed on the HUD, but displaying the entry prohibition display 303 as shown in FIG. 15 allows the driver 5 to recognize more accurately and more quickly that the road must not be entered. That is, because the entry prohibition display 303 is superimposed on the actual road and appears as if a pseudo wall were present, it can intuitively convey that the road must not be entered.
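 A minimal sketch of how such a restriction instruction display could be derived from a recognized sign is shown below; the recognized sign types and the display identifiers are placeholders, assuming that a camera-based sign recognizer in the ECU 21 supplies the sign type.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class RestrictionDisplay:
        target_road: str   # e.g. "left_turn_road"
        kind: str          # e.g. "entry_prohibition_303" or "one_way_indication"

    def restriction_from_sign(sign_type: str, target_road: str) -> Optional[RestrictionDisplay]:
        # sign_type is assumed to come from recognizing the sign 202 in the camera video information.
        if sign_type == "no_entry_except_designated_direction":
            return RestrictionDisplay(target_road, "entry_prohibition_303")
        if sign_type == "one_way":
            # Either a one-way indication or the same pseudo-wall style display may be used.
            return RestrictionDisplay(target_road, "entry_prohibition_303")
        return None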
 FIG. 16 is an explanatory diagram showing another display example using the enlarged display area 6 of FIG. 11.
 FIG. 16 shows a state in which a vehicle 200 is traveling in the lane opposite the lane in which the vehicle 2 is traveling, and a vehicle 203 is traveling ahead of the vehicle 2 in the same lane. In the display area 6, a vehicle display 301 and a vehicle display 305 are displayed.
 In addition, the traveling speed of the oncoming vehicle (the characters '30 km/h' in the figure) and the inter-vehicle distance to the vehicle 203 traveling ahead of the vehicle 2 (the characters '10 m' in the figure) are displayed as virtual images. The inter-vehicle distance is generated based on the distance information acquired by the ECU 21 from the vehicle information acquisition unit 10.
 The vehicle display 301 is a display indicating the presence of the vehicle 200 traveling in the oncoming lane, and the vehicle display 305 is a display indicating the presence of the vehicle 203 traveling ahead of the vehicle 2. Here as well, the vehicle display 301 is, for example, circular and is displayed superimposed so as to surround the vehicle 200. Similarly, the vehicle display 305 is also, for example, circular and is displayed superimposed so as to surround the vehicle 203.
 In addition, the vehicle displays 301 and 305 are, for example, color-coded so that it can be distinguished whether the vehicle concerned is traveling in the same lane or in the oncoming lane, that is, whether it is an oncoming vehicle. In FIG. 16, the color coding is represented by the presence or absence of hatching.
 Besides color coding, the vehicle displays 301 and 305 may use any display that makes it possible to distinguish a preceding vehicle from a vehicle in the oncoming lane, for example by changing the shape.
 With the AR-HUD 1 in which the display area 6 is enlarged in the horizontal direction, virtual images such as the arrow 300, the pedestrian display 302, and the vehicle display 305 can be displayed even over the oncoming lane, the sidewalk, and intersecting roads. This makes it easier to accurately grasp the route and to recognize the vehicles 200 and 203 traveling ahead or in the oncoming lane and the pedestrian 201, and thus contributes to safe driving.
 As described above, with the AR-HUD 1 in which the display area 6 is enlarged in the horizontal direction, the information necessary for safe driving can be displayed accurately without hindering driving, which contributes to safe driving.
 (Embodiment 2)
 <Overview>
 In the first embodiment, the display technique used when the display area 6 of the AR-HUD 1 is enlarged in the horizontal direction was described. In the second embodiment, a display technique will be described for the case where the display area 6 is enlarged not only in the horizontal direction but also in the direction vertical to the road surface (hereinafter, the vertical direction).
 The device configuration and basic operation of the AR-HUD 1 are the same as those of the first embodiment described above, and their description is therefore omitted.
 Because the display area 6 of the AR-HUD 1 is widened not only in the horizontal direction but also in the vertical direction, the amount of information that can be displayed in the top-to-bottom direction increases. Therefore, traffic auxiliary information, which is second information, is displayed in the upper part of the display area 6, and priority information, which is first information, is displayed in the lower part of the display area 6.
 The traffic auxiliary information is information that assists driving operations, such as congestion information, road information, or intersection information. The congestion information is information indicating the congestion state of the road. The road information is information indicating, for example, a lane change. The intersection information is information such as the name of an intersection.
 The priority information is information related to driving operations that takes priority over the traffic auxiliary information, for example the state of other vehicles and pedestrians or the road alignment, and is displayed as a virtual image superimposed on the actual scene.
 <Display example of the AR-HUD>
 FIG. 17 is an explanatory diagram showing an example of the display area 6 of the AR-HUD 1 according to the second embodiment when it is enlarged in the horizontal and vertical directions.
 In FIG. 17(a), when the display area 6 is enlarged in both the horizontal and vertical directions, new room for displaying information is created at the top and the bottom in addition to the display area 6 enlarged in the horizontal direction as in the first embodiment. The hatched areas in FIG. 17(a) are the new display areas created by further enlarging, in the vertical direction, the display area 6 that has already been enlarged in the horizontal direction.
 The region near the upper end of the display area 6 enlarged in the vertical as well as the horizontal direction corresponds, from the viewpoint of the driver 5, to scenery such as the sky and buildings that has little relation to driving operations.
 On the other hand, the region below the upper part of the display area 6 (the lower part of the display area 6) is the region in which the driver 5 obtains information on the actual scene directly connected to driving operations, such as the road, traffic conditions, and road signs.
 Here, the upper part of the display area 6 becomes a second display area 6a, which is the hatched area in FIG. 17(b), and the lower part of the display area 6 becomes a first display area 6b, which is the dotted area in FIG. 17(b).
 Accordingly, traffic auxiliary information consisting of auxiliary information that assists driving operations, information on the operating status of airplanes, railways, and the like, and information notifying the driver of incoming telephone calls, e-mails, and the like is displayed in the second display area 6a, while the priority information, which is important information directly connected to driving operations, is displayed in the first display area 6b.
 Here, the second display area 6a is an area in which virtual images that are mainly not AR (Augmented Reality) are displayed, and the first display area 6b is an area in which mainly AR display is performed. In this embodiment, as an example, the traffic auxiliary information displayed in the second display area 6a is a non-AR virtual image, and the priority information displayed in the first display area 6b is an AR virtual image display.
 In addition, in the region near the lower end of the first display area 6b, not only the priority information but also traffic auxiliary information such as the traveling speed of the vehicle 2 may be displayed. This region becomes a third display area 6c, which is the area surrounded by the dotted line in FIG. 17(b). In the third display area 6c, virtual images that are not AR are mainly displayed.
 The second display area 6a, in which the traffic auxiliary information and the like are displayed, only needs to be an area that does not block the information of the actual scene directly connected to the driving operations of the driver 5, and is not limited to the region near the upper end of the display area 6.
 For example, the second display area 6a and/or the third display area 6c may be enlarged and the first display area 6b reduced accordingly, or the second display area 6a and/or the third display area 6c may be eliminated so that the entire area becomes the first display area 6b.
 When the areas are enlarged or reduced, it is preferable to do so based on, for example, information obtained by imaging the situation ahead with a camera or the like. That is, when the presence of a vehicle, a pedestrian, or the like ahead is detected, an AR virtual image needs to be displayed for that vehicle or pedestrian, so control is performed such that the second display area 6a and/or the third display area 6c are not enlarged and a wide first display area 6b is secured.
 Conversely, when the presence of a vehicle, a pedestrian, or the like ahead is not detected, there is no need to display an AR virtual image, so control may be performed such that the second display area 6a and/or the third display area 6c are enlarged and the first display area 6b is reduced.
 Furthermore, the areas may be enlarged or reduced based on the speed information of the vehicle. For example, control may be performed such that, when the vehicle is traveling at low speed or is stopped, the second display area 6a and/or the third display area 6c are enlarged and the first display area 6b is reduced, while, when the vehicle is traveling at high speed, the second display area 6a and/or the third display area 6c are eliminated or reduced so that the first display area 6b is secured as widely as possible.
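 The rules for enlarging and reducing the display areas 6a, 6b, and 6c described above could be summarized, purely as an assumed sketch, as follows; the speed thresholds and the ratio values are invented for illustration and are not part of the embodiment.

    from dataclasses import dataclass

    @dataclass
    class AreaLayout:
        upper_ratio: float   # share of the display height given to the second display area 6a
        lower_ratio: float   # share given to the third display area 6c
        # the remainder is the first display area 6b used for AR display

    def decide_layout(object_ahead: bool, speed_kmh: float) -> AreaLayout:
        # If a vehicle or pedestrian is detected ahead, keep 6a/6c small so that
        # the AR area 6b stays as wide as possible.
        if object_ahead:
            return AreaLayout(upper_ratio=0.10, lower_ratio=0.05)
        # When stopped or at low speed, 6a/6c may be enlarged at the expense of 6b.
        if speed_kmh < 20:                      # assumed low-speed threshold
            return AreaLayout(upper_ratio=0.30, lower_ratio=0.15)
        # At high speed, eliminate or shrink 6a/6c to secure 6b as widely as possible.
        if speed_kmh > 80:                      # assumed high-speed threshold
            return AreaLayout(upper_ratio=0.0, lower_ratio=0.0)
        return AreaLayout(upper_ratio=0.15, lower_ratio=0.10)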
 FIG. 18 is an explanatory diagram showing an example of display in the display area enlarged in the horizontal and vertical directions of FIG. 17. FIG. 19 is an explanatory diagram showing an example of display in the display area following FIG. 18.
 FIG. 18(a) shows a state in which the vehicle 2 is traveling without any traffic congestion, and FIG. 18(b) shows a display example of the display area 6 while the vehicle is traveling as in FIG. 18(a).
 FIG. 19(a) shows that traffic congestion has occurred ahead of the traveling vehicle 2, and FIG. 19(b) shows a display example of the display area 6 notifying the driver that congestion has occurred; congestion information on the road is displayed as the congestion arises.
 First, in the traveling state shown in FIG. 18(a), sign information 310 indicating the vehicle traffic classification is displayed as a virtual image in the second display area 6a of the display area 6 as traffic auxiliary information, as shown in FIG. 18(b).
 Subsequently, as shown in FIG. 19(a), when it is detected that congestion has occurred ahead of the vehicle 2, the sign information 310 and congestion information 311 are displayed as traffic auxiliary information in the second display area 6a, as shown in FIG. 19(b).
 The sign information 310 is generated based on the camera video information and the like acquired by the ECU 21 from the vehicle information acquisition unit 10. The congestion information 311 is generated based on GPS information, VICS information, and the like.
 Here, traffic auxiliary information such as the sign information 310 and the congestion information 311 is merely information that assists driving and does not take priority over the actual scene directly connected to driving operations.
 For example, if the sign information 310, the congestion information 311, and the like were displayed in the area where the actual scene directly connected to the driving operations of the driver 5 is visible, that is, in the first display area, the line of sight of the driver 5 would be drawn to that information, and driving could be neglected. Furthermore, since such information does not take priority over the actual scene directly connected to driving operations, it is not desirable to display it in the area where the actual scene is visible and thereby reduce the visibility of the actual scene.
 With the display shown in FIG. 19(b), on the other hand, the driver's view during driving, which is directly connected to driving operations, is not blocked, and the information can be read with relatively little viewpoint movement from the portion of the actual scene on which the viewpoint is most concentrated during driving. Therefore, traffic auxiliary information such as congestion information can be displayed while reducing distraction of the driver's attention and without diverting the viewpoint for a long time from the actual scene that should be watched while driving. Moreover, while the vehicle 2 is traveling, displaying only simple congestion information, as shown in FIG. 19(b), can reduce any decrease in the driver's attentiveness.
 FIG. 20 is an explanatory diagram showing an example of display in the second display area, and FIG. 21 is an explanatory diagram showing an example of display following FIG. 20.
 FIGS. 20 and 21 show display examples of a guidance screen based on navigation information. Here, a case is shown in which the vehicle 2 is traveling straight ahead and turns left at the next intersection according to the navigation guidance.
 First, as shown in FIG. 20(a), the vehicle 2 is traveling straight ahead, and when the distance to the intersection at which it will turn left is greater than a threshold value, only the sign information 310 indicating the vehicle traffic classification is displayed as a virtual image in the second display area 6a as traffic auxiliary information, as shown in FIG. 20(b).
 Here, the threshold value is the distance used to determine the timing at which the left-turn guidance screen is presented. When the distance to the intersection at which the vehicle will turn left reaches the threshold value, the left-turn guidance screen is displayed.
 Thereafter, as shown in FIG. 21(a), when the distance from the vehicle 2 to the intersection at which it will turn left becomes equal to or less than the threshold value, guidance information 312 indicating a left turn at the next intersection is displayed as priority information in the first display area 6b, as an AR image superimposed on the road being traveled, as shown in FIG. 21(b). The threshold determination process and the process of generating the guidance information 312 are performed by the ECU 21 based on GPS information and the like acquired from the vehicle information acquisition unit 10.
 FIGS. 20 and 21 show an example in which the guidance screen is switched in two stages using a single threshold value. Alternatively, for example, when the distance to the intersection at which the vehicle will turn left is greater than a threshold value 1, no traffic auxiliary information may be displayed; when it is equal to or less than threshold value 1 and greater than a threshold value 2, the traffic auxiliary information may be displayed; and when it is equal to or less than threshold value 2, the guidance information 312 indicating a left turn at the intersection may be displayed as an AR image superimposed on the road.
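 The two-threshold variant described in the preceding paragraph could be sketched as follows; the concrete distances are placeholders and the function name is hypothetical.

    def guidance_stage(distance_to_turn_m: float,
                       threshold1_m: float = 500.0,
                       threshold2_m: float = 200.0) -> str:
        # distance > threshold 1: show nothing yet
        if distance_to_turn_m > threshold1_m:
            return "none"
        # threshold 2 < distance <= threshold 1: show traffic auxiliary information
        # in the second display area 6a
        if distance_to_turn_m > threshold2_m:
            return "traffic_auxiliary_info"
        # distance <= threshold 2: show the guidance information 312 as an AR image
        # superimposed on the road in the first display area 6b
        return "ar_guidance_312"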
 FIG. 22 is a diagram showing another display example of FIG. 19.
 FIG. 19 showed an example in which, while the vehicle 2 is traveling, only simple congestion information is displayed in the second display area 6a, thereby notifying the driver 5 of the congestion state in a simplified manner without blocking the actual scene directly connected to driving operations. A display example for the case where the vehicle 2 subsequently stops will now be described with reference to FIG. 22.
 When the vehicle 2 stops after receiving the congestion information but before reaching the point where the congestion begins, detailed information 313 consisting of, for example, detailed congestion information and a detour proposal is displayed as traffic auxiliary information in the second display area 6a, as shown in FIG. 22. When the detailed information 313 contains too much information to fit in the second display area 6a alone, it may be displayed so as to extend into the first display area 6b.
 While the vehicle 2 is stopped, there is no problem even if the actual scene directly connected to driving operations is somewhat blocked from the view of the driver 5, and the visibility of the detailed information 313 can thereby be improved. As a result, the driver 5 can safely check the detailed display.
 Next, an example of coordinated display between the first and second display areas of the display area 6 will be described.
 FIG. 23 is an explanatory diagram showing an example of a coordinated display operation in the display area of the AR-HUD of FIG. 17.
 FIG. 23 shows a display example, while the vehicle 2 is traveling, from the state in which the sign information 310 indicating the vehicle traffic classification is displayed in the second display area 6a as traffic auxiliary information until the guidance information 312, which is priority information, is displayed as an AR image in the first display area 6b. The guidance information 312 in FIG. 23 is a lane-change guidance display prompting a move from the lane in which the vehicle 2 is traveling to the leftmost lane.
 First, as shown in 'phase1' on the left side of FIG. 23, while the vehicle 2 is traveling straight ahead, only the sign information 310 indicating the vehicle traffic classification is displayed in the second display area 6a.
 Thereafter, as shown in 'phase2' in the center of FIG. 23, when the vehicle 2 comes within 250 m of the intersection at which it will turn left, the display of the sign information 310 in the second display area 6a gradually fades, while the guidance information 312 prompting a lane change to move the vehicle 2 into the left lane is displayed in the first display area 6b as an AR image superimposed on the road, its display gradually becoming more distinct.
 Here, an example is shown in which the display of 'phase2' is executed when the distance to the intersection at which the vehicle will turn left is, for example, about 250 m, but the distance is not limited to this.
 Subsequently, as shown in 'phase3' on the right side of FIG. 23, when the vehicle 2 reaches a distance of 200 m from the intersection at which it will turn left, the display of the sign information 310 in the second display area 6a is erased, and the guidance information 312 prompting a lane change to the left lane is fully displayed in the first display area 6b. Here as well, the 'phase3' display uses a distance to the intersection of, for example, about 200 m, but the distance is not limited to this.
 After the vehicle 2 has changed lanes, the display of the guidance information 312 is gradually faded while the display of the sign information 310 is gradually made more distinct; the display of the guidance information 312 is then erased and the sign information 310 is fully displayed. That is, the display proceeds in the order 'phase3', 'phase2', 'phase1', the reverse of FIG. 23. In this case as well, the timing of each transition from 'phase3' to 'phase2' and from 'phase2' to 'phase1' may be changed according to distance, as in FIG. 23, or the display may be changed based on a time condition.
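 The gradual cross-fade between the sign information 310 in the second display area 6a and the guidance information 312 in the first display area 6b could be driven by a simple opacity schedule such as the sketch below; the 250 m and 200 m figures follow the example above, and everything else is an assumption.

    def crossfade_opacities(distance_to_turn_m: float,
                            fade_start_m: float = 250.0,
                            fade_end_m: float = 200.0) -> tuple:
        """Return (opacity of sign information 310, opacity of guidance information 312)."""
        if distance_to_turn_m >= fade_start_m:      # phase1: only 310 is shown
            return 1.0, 0.0
        if distance_to_turn_m <= fade_end_m:        # phase3: only 312 is shown
            return 0.0, 1.0
        # phase2: 310 gradually fades out while 312 gradually fades in
        progress = (fade_start_m - distance_to_turn_m) / (fade_start_m - fade_end_m)
        return 1.0 - progress, progress

    # After the lane change, the same schedule can be run in reverse (phase3 -> phase1),
    # keyed either on distance or on elapsed time.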
 This prevents the guidance information 312 from appearing abruptly in the first display area 6b. If the guidance information 312 suddenly entered the driver's field of view during driving, which is directly connected to driving operations, the driver 5 might be startled or driving might become distracted, possibly inducing an accident; changing the display in stages in this way can prevent this.
 FIG. 24 is an explanatory diagram showing an example of the case where the viewpoint position of the virtual image display in the second display area is near. FIG. 25 is an explanatory diagram showing an example of a display that reduces the increase in viewpoint movement in the second display area of the AR-HUD of FIG. 17.
 As described above, the second display area 6a corresponds, from the viewpoint of the driver 5, to scenery such as the sky and buildings that has little relation to driving operations, and much of the traffic auxiliary information and the like displayed in the second display area 6a is assumed not to need to be displayed superimposed on the actual scene.
 However, if the traffic auxiliary information displayed in the second display area 6a were displayed at a near viewpoint position, the viewpoint movement of the driver 5 would become large, and as a result driving could be hindered.
 For example, as shown in the upper part of FIG. 24, if the viewpoint position DD of the driver 5 is about 30 m and the viewpoint position VD of the sign information 310, which is the virtual image traffic auxiliary information displayed in the second display area 6a, is about 2 m, then reading the traffic auxiliary information at the viewpoint position 2 m ahead from the viewpoint position 30 m ahead requires a large back-and-forth viewpoint movement from 30 m in the distance to 2 m nearby, as shown in the lower part of FIG. 24. Making large back-and-forth viewpoint movements while driving promotes eye strain and can induce accidents by causing the driver to overlook information that must be seen while driving.
 On the other hand, as shown in the upper part of FIG. 25, when the viewpoint position DD of the driver 5 is about 30 m and the viewpoint position VD of the sign information 310, which is the virtual image traffic auxiliary information displayed in the second display area 6a, is likewise about 30 m, the viewpoint position DD and the viewpoint position VD are substantially the same, as shown in the lower part of FIG. 25, so the driver 5 can recognize the traffic auxiliary information without moving the viewpoint back and forth.
 In this way, because the back-and-forth viewpoint movement required to recognize the traffic auxiliary information can be greatly reduced, the information transmission speed can be improved, and the eye strain and distraction of the driver 5 can be reduced.
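 As a sketch only: the display distance of the non-AR virtual image in the second display area 6a could simply be matched to the driver's current viewpoint distance, for example as follows; the clamping range is an assumption introduced here.

    def virtual_image_distance_m(driver_viewpoint_m: float,
                                 min_distance_m: float = 10.0,
                                 max_distance_m: float = 50.0) -> float:
        # Place the traffic auxiliary information at roughly the same distance as the
        # driver's viewpoint (about 30 m in the example of FIG. 25) so that reading it
        # requires almost no back-and-forth viewpoint movement.
        return max(min_distance_m, min(driver_viewpoint_m, max_distance_m))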
 FIG. 26 is an explanatory diagram showing an example of a display that reduces the overlooking of road signs in the display area of the AR-HUD of FIG. 17.
 In this case, as shown on the left side of FIG. 26, a virtual image of a sign display 315 serving as a guide-sign indication is displayed superimposed on the sign 202, which is a road sign, at the point where the distance of the viewpoint position of the driver 5 and the distance from the vehicle 2 to the sign 202 become approximately equal.
 Here, the sign 202 is, as shown in FIG. 26, a guide sign installed in a portion corresponding to scenery such as the sky. Therefore, the virtual image of the sign display 315 is superimposed on the sign 202 when the sign 202 naturally enters the field of view of the driver 5. For example, when the viewpoint distance of the driver 5 is about 30 m, the sign display 315 is displayed when the distance from the vehicle 2 to the sign approaches about 30 m.
 The sign display 315 is a virtual image display indicating that there is a road sign. FIG. 26 shows an example in which the sign 202 is surrounded by a circle and an arrow drawing attention to the sign display 315 is displayed as a virtual image as a sign display 316. The shapes of the virtual images of the sign displays 315 and 316 are not particularly limited and may be any shape.
 Traffic signs such as the sign 202 are recognized by the ECU 21 based on the camera video information and the like acquired from the vehicle information acquisition unit 10. The distance to the sign 202 is recognized based on the infrared information and the like acquired by the ECU 21 from the vehicle information acquisition unit 10.
 If the virtual images of the sign displays 315 and 316 were instead superimposed when the vehicle 2 has come closer to the sign 202, for example when the distance between the vehicle 2 and the sign 202 is about 5 m, the driver 5 would have to change the focal position greatly, and recognition would take time. Moreover, because the distance to the sign 202 would be as short as about 5 m, the vehicle might pass the sign 202 before it is recognized. In other words, the virtual images shown in FIG. 26 are more effective when displayed while the sign is still far away, for example at about 30 m. When the distance to the sign becomes short, such as about 5 m, displaying a virtual image superimposed on the sign should rather be avoided, because the virtual image would make the sign itself difficult to see.
 In this way, by displaying the sign displays 315 and 316 at approximately the same distance as the viewpoint position of the driver 5, the driver 5 can recognize the sign 202 without moving the viewpoint back and forth, and the overlooking of the sign 202 can be reduced.
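 The distance-gated display of the sign displays 315 and 316 could be expressed as in the following sketch; the 30 m and 5 m figures follow the example in the text, and the tolerance is an assumed parameter.

    def should_highlight_sign(distance_to_sign_m: float,
                              driver_viewpoint_m: float = 30.0,
                              tolerance_m: float = 5.0,
                              suppress_below_m: float = 5.0) -> bool:
        # Do not superimpose the highlight when the sign is very close, because the
        # virtual image would then make the sign itself harder to see.
        if distance_to_sign_m <= suppress_below_m:
            return False
        # Show the highlight when the sign is roughly at the driver's viewpoint distance
        # (about 30 m in the example), so no back-and-forth viewpoint movement is needed.
        return abs(distance_to_sign_m - driver_viewpoint_m) <= tolerance_m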
 FIG. 27 is an explanatory diagram showing an example of display according to the degree of danger of the road conditions in the display area of the AR-HUD of FIG. 17.
 FIG. 27 shows examples of the forward scenery viewed from the driver's seat through the windshield 3 and the state of the virtual images in the display area 6 projected onto the windshield 3.
 The state shown on the left side of FIG. 27 is the state with the lowest degree of danger in the road conditions (hereinafter, low danger), and the state shown on the right side of FIG. 27 is the state with the highest degree of danger (hereinafter, high danger). The state shown in the central part of FIG. 27 is a state whose degree of danger is higher than low danger and lower than high danger (hereinafter, medium danger).
 First, in the low-danger state, navigation guidance is being provided; in the display area 6, guidance information 312 indicating a left turn at the next intersection is displayed as a virtual image superimposed on the road, and a pedestrian 201 is walking on the sidewalk on the right side of the oncoming lane.
 In this case, because the pedestrian 201 is on the sidewalk on the oncoming-lane side and is far from the vehicle 2, the degree of danger is set low.
 In the display area 6, as described above, the guidance information 312 indicating a left turn at the next intersection is displayed superimposed on the road of the traveling lane, and for the pedestrian 201 on the sidewalk, a pedestrian caution display 320 indicating the presence of a pedestrian is displayed superimposed on the pedestrian 201.
 The pedestrian caution display 320 has a shape such as an arrow and is displayed so that the arrow points to the pedestrian 201. The pedestrian caution display 320 is shown when a pedestrian is detected, for example when the distance from the vehicle 2 to the pedestrian 201 is equal to or greater than a preset distance or when the pedestrian 201 is on the sidewalk on the oncoming-lane side. When the vehicle 2 passes the pedestrian 201, the pedestrian caution display 320 is erased.
 The shape of the pedestrian caution display 320 is not particularly limited and may be other than an arrow.
 Next, in the medium-danger state, the pedestrian 201 has moved near the pedestrian crossing at the intersection at which the vehicle 2 will turn left. In this case, a pedestrian display 302 is displayed in the display area 6 superimposed on the pedestrian 201, as in FIG. 12.
 Here, because the distance from the vehicle 2 to the pedestrian 201 is shorter than the preset distance, or because the pedestrian 201 is near the pedestrian crossing, the degree of danger is higher, namely medium danger.
 In the low-danger state, the guidance information 312 was displayed superimposed along the road; in the medium-danger state, however, the guidance information 312 is not superimposed along the road but is displayed against the scenery above the road, that is, in the second display area 6a. In other words, the guidance information 312 is displayed higher up, while the presence of the pedestrian 201 is presented more prominently.
 As a result, the areas in which the guidance information 312 and the pedestrian display 302 are displayed are kept separate, making it easier for the driver 5 to recognize the presence of the pedestrian 201. In addition, the presence of the pedestrian 201 can be strongly brought to the attention of the driver 5 while the navigation guidance continues.
 Next, in the high-danger state, the pedestrian 201 has moved not onto the sidewalk but onto the roadway on which the vehicle 2 is traveling. In this case, because the pedestrian 201 is on the roadway on which the vehicle 2 is traveling rather than on the sidewalk, the degree of danger is the highest, namely high danger.
 In the high-danger state, the guidance information 312 that was displayed in the medium-danger state is erased from the display area 6. The pedestrian display 302 is displayed superimposed on the pedestrian 201, and warning information 321, a virtual image warning that the pedestrian 201 is nearby and that the danger is high, is displayed superimposed near the pedestrian 201. In addition, to emphasize more strongly to the driver 5 that there is a high level of danger, the navigation guidance is suspended and the guidance information 312 is erased.
 When the vehicle 2 passes the pedestrian, the pedestrian display 302 and the warning information 321 are erased, and the navigation guidance is resumed.
 In this way, by changing the display of the navigation guidance and the warning display according to the degree of danger, the driver 5 can be prompted to pay attention to the warned event (here, the pedestrian), and, when the degree of danger is not high, the navigation guidance can be continued in parallel with the warning, preventing the driver 5 from losing the route. Furthermore, because the navigation guidance moves seamlessly from the lower part of the display area (the first display area) to the upper part (the second display area), or from the upper part to the lower part, according to the degree of danger, the amount of change in the display is reduced compared with the guidance abruptly appearing and disappearing, and the driver 5 can be prevented from being startled by the change in the display and becoming distracted.
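 The danger-level-dependent behaviour described for FIG. 27 could be summarized in a sketch like the one below; the distance threshold, the level names, and the field names are assumptions, and the actual judgment in the embodiment may combine further conditions.

    from dataclasses import dataclass

    @dataclass
    class HudPlan:
        guidance_area: str        # "first_area_ar", "second_area", or "hidden"
        pedestrian_display: str   # "caution_320", "display_302", or "none"
        warning_321: bool

    def plan_for_danger(pedestrian_on_roadway: bool,
                        pedestrian_near_crossing: bool,
                        distance_to_pedestrian_m: float,
                        far_threshold_m: float = 30.0) -> HudPlan:
        if pedestrian_on_roadway:
            # High danger: suspend the navigation guidance, show display 302 and warning 321.
            return HudPlan("hidden", "display_302", True)
        if pedestrian_near_crossing or distance_to_pedestrian_m < far_threshold_m:
            # Medium danger: move the guidance up into the second display area 6a
            # and emphasize the pedestrian with display 302.
            return HudPlan("second_area", "display_302", False)
        # Low danger: keep the AR guidance on the road and point at the pedestrian with 320.
        return HudPlan("first_area_ar", "caution_320", False)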
Next, the navigation guidance display when the vehicle 2 turns at an intersection will be described.
When turning at an intersection, there are many points to which the driver 5 must pay attention, such as pedestrians and the state of oncoming vehicles. In such a case, navigation guidance is not displayed in the first display area 6b while the vehicle is within the intersection. This is because, if the driver 5 is distracted by the navigation guidance display, the points requiring attention mentioned above may be overlooked.
FIG. 28 is an explanatory diagram showing an example of the navigation guidance display at an intersection in the display area of the AR-HUD of FIG. 17.
The upper part of FIG. 28 shows the traveling states of the vehicle 2. The left side shows the vehicle 2 before it turns at the intersection, that is, traveling toward the intersection. The center shows the vehicle 2 in the middle of a left turn, that is, traveling within the intersection. The right side shows the traveling state of the vehicle 2 after it has completed the left turn, that is, after it has passed through the intersection.
The lower part of FIG. 28 shows, from left to right for each of these traveling states, the forward scenery viewed from the driver's seat through the windshield 3 and the state of the virtual images within the display area 6 projected onto the windshield 3.
First, on the left side of FIG. 28, guidance information 312 indicating a left turn at the next intersection is AR-displayed in the first display area 6b, superimposed on the road. Then, as shown in the center of FIG. 28, when the vehicle 2 enters the intersection, the guidance information 312 displayed in the first display area 6b is erased, and new guidance information 312 is displayed in the second display area 6a, which is a portion of the view that has little relevance to the driving operation. Alternatively, the guidance information 312 may be suppressed in the second display area 6a as well.
Here, whether or not the vehicle 2 has entered the intersection is determined, for example, by the ECU 21 based on the steering angle information, the navigation information, and the exterior camera video information acquired from the vehicle information acquisition unit 10.
When the vehicle 2 has completed the left turn, the guidance information 312 is again AR-displayed in the first display area 6b, as shown on the right side of FIG. 28. Here, the guidance information 312 indicates that the vehicle is to travel straight along the road onto which it has turned.
As a result, when turning at an intersection, the objects to which the driver 5 should pay attention while traveling through the intersection are not obscured by the virtual-image display, so the vehicle can be driven safely. Furthermore, because the navigation guidance moves seamlessly from the lower part of the display area (the first display area) to the upper part (the second display area), or from the upper part to the lower part, according to the traveling state at the intersection, the amount of display change is reduced compared with the guidance abruptly appearing and disappearing, and the driver 5 is prevented from being startled by the change in the display and becoming distracted.
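As a rough illustration of the intersection handling described above, the following sketch (again not part of the original disclosure) shows how the placement of the guidance might follow an intersection determination that combines steering angle, navigation, and camera inputs, as the ECU 21 is described as doing; the field names, the 15-degree threshold, and the function names are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TurnInputs:
    steering_angle_deg: float       # steering angle sensor
    nav_says_turn_here: bool        # navigation information
    camera_sees_intersection: bool  # exterior camera video information

TURN_ANGLE_DEG = 15.0  # hypothetical threshold for "the wheel is clearly turned"

def is_in_intersection(v: TurnInputs) -> bool:
    # Combine several cues rather than relying on any single signal.
    turning = abs(v.steering_angle_deg) > TURN_ANGLE_DEG
    return v.camera_sees_intersection and (turning or v.nav_says_turn_here)

def guidance_area_for(v: TurnInputs) -> Optional[str]:
    if is_in_intersection(v):
        # Inside the intersection: keep the first (AR) area clear; either move
        # the guidance to the second area or return None to suppress it entirely.
        return "second"
    # Before and after the intersection: AR guidance superimposed on the road.
    return "first"
```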
The display examples in the second display area 6a and the first display area 6b of the display area 6 have been described above; these displays may also be made customizable by the user.
FIG. 29 is an explanatory diagram showing an example of a menu for customizing the display in the display area of the AR-HUD of FIG. 17.
In the example of FIG. 29, the menu offers the display options "upper and lower two-level display", "upper-level display", and "lower-level display" for the display area 6. With "upper and lower two-level display", virtual images are displayed in both the second display area 6a and the first display area 6b shown in FIG. 17(b).
"Upper-level display" uses only the second display area 6a, that is, it displays only the traffic auxiliary information consisting of auxiliary information that assists the driving operation. "Lower-level display" uses only the first display area 6b, that is, it displays only the priority information, which is important information directly related to the driving operation.
The driver 5 can freely choose what is displayed by selecting a display method for the virtual images in the display area 6 from the menu of FIG. 29. Although FIG. 29 shows a menu that only selects which display areas of the display area 6 are used, various other menus may be provided, for example for changing the transmittance of the information displayed in the second display area 6a or changing the display size of the information. The selection menu is also not limited to the above example and may take other forms.
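The customization menu can be thought of as selecting a display mode together with a few optional appearance parameters. The sketch below is illustrative only and not part of the original disclosure; the mode names follow the three menu items above, while the transmittance and size fields are hypothetical stand-ins for the additional options mentioned.

```python
from dataclasses import dataclass
from enum import Enum

class DisplayMode(Enum):
    BOTH = "upper and lower two-level display"  # areas 6a and 6b
    UPPER_ONLY = "upper-level display"          # second display area 6a only
    LOWER_ONLY = "lower-level display"          # first display area 6b only

@dataclass
class DisplaySettings:
    mode: DisplayMode = DisplayMode.BOTH
    upper_transmittance: float = 1.0  # hypothetical: opacity of information in area 6a
    info_scale: float = 1.0           # hypothetical: relative size of displayed information

def area_enabled(settings: DisplaySettings, area: str) -> bool:
    """Return True if the given area ('first' or 'second') should be drawn."""
    if settings.mode is DisplayMode.BOTH:
        return True
    if settings.mode is DisplayMode.UPPER_ONLY:
        return area == "second"
    return area == "first"  # LOWER_ONLY
```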
As described above, the driver 5 can easily distinguish and recognize the display of the traffic auxiliary information and of the priority information, which is more important than the traffic auxiliary information. In addition, the traffic auxiliary information and the priority information can be presented to the driver 5 without hindering driving. This contributes to safe driving.
The invention made by the present inventors has been described above specifically based on the embodiments; however, the present invention is not limited to those embodiments, and it goes without saying that various modifications can be made without departing from the gist of the invention.
The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments described above have been explained in detail to make the present invention easy to understand, and the invention is not necessarily limited to configurations including all of the described elements.
Further, part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. Moreover, for part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
1 AR-HUD
2 Vehicle
3 Windshield
4 Vehicle information
5 Driver
6 Display area
10 Vehicle information acquisition unit
20 Control unit
21 ECU
22 Audio output unit
23 Non-volatile memory
24 Memory
25 Light source adjustment unit
26 Distortion correction unit
27 Display element drive unit
28 Display distance adjustment unit
29 Mirror adjustment unit
30 Video display device
31 Light source
32 Illumination optical system
33 Display element
40 Display distance adjustment mechanism
50 Mirror drive unit
51 Mirror
52 Mirror
60 Speaker
101 Vehicle speed sensor
102 Shift position sensor
103 Steering angle sensor
104 Headlight sensor
105 Illuminance sensor
106 Chromaticity sensor
107 Distance measuring sensor
108 Infrared sensor
109 Engine start sensor
110 Acceleration sensor
111 Gyro sensor
112 Temperature sensor
113 Road-to-vehicle communication wireless receiver
114 Vehicle-to-vehicle communication wireless receiver
117 GPS receiver
118 VICS receiver
281 Functional liquid crystal film ON/OFF control unit
282 Lens movable unit
283 Dimming mirror ON/OFF control unit
284 Diffusion plate movable unit
285 Optical filter movable unit
401 Functional liquid crystal film
402 Lens movable mechanism
403 Dimming mirror
404 Diffusion plate movable mechanism
405 Optical filter movable mechanism

Claims (15)

1.  A head-up display device that displays an image in a display area visually recognized through a windshield from a driver's seat of a vehicle, comprising:
     a vehicle information acquisition unit that acquires vehicle information detected by the vehicle;
     a control unit that controls display of the image;
     an image display device that generates the image;
     a mirror that reflects the image generated by the image display device and projects it onto the windshield;
     a mirror drive unit that changes an angle of the mirror; and
     a display distance adjustment mechanism that adjusts a display distance of the image with respect to the driver,
     wherein the control unit is capable of controlling the display area by controlling the image display device based on the vehicle information,
     the display area controlled by the control unit has a first display area and a second display area,
     the first display area is an area in which augmented reality display is possible, and
     the second display area is an area in which the augmented reality display is not performed.
2.  The head-up display device according to claim 1,
     wherein the second display area is an area above the first display area.
3.  The head-up display device according to claim 1,
     wherein the first display area displays first information, and
     the second display area displays second information having a lower priority than the first information.
4.  The head-up display device according to claim 3,
     wherein the first information is safe driving support information, which is information for supporting safe driving, and
     the second information is travel support information, which is information for supporting driving behavior.
5.  The head-up display device according to claim 4,
     wherein the safe driving support information displayed in the first display area includes a guidance display indicating a destination of travel of the vehicle, and
     the control unit generates the guidance display based on navigation information included in the vehicle information acquired by the vehicle information acquisition unit, and performs augmented reality display in which the generated guidance display is superimposed on a road onto which the vehicle turns left or a road onto which the vehicle turns right.
6.  The head-up display device according to claim 4 or 5,
     wherein the safe driving support information displayed in the first display area includes a pedestrian display indicating the presence of a pedestrian, and
     the control unit recognizes the pedestrian based on camera video information included in the vehicle information acquired by the vehicle information acquisition unit, and performs augmented reality display in which the pedestrian display is superimposed on the recognized pedestrian.
7.  The head-up display device according to claim 4,
     wherein the safe driving support information displayed in the first display area includes a guidance display indicating a destination of travel of the vehicle and a pedestrian display indicating the presence of a pedestrian, and
     the control unit generates the guidance display based on navigation information included in the vehicle information acquired by the vehicle information acquisition unit, performs augmented reality display in which the generated guidance display is superimposed on a road onto which the vehicle turns left or a road onto which the vehicle turns right, recognizes the pedestrian based on camera video information included in the vehicle information, performs augmented reality display superimposed on the recognized pedestrian, determines a degree of risk from the recognized position of the pedestrian, and, when the degree of risk is determined to be high, displays in the second display area the guidance display that would be displayed in the first display area.
8.  The head-up display device according to claim 4,
     wherein the safe driving support information displayed in the first display area includes a guidance display indicating a destination of travel of the vehicle and a pedestrian display indicating the presence of a pedestrian, and
     the control unit generates the guidance display based on navigation information included in the vehicle information acquired by the vehicle information acquisition unit, performs augmented reality display in which the generated guidance display is superimposed on a road onto which the vehicle turns left or a road onto which the vehicle turns right, recognizes the pedestrian based on camera video information included in the vehicle information, performs augmented reality display superimposed on the recognized pedestrian, determines a degree of risk from the recognized position of the pedestrian, and erases the guidance display when the degree of risk is determined to be high.
9.  The head-up display device according to claim 5,
     wherein the control unit determines whether the vehicle is turning left or turning right based on steering angle information, navigation information, and camera video information included in the vehicle information acquired by the vehicle information acquisition unit, and, when it is determined that the vehicle is turning left or turning right, displays in the second display area the guidance display that would be displayed in the first display area.
10.  The head-up display device according to any one of claims 5 to 8,
     wherein the travel support information displayed in the second display area includes a guide sign display indicating the presence of a guide sign, and
     the control unit recognizes the guide sign based on camera video information included in the vehicle information acquired by the vehicle information acquisition unit, and performs augmented reality display in which the guide sign display is superimposed on the recognized guide sign.
11.  A head-up display device that displays an image in a display area visually recognized through a windshield from a driver's seat of a vehicle, comprising:
     a vehicle information acquisition unit that acquires vehicle information detected by the vehicle;
     a control unit that controls display of the image;
     an image display device that generates the image;
     a mirror that reflects the image generated by the image display device and projects it onto the windshield;
     a mirror drive unit that changes an angle of the mirror; and
     a display distance adjustment mechanism that adjusts a display distance of the image with respect to the driver,
     wherein the control unit generates a guidance display indicating a destination of travel of the vehicle based on navigation information included in the vehicle information acquired by the vehicle information acquisition unit, and performs augmented reality display in which the generated guidance display is superimposed on a road onto which the vehicle turns left or a road onto which the vehicle turns right.
12.  The head-up display device according to claim 11,
     wherein the control unit recognizes a pedestrian based on camera video information included in the vehicle information acquired by the vehicle information acquisition unit, and performs augmented reality display in which a pedestrian display indicating the presence of the pedestrian is superimposed on the recognized pedestrian.
13.  The head-up display device according to claim 11 or 12,
     wherein the control unit determines whether another vehicle is present based on camera video information included in the vehicle information acquired by the vehicle information acquisition unit, and, when the other vehicle is present, generates a vehicle display indicating that the other vehicle is present and performs augmented reality display in which the generated vehicle display is superimposed on the other vehicle.
14.  The head-up display device according to claim 13,
     wherein the control unit generates the vehicle display so that a vehicle traveling in the direction opposite to that of the vehicle driven by the driver and a vehicle traveling in the same direction as the vehicle driven by the driver can be distinguished.
15.  The head-up display device according to any one of claims 11 to 13,
     wherein the control unit determines the meaning of a road sign, namely a regulatory sign or an instruction sign, based on camera video information included in the vehicle information acquired by the vehicle information acquisition unit, generates a regulation instruction display, which is a display indicating a regulation instruction corresponding to the determined meaning of the road sign, and performs augmented reality display in which the generated regulation instruction display is superimposed on the road to which the road sign applies.
PCT/JP2017/033683 2016-10-13 2017-09-19 Head-up display device WO2018070193A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016201687A JP2019217790A (en) 2016-10-13 2016-10-13 Head-up display device
JP2016-201687 2016-10-13

Publications (1)

Publication Number Publication Date
WO2018070193A1 true WO2018070193A1 (en) 2018-04-19

Family

ID=61905299

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/033683 WO2018070193A1 (en) 2016-10-13 2017-09-19 Head-up display device

Country Status (2)

Country Link
JP (1) JP2019217790A (en)
WO (1) WO2018070193A1 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019230271A1 (en) * 2018-05-29 2019-12-05 株式会社デンソー Display control device, display control program, and persistent tangible computer-readable recording medium therefor
JP2020056887A (en) * 2018-10-01 2020-04-09 本田技研工業株式会社 Display device, display control method, and program
WO2020085159A1 (en) * 2018-10-23 2020-04-30 日本精機株式会社 Display device
CN111476104A (en) * 2020-03-17 2020-07-31 重庆邮电大学 AR-HUD image distortion correction method, device and system under dynamic eye position
WO2020166252A1 (en) * 2019-02-14 2020-08-20 株式会社デンソー Display control device, display control program, and tangible, non-transitory computer-readable medium
JP2020132137A (en) * 2019-02-14 2020-08-31 株式会社デンソー Display controller and display control program
WO2020249367A1 (en) * 2019-06-13 2020-12-17 Volkswagen Aktiengesellschaft Control of a display of an augmented reality head-up display apparatus for a motor vehicle
WO2021002081A1 (en) * 2019-07-02 2021-01-07 株式会社デンソー Display control device and display control program
JP2021009133A (en) * 2019-07-02 2021-01-28 株式会社デンソー Display control device and display control program
US20210215499A1 (en) * 2018-06-01 2021-07-15 Volkswagen Aktiengesellschaft Method for Calculating an Augmented Reality Overlay for Displaying a Navigation Route on an AR Display Unit, Device for Carrying Out the Method, Motor Vehicle and Computer Program
CN113677553A (en) * 2019-04-11 2021-11-19 三菱电机株式会社 Display control device and display control method
CN113784861A (en) * 2019-05-15 2021-12-10 日产自动车株式会社 Display control method and display control device
WO2022137558A1 (en) 2020-12-25 2022-06-30 日産自動車株式会社 Information processing device and information processing method
FR3118728A1 (en) * 2021-01-12 2022-07-15 Psa Automobiles Sa Motor vehicle comprising an ADAS system coupled to an augmented reality display system of said vehicle.
EP4057251A1 (en) * 2021-03-10 2022-09-14 Yazaki Corporation Vehicular display apparatus
CN115431764A (en) * 2022-10-10 2022-12-06 江苏泽景汽车电子股份有限公司 AR scale display method and device, electronic equipment and storage medium

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7061938B2 (en) * 2018-07-17 2022-05-02 三菱電機株式会社 Driving support device and driving support method
JP7200970B2 (en) * 2020-04-17 2023-01-10 トヨタ自動車株式会社 vehicle controller
CN111561938A (en) * 2020-05-28 2020-08-21 北京百度网讯科技有限公司 AR navigation method and device
JPWO2022123922A1 (en) * 2020-12-11 2022-06-16
JP2023076002A (en) 2021-11-22 2023-06-01 トヨタ自動車株式会社 image display system
WO2023145851A1 (en) * 2022-01-28 2023-08-03 日本精機株式会社 Display device
WO2024034053A1 (en) * 2022-08-10 2024-02-15 マクセル株式会社 Information processing device and information processing method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005199992A (en) * 2003-12-17 2005-07-28 Denso Corp Vehicle information display system
JP2005292031A (en) * 2004-04-02 2005-10-20 Denso Corp Vehicular information display device and system, and program
JP2007288657A (en) * 2006-04-19 2007-11-01 Toyota Motor Corp Display apparatus for vehicle, and display method of the display apparatus for vehicle
WO2014129017A1 (en) * 2013-02-22 2014-08-28 クラリオン株式会社 Head-up display apparatus for vehicle
JP2015134521A (en) * 2014-01-16 2015-07-27 三菱電機株式会社 vehicle information display control device
JP2016107731A (en) * 2014-12-04 2016-06-20 日本精機株式会社 Head-up display device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005199992A (en) * 2003-12-17 2005-07-28 Denso Corp Vehicle information display system
JP2005292031A (en) * 2004-04-02 2005-10-20 Denso Corp Vehicular information display device and system, and program
JP2007288657A (en) * 2006-04-19 2007-11-01 Toyota Motor Corp Display apparatus for vehicle, and display method of the display apparatus for vehicle
WO2014129017A1 (en) * 2013-02-22 2014-08-28 クラリオン株式会社 Head-up display apparatus for vehicle
JP2015134521A (en) * 2014-01-16 2015-07-27 三菱電機株式会社 vehicle information display control device
JP2016107731A (en) * 2014-12-04 2016-06-20 日本精機株式会社 Head-up display device

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019206256A (en) * 2018-05-29 2019-12-05 株式会社デンソー Display control device and display control program
US11803053B2 (en) 2018-05-29 2023-10-31 Denso Corporation Display control device and non-transitory tangible computer-readable medium therefor
WO2019230271A1 (en) * 2018-05-29 2019-12-05 株式会社デンソー Display control device, display control program, and persistent tangible computer-readable recording medium therefor
US20210215499A1 (en) * 2018-06-01 2021-07-15 Volkswagen Aktiengesellschaft Method for Calculating an Augmented Reality Overlay for Displaying a Navigation Route on an AR Display Unit, Device for Carrying Out the Method, Motor Vehicle and Computer Program
US11629972B2 (en) * 2018-06-01 2023-04-18 Volkswagen Aktiengesellschaft Method for calculating an augmented reality overlay for displaying a navigation route on an AR display unit, device for carrying out the method, motor vehicle and computer program
US10996479B2 (en) 2018-10-01 2021-05-04 Honda Motor Co., Ltd. Display device, display control method, and storage medium
JP2020056887A (en) * 2018-10-01 2020-04-09 本田技研工業株式会社 Display device, display control method, and program
WO2020085159A1 (en) * 2018-10-23 2020-04-30 日本精機株式会社 Display device
JP2020132137A (en) * 2019-02-14 2020-08-31 株式会社デンソー Display controller and display control program
JP7063316B2 (en) 2019-02-14 2022-05-09 株式会社デンソー Display control device and display control program
WO2020166252A1 (en) * 2019-02-14 2020-08-20 株式会社デンソー Display control device, display control program, and tangible, non-transitory computer-readable medium
CN113677553A (en) * 2019-04-11 2021-11-19 三菱电机株式会社 Display control device and display control method
CN113784861B (en) * 2019-05-15 2023-01-17 日产自动车株式会社 Display control method and display control device
CN113784861A (en) * 2019-05-15 2021-12-10 日产自动车株式会社 Display control method and display control device
CN113924518A (en) * 2019-06-13 2022-01-11 大众汽车股份公司 Controlling display content of an augmented reality head-up display device of a motor vehicle
WO2020249367A1 (en) * 2019-06-13 2020-12-17 Volkswagen Aktiengesellschaft Control of a display of an augmented reality head-up display apparatus for a motor vehicle
JP7173078B2 (en) 2019-07-02 2022-11-16 株式会社デンソー Display control device and display control program
JP2021009133A (en) * 2019-07-02 2021-01-28 株式会社デンソー Display control device and display control program
WO2021002081A1 (en) * 2019-07-02 2021-01-07 株式会社デンソー Display control device and display control program
CN111476104B (en) * 2020-03-17 2022-07-01 重庆邮电大学 AR-HUD image distortion correction method, device and system under dynamic eye position
CN111476104A (en) * 2020-03-17 2020-07-31 重庆邮电大学 AR-HUD image distortion correction method, device and system under dynamic eye position
WO2022137558A1 (en) 2020-12-25 2022-06-30 日産自動車株式会社 Information processing device and information processing method
FR3118728A1 (en) * 2021-01-12 2022-07-15 Psa Automobiles Sa Motor vehicle comprising an ADAS system coupled to an augmented reality display system of said vehicle.
WO2022152981A1 (en) * 2021-01-12 2022-07-21 Psa Automobiles Sa Motor vehicle comprising an adas coupled to an augmented-reality display system of said vehicle
US20220292838A1 (en) * 2021-03-10 2022-09-15 Yazaki Corporation Vehicular display apparatus
EP4057251A1 (en) * 2021-03-10 2022-09-14 Yazaki Corporation Vehicular display apparatus
CN115431764A (en) * 2022-10-10 2022-12-06 江苏泽景汽车电子股份有限公司 AR scale display method and device, electronic equipment and storage medium
CN115431764B (en) * 2022-10-10 2023-11-17 江苏泽景汽车电子股份有限公司 AR scale display method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
JP2019217790A (en) 2019-12-26

Similar Documents

Publication Publication Date Title
WO2018070193A1 (en) Head-up display device
JP7437449B2 (en) Image projection device and image projection method
JP6818100B2 (en) Projection type display device
JP6629889B2 (en) Head-up display device
US10866415B2 (en) Head-up display apparatus
US20170161009A1 (en) Vehicular display device
JP2019059248A (en) Head-up display device
JP2016055691A (en) Vehicular display system
WO2017134861A1 (en) Head-up display device
JP2019113809A (en) Head-up display device
JP3931343B2 (en) Route guidance device
JP4692595B2 (en) Information display system for vehicles
JP2019059247A (en) Head-up display device
JP2005127995A (en) Route guidance system, method, and program
JP7111582B2 (en) head up display system
JP6801508B2 (en) Head-up display device
JP2015074391A (en) Head-up display device
JP2018159738A (en) Virtual image display device
JP6872441B2 (en) Head-up display device
JP7344635B2 (en) heads up display device
JP6384529B2 (en) Visibility control device
JP2019202641A (en) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17859589

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17859589

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP