WO2021010123A1 - Head-up display device - Google Patents

Head-up display device

Info

Publication number
WO2021010123A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
eye
light
illumination light
Prior art date
Application number
PCT/JP2020/024947
Other languages
English (en)
Japanese (ja)
Inventor
小林 建
Original Assignee
株式会社JVCケンウッド (JVCKENWOOD Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019131604A (published as JP2021017064A)
Priority claimed from JP2019138707A (published as JP2021022851A)
Application filed by 株式会社JVCケンウッド (JVCKENWOOD Corporation)
Publication of WO2021010123A1

Classifications

    • B60K35/22 Display screens
    • B60K35/231 Head-up displays [HUD] characterised by their arrangement or structure for integration into vehicles
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • G02B27/01 Head-up displays
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G09G3/20 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/34 Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes, for presentation of an assembly of a number of characters by composing the assembly by combination of individual elements arranged in a matrix, by control of light from an independent source
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators, characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/38 Control arrangements or circuits characterised by the display of a graphic pattern, with means for controlling the display position
    • H04N13/122 Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N13/128 Adjusting depth or disparity
    • H04N13/346 Image reproducers using prisms or semi-transparent mirrors
    • H04N13/363 Image reproducers using image projection screens
    • H04N13/366 Image reproducers using viewer tracking

Definitions

  • the present invention relates to a head-up display device.
  • a head-up display device may be used as a display device for vehicles.
  • the head-up display device projects the image display light onto the windshield of the vehicle or the like, and superimposes and displays a virtual image based on the image display light on the scenery outside the vehicle.
  • a device for presenting a stereoscopic image by separately generating image display lights for the left eye and the right eye has also been proposed (see, for example, Patent Document 1).
  • When the height positions of the left eye and the right eye differ, for example because the user viewing the virtual image tilts his or her head, the height position at which the left-eye virtual image is visually recognized by the left eye and the height position at which the right-eye virtual image is visually recognized by the right eye can differ, causing vertical parallax. If the vertical parallax of the virtual image visually recognized by the user is too large, it becomes difficult to recognize the images seen by the left and right eyes as the same image, which may cause visual fatigue.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique for reducing the vertical parallax of a virtual image visually recognized by a user.
  • A head-up display device according to an aspect of the present invention includes: an illumination unit that projects first illumination light along a first projection axis so as to be reflected by the windshield toward the left side region of an eyebox, and projects second illumination light along a second projection axis so as to be reflected by the windshield toward the right side region of the eyebox; a display unit arranged at the intersection of the first projection axis and the second projection axis, which modulates the first illumination light into display light for the left eye and the second illumination light into display light for the right eye; an image processing unit that generates a left-eye image and a right-eye image corrected so that the positions of their respective display contents are relatively shifted in the vertical direction; a lighting control unit that controls the operation of the illumination unit so that the first illumination light and the second illumination light are generated in a time-division manner; and a display control unit that causes the display unit to display the left-eye image when the first illumination light is generated and to display the right-eye image when the second illumination light is generated.
  • According to this aspect, the vertical parallax of the virtual image visually recognized by the user can be reduced.
  • FIG. 1 is a diagram schematically showing the structure of the head-up display device according to the first embodiment. FIG. 2 is a diagram schematically showing the virtual image visually recognized by a user.
  • FIGS. 3(a) and 3(b) are diagrams schematically showing the optical paths of the display light.
  • FIGS. 4(a) and 4(b) are diagrams schematically showing the height positions of both eyes. FIG. 5 is a diagram schematically showing the vertical parallax of a virtual image caused by the difference in the height positions of both eyes.
  • FIGS. 6(a) to 6(c) are diagrams schematically showing an image displayed on the display unit in a comparative example and the virtual images presented based on that image.
  • FIGS. 7(a) to 7(c) are diagrams schematically showing an image displayed on the display unit and the virtual images presented based on that image in the present embodiment. FIGS. 8 to 10 are diagrams showing the configuration of the head-up display device according to the first embodiment in detail. FIGS. 11(a) to 11(c) are front views schematically showing the configuration of the light source and the collimator and the configuration of the fly-eye lens. FIG. 12 is a diagram schematically showing the operation of the light source and the display element.
  • FIGS. 15(a) and 15(b) are diagrams showing examples of a display image and an illumination image. Further figures schematically show the operation of the light source and the display element, and the structure of the head-up display device according to the third embodiment.
  • FIG. 1 is a diagram schematically showing the configuration of the head-up display device 10 according to the first embodiment.
  • the head-up display device 10 is installed in the dashboard of the vehicle 60, which is an example of a moving body.
  • the head-up display device 10 projects the display light 52 onto the windshield 62, which is a virtual image presentation plate, and presents the virtual image 50 in front of the vehicle 60 in the traveling direction (right direction in FIG. 1).
  • A user 70 such as a driver can visually recognize the virtual image 50 superimposed on the real landscape through the windshield 62. Therefore, the user 70 can obtain the information shown in the virtual image 50 with almost no movement of the line of sight while the vehicle is traveling.
  • the traveling direction (front-rear direction) of the vehicle 60 is the z direction
  • the top-bottom direction (vertical direction) of the vehicle 60 is the y direction
  • the left-right direction of the vehicle 60 is the x direction.
  • FIG. 2 is a diagram schematically showing a virtual image 50 visually recognized by the user 70.
  • the virtual image 50 is presented, for example, at a distance of about 2 m to 5 m in front of the user 70.
  • The angle of view (FOV: Field of View) corresponding to the display size of the virtual image 50 is, for example, about 5 degrees in the horizontal direction and about 2 degrees in the vertical direction.
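  • As a rough geometric cross-check (an illustration added here, not part of the patent text), the apparent size of the virtual image follows from the presentation distance and the field of view:

        # Rough geometry: width = 2*D*tan(FOV_h/2), height = 2*D*tan(FOV_v/2).
        import math

        def virtual_image_size(distance_m, fov_h_deg, fov_v_deg):
            """Apparent width and height (m) of a virtual image at distance_m."""
            w = 2 * distance_m * math.tan(math.radians(fov_h_deg / 2))
            h = 2 * distance_m * math.tan(math.radians(fov_v_deg / 2))
            return w, h

        # At 3 m with the 5 x 2 degree FOV above: roughly 0.26 m x 0.10 m.
        print(virtual_image_size(3.0, 5.0, 2.0))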
  • The head-up display device 10 includes an illumination unit 12, a display unit 14, a projection mirror 16, a camera 17, and a control device 18.
  • the head-up display device 10 is connected to an external device 64.
  • the lighting unit 12 generates illumination light for illuminating the display unit 14.
  • the illumination unit 12 includes a light source such as an LED (Light Emitting Diode) and an optical element for adjusting the intensity distribution and the angle distribution of the light emitted from the light source to generate illumination light.
  • the illumination unit 12 is configured to generate visible illumination light for generating the display light 52 and infrared illumination light for illuminating the user 70 to be imaged by the camera 17.
  • the display unit 14 modulates the illumination light to generate the display light.
  • the display unit 14 includes a display element such as a liquid crystal panel.
  • the display unit 14 generates the display light 52 of the display content corresponding to the image signal based on the image signal transmitted from the control device 18.
  • the projection mirror 16 reflects the display light 52 generated by the display unit 14 and projects it toward the windshield 62.
  • the projection mirror 16 projects the display light 52 so that the display light 52 reflected by the windshield 62 is directed toward the eye box 76 in which the eyes 72 of the user 70 are located.
  • the user 70 can visually recognize the virtual image 50 based on the display light 52.
  • the projection mirror 16 is configured as a concave mirror in order to magnify and present an image based on the display light 52 to the user 70.
  • the reflecting surface of the projection mirror 16 is composed of a free curved surface such as an aspherical surface.
  • The camera 17 is arranged so as to image the face of the user 70 along an imaging axis 56 different from the projection axis 54 of the display light 52.
  • By shifting the imaging axis 56 of the camera 17 from the projection axis 54, the eyes 72 behind the lenses 74 of glasses or sunglasses can be suitably imaged even when the user 70 wears glasses or sunglasses.
  • When the infrared illumination light is projected onto the user 70 along the projection axis 54, a part of the illumination light is specularly reflected at the surface of the lens 74.
  • When the lens 74 faces the line-of-sight direction 78 of the user 70, the illumination light specularly reflected at the surface of the lens 74 travels back generally along the projection axis 54.
  • If the camera were placed on the projection axis 54, the intensity of the infrared light specularly reflected by the lens 74 toward the camera would be high, and the eyes 72 of the user 70 located behind the lens 74 could not be imaged appropriately.
  • By shifting the imaging axis 56 of the camera 17 from the projection axis 54, the infrared light specularly reflected by the lens 74 can be prevented from heading toward the camera 17, and the eyes 72 of the user 70 behind the lens 74 can be imaged appropriately.
  • the camera 17 is provided near the lower end 62a of the windshield 62, and is provided, for example, on the dashboard of the vehicle 60.
  • The camera 17 is arranged at a position from which it images the face of the user 70 from diagonally below and in front with respect to the vertical (height) direction, and the imaging axis 56 of the camera 17 is arranged so as to be located below the projection axis 54 in the range from the windshield 62 to the eyebox 76.
  • The angle of the imaging axis 56 with respect to the horizontal plane is larger than the angle of the projection axis 54 with respect to the horizontal plane, and is set to, for example, about 10 to 20 degrees.
  • The camera 17 is arranged so that the imaging axis 56 is located between the windshield 62 and the projection mirror 16. Further, the camera 17 is arranged at a position as far as possible from the user 70, farther from the user 70 than the projection axis 54 in the range from the projection mirror 16 toward the windshield 62.
  • The camera 17 is arranged, for example, between the projection mirror 16 and the windshield 62, so that the projection axis 54 in the range from the projection mirror 16 to the windshield 62 is included in the angle of view of the camera 17.
  • the camera 17 is provided with an infrared light filter that selectively transmits the wavelength of the infrared illumination light generated by the illumination unit 12.
  • the transmission wavelength of the infrared light filter provided in the camera 17 is, for example, around 850 nm or around 940 nm, but is not limited thereto.
  • the camera 17 may be configured to detect only infrared light, or may be configured to detect both visible and infrared light.
  • In this case, a red filter, a green filter, a blue filter, and an infrared light filter may be provided for each set of four adjacent pixels of the camera 17.
  • the control device 18 controls the overall operation of the head-up display device 10.
  • The control device 18 is realized in hardware by elements and mechanical devices such as the CPU and memory of a computer, and in software by a computer program or the like. The various functions provided by the control device 18 can be realized by the cooperation of hardware and software.
  • the control device 18 generates a display image and controls the operations of the illumination unit 12 and the display unit 14 so that the virtual image 50 corresponding to the display image is presented.
  • the control device 18 is connected to the external device 64, and for example, generates a display image based on the information from the external device 64.
  • the control device 18 controls the operations of the illumination unit 12 and the display unit 14 so that the illumination light of infrared light is projected toward the face of the user 70.
  • the control device 18 may acquire an image captured by the camera 17 and analyze the acquired image to monitor the state of the user 70.
  • the external device 64 is a device that generates the original data of the display image displayed as the virtual image 50.
  • the external device 64 is, for example, an electronic control unit (ECU) of the vehicle 60, a navigation device, a mobile device such as a mobile phone, a smartphone, or a tablet.
  • the external device 64 transmits to the control device 18 image data necessary for displaying the virtual image 50, information indicating the content and type of the image data, and information about the vehicle 60 such as the speed and the current position of the vehicle 60.
  • 3 (a) and 3 (b) are diagrams schematically showing the optical paths of the display lights 52L and 52R.
  • 3 (a) and 3 (b) show the configuration of the head-up display device 10 as viewed from above the vehicle 60, and the lighting unit 12 and the windshield 62 are omitted for the sake of clarity.
  • The display lights 52L and 52R shown in FIGS. 3(a) and 3(b) are each emitted from the display unit 14, reflected by the projection mirror 16 and the windshield 62 (not shown), and then head toward the eye box 76.
  • the eye box 76 is a range in which the virtual image 50 presented by the head-up display device 10 can be visually recognized.
  • the eye box 76 is set long in the left-right direction (x direction) so as to include the left eye 72L and the right eye 72R of the user 70.
  • the length of the eye box 76 in the left-right direction is set to, for example, about twice the pupillary distance of both eyes 72L and 72R of the user 70.
  • the pupillary distance between the eyes 72L and 72R varies from person to person, but is generally about 60 mm to 70 mm.
  • The length of the eye box 76 in the left-right direction is set to, for example, about 120 mm to 130 mm.
  • In the normal usage mode shown in FIGS. 3(a) and 3(b), the right eye 72R of the user 70 is located in the right side region 76R of the eyebox 76, and the left eye 72L of the user 70 is located in the left side region 76L of the eyebox 76.
  • the projection axis 54 is defined as an optical axis that exits from the center 14C of the display unit 14 and heads toward the center 76C of the eyebox 76.
  • the projection mirror 16 is configured to have a shape symmetrical with respect to the projection axis 54 in the left-right direction (x direction).
  • FIG. 3A schematically shows the display light 52L for the left eye toward the left side region 76L of the eye box 76.
  • the display light 52L for the left eye is display light emitted obliquely with respect to the projection axis 54, and is emitted to the left from the center 14C of the display unit 14 along the first projection axis 54L.
  • the display light 52L for the left eye is reflected at or near the position 16L deviated to the left from the center 16C of the projection mirror 16.
  • When the left-eye display light 52L is incident on the left eye 72L of the user 70, the left-eye virtual image 50L based on the left-eye display light 52L is visually recognized by the user 70.
  • FIG. 3B schematically shows the display light 52R for the right eye toward the right region 76R of the eye box 76.
  • the display light 52R for the right eye is the display light emitted obliquely with respect to the projection axis 54, and is emitted to the right from the center 14C of the display unit 14 along the second projection axis 54R.
  • the display light 52R for the right eye is reflected at or near the position 16R shifted to the right from the center 16C of the projection mirror 16.
  • the virtual image 50R for the right eye based on the display light 52R for the right eye is visually recognized by the user 70.
  • FIG. 4 (a) and 4 (b) are diagrams schematically showing the height positions of both eyes 72L and 72R.
  • FIG. 4A shows a case where the height positions of the left eye 72L and the right eye 72R of the user 70 are the same.
  • FIG. 4(b) shows a case where the height positions of the left eye 72L and the right eye 72R of the user 70 are different, with the right eye 72R higher than the left eye 72L by Δh1.
  • The difference Δh1 between the height positions of the left eye 72L and the right eye 72R of the user 70 arises, for example, when the user 70 tilts his or her head.
  • The pupillary distance w between the left eye 72L and the right eye 72R of the user 70 varies from person to person, but is generally about 60 mm to 70 mm.
  • FIG. 5 is a diagram schematically showing the vertical parallax of the virtual images 50L and 50R caused by the difference Δh1 in the height positions of both eyes 72L and 72R.
  • FIG. 5 shows a case where the positions of both eyes 72L and 72R of the user 70 are in the state of FIG. 4(b), that is, where the right eye 72R is higher than the left eye 72L by Δh1.
  • The left eye 72L of the user 70 visually recognizes the left-eye virtual image 50L based on the left-eye display light 52L.
  • The right eye 72R of the user 70 visually recognizes the right-eye virtual image 50R based on the right-eye display light 52R.
  • The height positions of the left-eye virtual image 50L and the right-eye virtual image 50R are not necessarily the same, and a difference Δh2 in height position may occur.
  • In the illustrated case, the right-eye virtual image 50R is presented at a position higher than the left-eye virtual image 50L by Δh2.
  • The user 70 perceives the difference Δh2 in the height positions of the virtual images 50L and 50R visually recognized by the left eye 72L and the right eye 72R as a vertical parallax (angular difference) δ.
  • The magnitude of the vertical parallax δ can be obtained from the difference Δh1 in the height positions of both eyes 72L and 72R once the design of the optical system of the head-up display device 10 is determined. For example, as the height difference Δh1 between the eyes 72L and 72R increases, the vertical parallax δ also tends to increase.
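  • As a minimal numerical sketch (an added illustration; the gain factor below is an assumed placeholder, not a value from the patent), the parallax can be estimated by mapping Δh1 to a virtual-image height difference Δh2 and dividing by the virtual-image distance:

        # Small-angle estimate: if the optics map an eye-height difference dh1
        # to a virtual-image height difference dh2 = k * dh1 (k depends on the
        # optical design), the vertical parallax is roughly dh2 / D radians.
        def vertical_parallax_rad(dh1_m, gain_k, distance_m):
            """Approximate vertical parallax for an eye-height difference dh1."""
            dh2 = gain_k * dh1_m          # virtual-image height difference (m)
            return dh2 / distance_m       # small-angle approximation

        # Example with an assumed k = 1: dh1 = 3 mm at D = 3 m -> 0.001 rad (1 mrad).
        print(vertical_parallax_rad(0.003, 1.0, 3.0))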
  • FIGS. 6(a) to 6(c) are diagrams schematically showing a display image 140 displayed on the display unit 14 in a comparative example and the virtual images 150L, 150R, and 150 presented based on the display image 140.
  • FIG. 6(a) shows an example of the display image 140 displayed on the display unit 14, namely the display image 140 for presenting the virtual image 50 shown in FIG. 2.
  • FIG. 6(b) shows the left-eye virtual image 150L and the right-eye virtual image 150R presented when the display image 140 of FIG. 6(a) is displayed on the display unit 14.
  • FIG. 6(c) shows the virtual image 150 as viewed with both eyes.
  • In FIG. 6(b), the height positions of the left-eye virtual image 150L and the right-eye virtual image 150R are not the same, and a difference Δh2 in height position occurs.
  • The right-eye virtual image 150R is presented at a position higher than the left-eye virtual image 150L by Δh2.
  • FIG. 6(c) schematically shows the virtual image 150 visually recognized by the user 70 with both eyes 72L and 72R, in which the left-eye virtual image 150L and the right-eye virtual image 150R shown in FIG. 6(b) are superimposed.
  • Since the left-eye virtual image 150L and the right-eye virtual image 150R are presented at different height positions for the left and right eyes, it is difficult to fuse and perceive the two as a single image. If such a left-eye virtual image 150L and right-eye virtual image 150R are forcibly fused, visual fatigue and dizziness may occur, which is not preferable.
  • If the vertical parallax δ of the virtual images 50L and 50R perceived by the user is too large, the user may feel dizzy.
  • An example of the upper limit of the vertical parallax δ that can be tolerated by the user is 1 × 10^-3 rad (that is, 1 mrad).
  • The difference Δh1 in the height positions of the eyes 72L and 72R at which the vertical parallax δ reaches 1 mrad is about 2 mm to 5 mm, although it depends on the design of the optical system of the head-up display device 10. Converted into the tilt angle θ of the line connecting both eyes 72L and 72R, this corresponds to about 2 to 5 degrees. Therefore, even if the user 70 tilts his or her head by a slight angle of about 2 to 5 degrees, an unacceptable vertical parallax δ may occur.
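  • A back-of-envelope check of the figures above (added for illustration, assuming an interpupillary distance of about 65 mm):

        import math

        w = 0.065                                 # assumed pupillary distance (m)
        for dh1 in (0.002, 0.005):                # 2 mm and 5 mm height difference
            theta = math.degrees(math.asin(dh1 / w))
            print(f"dh1 = {dh1 * 1000:.0f} mm -> head tilt of about {theta:.1f} deg")
        # Prints about 1.8 and 4.4 degrees, consistent with the 2-5 degree range.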
  • In the present embodiment, therefore, a left-eye image and a right-eye image are prepared separately as the display images to be displayed on the display unit 14, and the images are corrected so that the positions of the display contents of the left-eye image and the right-eye image are relatively shifted in the vertical direction. Specifically, the positions of the display contents of the left-eye image and the right-eye image are intentionally shifted in the vertical direction so that the vertical parallax δ caused by the height-position difference Δh1 between the eyes 72L and 72R of the user 70 is reduced or canceled.
  • FIG. 7A schematically shows an image 40L for the left eye and an image 40R for the right eye to be displayed on the display unit 14.
  • the left-eye image 40L and the right-eye image 40R have the same display contents, but are corrected so that the positions of the display contents are relatively shifted in the vertical direction.
  • The position of the display content included in the left-eye image 40L is shifted upward in the image by Δh3 from the position of the display content included in the right-eye image 40R.
  • FIG. 7(b) schematically shows the left-eye virtual image 50L presented by the left-eye display light 52L when the left-eye image 40L is displayed on the display unit 14, and the right-eye virtual image 50R presented by the right-eye display light 52R when the right-eye image 40R is displayed on the display unit 14.
  • The height positions of the left-eye virtual image 50L and the right-eye virtual image 50R themselves do not match, and a difference Δh2 in height position occurs.
  • However, since the positions of the display contents of the left-eye image 40L and the right-eye image 40R are relatively shifted in the vertical direction, the height positions of the display contents presented by the left-eye virtual image 50L and the right-eye virtual image 50R match.
  • FIG. 7C schematically shows a virtual image 50 visually recognized by the user 70 with both eyes 72L and 72R, and the left-eye virtual image 50L and the right-eye virtual image 50R shown in FIG. 7B are superimposed.
  • the height positions of the display contents presented by the virtual image 50L for the left eye and the virtual image 50R for the right eye are aligned, it is easy to fuse and identify the two.
  • By presenting such a virtual image 50, it is possible to prevent the occurrence of visual fatigue and dizziness and to improve the visibility of the virtual image 50.
  • the configuration of the optical system of the illumination unit 12 and the display unit 14 for presenting the virtual image 50 as shown in FIG. 7C will be described.
  • In the present embodiment, the left-eye image 40L and the right-eye image 40R are alternately displayed in a time-division manner on the single display unit 14, and the left-eye display light 52L and the right-eye display light 52R are alternately generated in a time-division manner.
  • To this end, the display unit 14 is alternately irradiated, in a time-division manner, with the first illumination light for generating the left-eye display light 52L and the second illumination light for generating the right-eye display light 52R. As a result, the left-eye display light 52L is projected toward the left side region 76L of the eye box 76, and the right-eye display light 52R is projected toward the right side region 76R of the eye box 76.
  • the illumination unit 12 and the display unit 14 are configured so as to further project infrared illumination light in a time-division manner.
  • FIG. 8 to 10 are diagrams showing in detail the configuration of the head-up display device 10 according to the first embodiment.
  • FIG. 8 shows an operating state in which the left-eye image 40L is displayed on the display unit 14 to generate the left-eye display light 52L, corresponding to the configuration of FIG. 3A.
  • FIG. 9 shows an operating state in which the right-eye image 40R is displayed on the display unit 14 to generate the right-eye display light 52R, corresponding to the configuration of FIG. 3B.
  • FIG. 10 shows an operating state in which the infrared illumination light 53 is generated.
  • the illumination unit 12 includes a light source 20, a collimator 23, a fly-eye lens 24, a condenser 30, and a field lens 32.
  • the display unit 14 includes a light diffusing plate 34 and a display element 36.
  • Each optical element constituting the illumination unit 12 and the display unit 14 and the projection mirror 16 are arranged on the projection axis 54.
  • the direction of the projection axis 54 is the z direction, and the two directions orthogonal to the z direction are the x direction and the y direction.
  • the x direction is defined by the left-right direction of the image displayed on the display unit 14, and the y direction is defined by the vertical direction of the image displayed on the display unit 14.
  • The projection axis 54, the first projection axis 54L, and the second projection axis 54R are located in the same plane.
  • the light source 20 includes a first visible light source 21a, a second visible light source 21b, a first infrared light source 22a, and a second infrared light source 22b.
  • Each light source 21a, 21b, 22a, 22b is composed of a semiconductor light emitting element such as an LED.
  • the first visible light source 21a and the first infrared light source 22a are arranged on the right side of the projection axis 54, and are arranged so as to emit light toward the first region 24a of the fly-eye lens 24.
  • The second visible light source 21b is arranged on the left side of the projection axis 54, and is arranged so as to emit light toward the second region 24b of the fly-eye lens 24.
  • the first visible light source 21a and the second visible light source 21b output white light for generating display lights 52L and 52R.
  • the first infrared light source 22a and the second infrared light source 22b output infrared light for generating infrared illumination light.
  • the emission wavelengths of the first infrared light source 22a and the second infrared light source 22b are, for example, 850 nm or 940 nm, but are not limited thereto. It is known that the wavelength near 940 nm has lower intensity than other wavelengths in the spectrum of sunlight. Therefore, by projecting a wavelength near 940 nm onto the user 70 as illumination light and taking an image with the camera 17, the user 70 can be imaged in an environment less affected by sunlight.
  • the collimator 23 parallelizes the output light of the light source 20 and generates a parallel luminous flux along the projection axis 54.
  • the collimator 23 includes a first reflecting surface 23a and a second reflecting surface 23b.
  • the first reflecting surface 23a and the second reflecting surface 23b are formed of a paraboloid surface or an ellipsoidal surface.
  • The first visible light source 21a and the first infrared light source 22a are arranged at the focal position of the first reflecting surface 23a, and the second visible light source 21b and the second infrared light source 22b are arranged at the focal position of the second reflecting surface 23b.
  • the first reflecting surface 23a parallelizes the light emitted from the first visible light source 21a and the first infrared light source 22a and causes them to enter the first region 24a of the fly-eye lens 24.
  • the second reflecting surface 23b parallelizes the light emitted from the second visible light source 21b and the second infrared light source 22b and causes them to enter the second region 24b of the fly-eye lens 24.
  • the collimator 23 may be composed of a lens instead of a mirror, and may be, for example, a TIR (Total Internal Reflection) lens that utilizes total internal reflection.
  • the fly-eye lens 24 divides the emitted light of the light source 20 into a plurality of illumination luminous fluxes.
  • The fly-eye lens 24 has a first lens surface 25 and a second lens surface 26, and a plurality of lens elements are arrayed in the x direction and the y direction on each of the first lens surface 25 and the second lens surface 26.
  • Each lens element constituting the fly-eye lens 24 has a rectangular shape, and is configured to have a similar shape to the rectangular display unit 14. For example, if the aspect ratio of the display unit 14 is 1: 2, the aspect ratio of each lens element constituting the fly-eye lens 24 is also 1: 2.
  • Each lens element of the first lens surface 25 collects a parallel light flux incident on the first lens surface 25.
  • the focal point of each lens element of the first lens surface 25 is located at the corresponding lens element of the second lens surface 26. That is, the distance between the first lens surface 25 and the second lens surface 26 in the z direction corresponds to the focal length of each lens element of the first lens surface 25.
  • the parallel light flux incident on the first lens surface 25 is focused on each lens element of the second lens surface 26.
  • Each lens element of the second lens surface 26 can be regarded as a virtual point light source, and a divided illumination light flux is emitted from each lens element of the second lens surface 26.
  • FIG. 11A shows the light source 20 and the collimator 23 as seen from the fly-eye lens 24.
  • The first visible light source 21a and the first infrared light source 22a are arranged on the right side of the projection axis 54, adjacent to each other in the left-right direction.
  • the second visible light source 21b and the second infrared light source 22b are arranged on the left side of the projection axis 54, and are arranged adjacent to each other in the left-right direction.
  • The first visible light source 21a and the first infrared light source 22a may instead be arranged adjacent to each other in the vertical direction.
  • the second visible light source 21b and the second infrared light source 22b may be arranged adjacent to each other in the vertical direction.
  • the first reflecting surface 23a and the second reflecting surface 23b have a shape in which two paraboloids or ellipsoids are connected by being displaced in the left-right direction.
  • No reflecting surface is provided at the connection between the first reflecting surface 23a and the second reflecting surface 23b, that is, between the first visible light source 21a and the second visible light source 21b (or between the first infrared light source 22a and the second infrared light source 22b).
  • FIG. 11B shows the fly-eye lens 24 as seen from the first lens surface 25.
  • In FIG. 11(b), the left-right direction is reversed with respect to FIG. 11(a).
  • a plurality of lens elements 25a and 25b are arranged in the x-direction and the y-direction on the first lens surface 25.
  • a plurality of first lens elements 25a are provided in the first region 24a of the fly-eye lens 24, and a plurality of second lens elements 25b are provided in the second region 24b of the fly-eye lens 24.
  • the fly-eye lens 24 has a symmetrical shape, and the first lens element 25a and the second lens element 25b have the same shape or optical characteristics as each other.
  • FIG. 11 (c) shows the fly-eye lens 24 as seen from the second lens surface 26.
  • In FIG. 11(c), the left-right direction is reversed with respect to FIG. 11(b).
  • the second lens surface 26 is configured in the same manner as the first lens surface 25, and a plurality of lens elements 26a and 26b are arranged on the second lens surface 26 in the x-direction and the y-direction.
  • a plurality of third lens elements 26a are provided in the first region 24a of the fly-eye lens 24, and a plurality of fourth lens elements 26b are provided in the second region 24b of the fly-eye lens 24.
  • the third lens element 26a and the fourth lens element 26b have the same shape or optical characteristics as each other.
  • a plurality of illumination light fluxes are emitted from the second lens surface 26. Specifically, a plurality of first illumination light fluxes are emitted from the first region 24a of the second lens surface 26, and each third lens element 26a emits one first illumination luminous flux. Similarly, a plurality of second illumination light fluxes are emitted from the second region 24b of the second lens surface 26, and each fourth lens element 26b emits one second illumination light flux.
  • The light source 20 is switched among a first state in which only the first visible light source 21a is turned on, a second state in which only the second visible light source 21b is turned on, and a third state in which only the first infrared light source 22a and the second infrared light source 22b are turned on.
  • In the first state, a plurality of first visible illumination light fluxes are emitted from the first region 24a of the fly-eye lens 24.
  • In the second state, a plurality of second visible illumination light fluxes are emitted from the second region 24b of the fly-eye lens 24.
  • In the third state, a plurality of infrared illumination light fluxes are emitted from the first region 24a and the second region 24b of the fly-eye lens 24.
  • the condenser 30 generates illumination light by superimposing a plurality of illumination light fluxes emitted from the fly-eye lens 24.
  • The condenser 30 causes the illumination light flux emitted from each lens element of the fly-eye lens 24 to illuminate the entire display area of the display unit 14. The plurality of illumination light fluxes emitted from the fly-eye lens 24 therefore overlap one another in the display area of the display unit 14.
  • The condenser 30 is composed of a convex lens.
  • The condenser 30 may instead be composed of a concave mirror.
  • The condenser 30 generates the first illumination light 51a by superimposing the plurality of first visible illumination light fluxes, so that the display unit 14 is illuminated by the first illumination light 51a.
  • The first illumination light 51a is incident on the display unit 14 from the right side of the projection axis 54 along the first projection axis 54L.
  • The condenser 30 generates the second illumination light 51b by superimposing the plurality of second visible illumination light fluxes.
  • The second illumination light 51b is incident on the display unit 14 from the left side of the projection axis 54 along the second projection axis 54R.
  • The condenser 30 generates the third illumination light 51c by superimposing the plurality of infrared illumination light fluxes.
  • The third illumination light 51c is incident on the display unit 14 from both the right side and the left side of the projection axis 54.
  • the field lens 32 is provided after the condenser 30, and adjusts the light distribution of the first illumination light 51a, the second illumination light 51b, and the third illumination light 51c.
  • The field lens 32 adjusts the light distribution of, for example, the first illumination light 51a and the second illumination light 51b so that the range over which the display lights 52L and 52R are incident on the projection mirror 16 corresponds to the range in which the reflecting surface of the projection mirror 16 exists.
  • the light diffusing plate 34 is provided on the light incident side of the display element 36, and is configured to diffuse the first illumination light 51a, the second illumination light 51b, and the third illumination light 51c incident on the display element 36.
  • the light diffusing plate 34 is composed of a transmissive screen such as a microbead film.
  • the light diffusing plate 34 functions like a backlight of the display element 36, and functions so that the display element 36 looks like a self-luminous display.
  • The display element 36 is a transmissive display element such as a liquid crystal panel, and modulates the illumination light incident on each pixel of the display area to generate the display light 52 corresponding to the display content of the image displayed in the display area.
  • The display element 36 may instead be a reflective display element such as a DMD (Digital Micromirror Device) or an LCOS (Liquid Crystal on Silicon) device.
  • The display element 36 is arranged at the intersection of the first projection axis 54L and the second projection axis 54R, so that it can modulate both the first illumination light 51a and the second illumination light 51b.
  • The display element 36 displays the left-eye image 40L for generating the left-eye display light 52L, the right-eye image 40R for generating the right-eye display light 52R, and the illumination image 38 for generating the infrared illumination light 53.
  • the illumination image 38 is an image for illuminating the user 70 with a uniform brightness, and is an image in which the brightness values of all the pixels are the same (for example, the maximum value). Therefore, the illumination image 38 does not include the content to be presented as the virtual image 50.
  • the left-eye image 40L and the right-eye image 40R include contents to be presented as a virtual image 50.
  • The display element 36 modulates the first illumination light 51a to generate the left-eye display light 52L. Since the first illumination light 51a is incident on the display unit 14 from the right side of the projection axis 54, the left-eye display light 52L generated based on the first illumination light 51a is emitted from the display unit 14 diagonally to the left, as indicated by the arrow L. The left-eye display light 52L emitted from the center 14C of the display unit 14 is reflected at a position 16L shifted to the left from the center 16C of the projection mirror 16 and heads toward the left side region 76L of the eyebox 76.
  • The display element 36 modulates the second illumination light 51b to generate the right-eye display light 52R. Since the second illumination light 51b is incident on the display unit 14 from the left side of the projection axis 54, the right-eye display light 52R generated based on the second illumination light 51b is emitted from the display unit 14 diagonally to the right, as indicated by the arrow R. The right-eye display light 52R emitted from the center 14C of the display unit 14 is reflected at a position 16R shifted to the right from the center 16C of the projection mirror 16 and heads toward the right side region 76R of the eyebox 76.
  • The display element 36 transmits the third illumination light 51c to generate the infrared illumination light 53. Since the third illumination light 51c is incident on the display unit 14 from both the right and left sides of the projection axis 54, the infrared illumination light 53 emitted from the display unit 14 is reflected by the entire projection mirror 16 and heads toward the entire eyebox 76. The infrared illumination light 53 is projected along the first projection axis 54L and the second projection axis 54R, and heads toward both the left side region 76L and the right side region 76R of the eyebox 76.
  • the control device 18 includes a binocular position detection unit 42, an image processing unit 44, a light source control unit 46, and a display control unit 48.
  • the binocular position detection unit 42 detects the positions of both eyes 72L and 72R of the user 70 in the height direction based on the captured image of the camera 17.
  • the binocular position detection unit 42 detects which of the left eye 72L and the right eye 72R of the user 70 is on the upper side.
  • The binocular position detection unit 42 detects the angle θ formed by the straight line connecting both eyes 72L and 72R of the user 70 and a reference horizontal line, and detects the difference Δh1 between the positions of both eyes 72L and 72R in the height direction based on the angle θ.
  • The binocular position detection unit 42 may instead directly calculate the difference Δh1 between the positions of both eyes 72L and 72R in the height direction from the image captured by the camera 17, without using the angle θ.
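  • As a hypothetical sketch of this detection step (the patent does not prescribe an algorithm; the landmark detector and the pixel scale are assumptions), Δh1 and the tilt angle θ can be derived from the two eye positions found in the camera image:

        import math

        def eye_height_difference(left_eye_px, right_eye_px, mm_per_pixel):
            """Return (theta_deg, dh1_mm) from eye centers located by any
            face-landmark detector; image y is assumed to grow downward."""
            dx = (right_eye_px[0] - left_eye_px[0]) * mm_per_pixel
            dy = (right_eye_px[1] - left_eye_px[1]) * mm_per_pixel
            theta = math.degrees(math.atan2(dy, dx))  # eye line vs. horizontal
            return theta, abs(dy)                     # dy < 0: right eye higher

        # Example: right eye 2 px higher than the left at 1.5 mm/px -> dh1 = 3 mm.
        print(eye_height_difference((100, 210), (150, 208), 1.5))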
  • the image processing unit 44 generates an image to be displayed on the display unit 14.
  • the image processing unit 44 corrects the display image based on the detection result of the binocular position detection unit 42 to generate the left eye image 40L and the right eye image 40R.
  • The image processing unit 44 performs correction processing so that the positions of the respective display contents of the left-eye image and the right-eye image are relatively shifted in the vertical direction, so as to reduce or cancel the vertical parallax δ caused by the height difference Δh1 between the eyes 72L and 72R detected by the binocular position detection unit 42.
  • When the right eye 72R is located higher than the left eye 72L, the image processing unit 44 corrects the display image so that the position of the display content of the right-eye image 40R is shifted downward relative to that of the left-eye image 40L.
  • Conversely, when the right eye 72R is located lower than the left eye 72L, the image processing unit 44 corrects the display image so that the position of the display content of the right-eye image 40R is shifted upward in the image relative to that of the left-eye image 40L. If the positions of both eyes 72L and 72R in the height direction are the same, the image processing unit 44 generates the display image so that the positions of the display contents of the left-eye image 40L and the right-eye image 40R are the same in the image.
  • The image processing unit 44 may perform the correction processing on both the left-eye image 40L and the right-eye image 40R, or on only one of them. For example, when the left eye 72L is located on the lower side and the right eye 72R on the upper side, the correction processing may be applied to both images by shifting the display content of the left-eye image 40L upward and the display content of the right-eye image 40R downward. Alternatively, only the display content of the left-eye image 40L may be shifted upward while the display content of the right-eye image 40R is not shifted in the vertical direction; conversely, the display content of the left-eye image 40L may be left unshifted while only the display content of the right-eye image 40R is shifted downward.
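  • A minimal sketch of such a correction (added for illustration; splitting the shift evenly between the two images is just one of the options described above):

        import numpy as np

        def shift_vertical(img, rows):
            """Shift image content by `rows` pixels (positive = upward),
            filling the vacated rows with black."""
            out = np.zeros_like(img)
            if rows > 0:
                out[:-rows] = img[rows:]
            elif rows < 0:
                out[-rows:] = img[:rows]
            else:
                out[:] = img
            return out

        def correct_pair(base, dh3_px):
            """Left/right images whose contents are relatively offset by dh3_px
            (positive: left content ends up above right content)."""
            half = dh3_px // 2
            left = shift_vertical(base, half)            # left content up
            right = shift_vertical(base, half - dh3_px)  # right content down
            return left, right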
  • The correction amount (vertical shift amount) Δh3 of the correction processing performed by the image processing unit 44 is set in advance according to the design of the optical system of the head-up display device 10.
  • The image processing unit 44 holds, for example, mathematical formulas or table information associating the difference Δh1 between the positions of the eyes 72L and 72R in the height direction with the correction amount Δh3 of the correction processing, and determines the correction amount Δh3 from the detection result Δh1 of the binocular position detection unit 42 by referring to them.
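  • A minimal sketch of such a table lookup (the breakpoints below are placeholders; in practice they follow from the optical design):

        import numpy as np

        DH1_MM = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])  # eye-height difference
        DH3_PX = np.array([0.0, 1.5, 3.0, 4.5, 6.0, 7.5])  # correction amount

        def correction_amount(dh1_mm):
            """Interpolate the preset table to obtain the shift dh3 (pixels)."""
            return float(np.interp(dh1_mm, DH1_MM, DH3_PX))

        print(correction_amount(2.6))   # -> 3.9 pixels, for example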
  • the light source control unit 46 controls the operation of the light source 20.
  • The light source control unit 46 switches among the first state in which only the first visible light source 21a is turned on, the second state in which only the second visible light source 21b is turned on, and the third state in which only the first infrared light source 22a and the second infrared light source 22b are turned on.
  • the light source control unit 46 alternately switches between the first state and the second state, and inserts the third state between the first state and the second state.
  • the light source control unit 46 switches the operating state of the light source 20 at a speed that cannot be perceived by the human eye.
  • the switching speed of each operating state is, for example, 120 times or more (120 Hz or more) per second.
  • the display control unit 48 controls the operation of the display unit 14.
  • the display control unit 48 generates an image signal for driving the display element 36 so that the left eye image 40L, the right eye image 40R, and the illumination image 38 are switched and displayed on the display element 36.
  • the display control unit 48 switches the image to be displayed on the display element 36 in synchronization with the operation of the light source 20.
  • the display control unit 48 causes the display element 36 to display the image 40L for the left eye when the first illumination light 51a is generated in the first state.
  • the display control unit 48 causes the display element 36 to display the image 40R for the right eye when the second illumination light 51b is generated in the second state.
  • the display control unit 48 causes the display element 36 to display the illumination image 38 when the infrared illumination light 53 is generated in the third state.
  • FIG. 12 is a diagram schematically showing the operation of the light source 20 and the display element 36.
  • In the first period T1, corresponding to the first state in which the left-eye display light 52L is generated, the first visible light source 21a is turned on, the second visible light source 21b, the first infrared light source 22a, and the second infrared light source 22b are turned off, and the left-eye image 40L is displayed on the display element 36.
  • In the second period T2, corresponding to the second state in which the right-eye display light 52R is generated, the second visible light source 21b is turned on, the first visible light source 21a, the first infrared light source 22a, and the second infrared light source 22b are turned off, and the right-eye image 40R is displayed on the display element 36.
  • In the third period T3, corresponding to the third state in which the infrared illumination light 53 is generated, the first infrared light source 22a and the second infrared light source 22b are turned on, the first visible light source 21a and the second visible light source 21b are turned off, and the illumination image 38 is displayed on the display element 36.
  • the third period T3 is inserted between the first period T1 and the second period T2.
  • the operating state is switched in the order of the first period T1, the third period T3, the second period T2, and the third period T3, and the reference period T0 including these four periods is repeated.
  • The reference period T0 is repeated, for example, 30 times or more per second (30 Hz or more).
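  • The time-division sequence can be pictured with a small scheduler sketch (a hypothetical illustration; the timing values follow the text above):

        from dataclasses import dataclass
        from itertools import cycle

        @dataclass
        class State:
            name: str
            visible_1: bool   # first visible light source 21a
            visible_2: bool   # second visible light source 21b
            infrared: bool    # infrared light sources 22a and 22b
            image: str        # image shown on the display element 36

        T1 = State("T1 (left eye)",  True,  False, False, "left-eye image 40L")
        T2 = State("T2 (right eye)", False, True,  False, "right-eye image 40R")
        T3 = State("T3 (infrared)",  False, False, True,  "illumination image 38")

        def run(periods, sub_period_s=1 / 120):
            """Repeat the reference period T0 = [T1, T3, T2, T3]."""
            schedule = cycle([T1, T3, T2, T3])
            for _ in range(4 * periods):
                state = next(schedule)
                # A real controller would drive the LED drivers and the display
                # element in lockstep here; this sketch just prints the plan.
                print(f"{state.name}: {state.image} for {sub_period_s * 1000:.1f} ms")

        run(2)   # two reference periods T0 (>= 30 Hz repetition)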
  • As described above, according to the present embodiment, the difference Δh1 between the height positions of both eyes 72L and 72R of the user 70 is detected using the camera 17, and the vertical positions of the display contents of the right-eye image 40R and the left-eye image 40L are corrected accordingly.
  • As a result, the vertical parallax δ visually recognized by the user 70 can be reduced.
  • the difference in height position between the left eye virtual image 50L and the right eye virtual image 50R can be reduced, making fusion of the left eye virtual image 50L and the right eye virtual image 50R easier. As a result, the visibility of the virtual image 50 viewed with binocular vision can be improved, and visual fatigue and dizziness can be suitably prevented (a sketch of this correction follows below).
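A minimal sketch of the vertical-parallax correction just described, assuming the eye-height difference Δh1 has already been detected from the camera image and converted to display pixels. The function name correct_vertical_parallax and the even split of the shift between the two images are illustrative assumptions, not taken from this specification:

```python
import numpy as np

def correct_vertical_parallax(left_image: np.ndarray,
                              right_image: np.ndarray,
                              delta_h1_px: float) -> tuple[np.ndarray, np.ndarray]:
    """Shift the display contents of the left-eye and right-eye images
    relative to each other in the vertical direction so that the
    left-eye virtual image 50L and the right-eye virtual image 50R
    appear at the same height.  delta_h1_px is the detected eye-height
    difference expressed in display pixels (sign convention assumed)."""
    shift = int(round(delta_h1_px / 2))
    # np.roll wraps rows around; a real implementation would crop or
    # pad at the image borders instead.
    left_corrected = np.roll(left_image, -shift, axis=0)
    right_corrected = np.roll(right_image, shift, axis=0)
    return left_corrected, right_corrected

# Usage with dummy 8-bit images:
left = np.zeros((480, 800), dtype=np.uint8)
right = np.zeros((480, 800), dtype=np.uint8)
left_c, right_c = correct_vertical_parallax(left, right, delta_h1_px=6.0)
```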
  • since the infrared illumination light 53 for illuminating the user 70 is generated with the same optical system used to generate the display light 52, adjusting the optical system so that the user 70 can visually recognize the virtual image 50 also ensures that the infrared illumination light 53 is reliably projected onto the face of the user 70. Further, because the infrared illumination light 53 shares the optical system for generating the display light 52, illumination light with a uniform illuminance distribution can be projected onto the face of the user 70, and the face of the user 70 can be imaged more appropriately.
  • since no visible light is generated in the third period T3, the moving image of the virtual image 50 is displayed as if a black image were inserted between frames, so that "moving image blurring" caused by the response characteristics of the display element is suppressed and the virtual image 50 can be presented with excellent visibility.
  • the configuration in which the head-up display device 10 includes the camera 17 is shown.
  • the head-up display device 10 does not have to include the camera 17; the positions of both eyes 72L and 72R of the user 70 may be detected using an image captured by any camera capable of imaging the face of the user 70.
  • the head-up display device 10 may be configured not to generate the infrared illumination light 53 for illuminating the user 70, in which case the first infrared light source 22a and the second infrared light source 22b may be omitted.
  • in that case, the user 70 may be illuminated by an infrared illumination light source separate from the head-up display device 10.
  • a driver monitor device separate from the head-up display device 10 may include an infrared camera and an infrared illumination device, and the head-up display device 10 may acquire the image of the user 70 captured by the driver monitor device.
  • FIG. 13 is a diagram schematically showing the configuration of the head-up display device 210 according to the second embodiment.
  • the second embodiment is configured to project infrared illumination light onto the face of the user 70, as in the first embodiment.
  • the second embodiment differs from the first embodiment in that display light for the right eye and display light for the left eye are not generated separately; instead, display light common to both eyes is generated.
  • the second embodiment will be described focusing on the differences from the first embodiment described above.
  • the head-up display device 210 includes a projection unit 212, a projection mirror 16, a camera 17, and a control device 218.
  • the head-up display device 210 is connected to the external device 64.
  • the projection mirror 16, the camera 17, and the external device 64 are configured in the same manner as in the first embodiment described above.
  • the projection unit 212 generates the projected light 252.
  • the projection unit 212 generates display light composed of visible light and illumination light composed of infrared light as the projection light 252.
  • the projection unit 212 projects the projected light 252 toward the projection mirror 16.
  • the projected light 252 emitted from the projection unit 212 is reflected first by the projection mirror 16 and then by the windshield 62, after which it travels toward the eye box 76 where the eyes 72 of the user 70 are located.
  • the projection unit 212 projects the display light composed of visible light and the illumination light composed of infrared light along the same projection axis 54.
  • the control device 218 generates a display image and controls the operation of the projection unit 212 so that the virtual image 250 corresponding to the display image is presented.
  • the control device 218 is connected to the external device 64, and for example, generates a display image based on the information from the external device 64.
  • the control device 218 controls the operation of the projection unit 212 so that the illumination light of infrared light is projected toward the face of the user 70.
  • the control device 218 acquires an image captured by the camera 17, and monitors the state of the user 70 by analyzing the acquired image.
  • FIG. 14 is a diagram showing in detail the configuration of the head-up display device 210 according to the second embodiment.
  • FIG. 14 shows in detail the optical configuration of the projection unit 212 and the functional configuration of the control device 218.
  • the projection unit 212 includes a light source 220, a light tunnel 226, a Fresnel lens 228, a light diffusing plate 230, and a display element 232.
  • the control device 218 includes an image processing unit 242, a light source control unit 244, and a display control unit 246.
  • the light source 220 has visible light sources 222a and 222b (collectively referred to as the visible light source 222) and an infrared light source 224.
  • the visible light source 222 and the infrared light source 224 are composed of semiconductor light emitting elements such as LEDs.
  • the visible light source 222 outputs white light for generating display light.
  • the infrared light source 224 outputs infrared light for generating illumination light.
  • the emission wavelength of the infrared light source 224 is, for example, 850 nm or 940 nm, but is not limited thereto.
  • the light tunnel 226, the Fresnel lens 228, and the light diffusing plate 230 are optical elements that adjust the illuminance distribution and light distribution of visible and infrared light output from the light source 220.
  • the light tunnel 226 has four reflecting surfaces arranged around the projection axis 54, and has a rectangular cross section orthogonal to the projection axis 54.
  • the light tunnel 226 has a tapered structure in which the opening size increases from the light incident end to the light emitting end.
  • the Fresnel lens 228 is configured as a concave lens and adjusts the direction of the principal ray of the projected light 252 emitted from the display element 232.
  • the light diffusing plate 230 is composed of a transmissive screen such as a microbead film.
  • the light diffusing plate 230 functions like a backlight of the display element 232, so that the display element 232 looks like a self-luminous display.
  • the display element 232 modulates visible light and infrared light to generate projected light 252.
  • the display element 232 is a transmissive display element such as a liquid crystal panel; by modulating the visible light and infrared light incident on each pixel of its display area, it generates the projected light 252 corresponding to the display content of the image displayed in the display area.
  • the projection unit 212 may instead use a reflective display element such as a DMD (Digital Micromirror Device) or LCOS (Liquid Crystal on Silicon) as the display element 232 (a toy model of the per-pixel modulation follows below).
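The per-pixel modulation performed by the display element can be pictured as an element-wise product of the incident light with the transmittance derived from the displayed image. The following toy model is an illustrative assumption only; it is not how the specification defines the display element:

```python
import numpy as np

def modulate(incident_light: np.ndarray, displayed_image: np.ndarray) -> np.ndarray:
    """Toy model of a transmissive display element: each pixel attenuates
    the incident light (visible or infrared) in proportion to the
    displayed 8-bit image, yielding the projected light 252."""
    transmittance = displayed_image.astype(np.float64) / 255.0  # map to [0, 1]
    return incident_light * transmittance
```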
  • the image processing unit 242 generates an image to be displayed on the display element 232.
  • the image processing unit 242 generates a display image for generating display light and an illumination image for generating illumination light.
  • FIG. 15A shows an example of the display image 236.
  • the display image 236 is an image including the content to be presented as the virtual image 250, and is generated based on the information acquired from the external device 64.
  • FIG. 15B shows an example of the illumination image 238.
  • the illumination image 238 is an image for illuminating the user 70 with uniform brightness; the brightness values of all its pixels are the same (for example, the maximum value). Therefore, the illumination image 238 does not include the content to be presented as the virtual image 250 (a one-line sketch follows below).
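Because the illumination image 238 simply sets every pixel to the same brightness, generating it is a one-liner; the panel resolution below is an assumed value, since the specification does not state one:

```python
import numpy as np

HEIGHT, WIDTH = 480, 800  # assumed panel resolution

# Uniform maximum brightness, no display content: the illumination image 238.
illumination_image_238 = np.full((HEIGHT, WIDTH), 255, dtype=np.uint8)
```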
  • the light source control unit 244 controls the operation of the light source 220.
  • the light source control unit 244 alternately lights the visible light source 222 and the infrared light source 224 so that display light based on visible light and illumination light based on infrared light are generated alternately.
  • the light source control unit 244 switches between the lit state and the non-lit state of the visible light source 222 and the infrared light source 224 at a speed at which the human eye cannot perceive the switching between the display light and the illumination light.
  • the switching speed between the lit state and the non-lit state is, for example, 60 times or more per second (60 Hz or more).
  • the display control unit 246 controls the operation of the display element 232.
  • the display control unit 246 generates an image signal for driving the display element 232 so that the display image 236 and the illumination image 238 are alternately displayed on the display element 232.
  • the display control unit 246 switches the display of the display element 232 in synchronization with the operation of the light source 220.
  • the display control unit 246 causes the display element 232 to display the display image 236 while the visible light source 222 is lit and visible light is being generated.
  • the display control unit 246 causes the display element 232 to display the illumination image 238 while the infrared light source 224 is lit and infrared light is being generated. As a result, the display light of visible light and the illumination light of infrared light are alternately projected as the projected light 252.
  • FIG. 16 is a diagram schematically showing the operation of the light source 220 and the display element 232.
  • in the first period T21, the timing for generating the display light of visible light, the visible light source 222 is turned on, the infrared light source 224 is turned off, and the display image 236 is displayed on the display element 232.
  • in the second period T22, the timing for generating the illumination light of infrared light, the visible light source 222 is turned off, the infrared light source 224 is turned on, and the illumination image 238 is displayed on the display element 232.
  • the first period T21 and the second period T22 are repeated alternately; for example, they are switched 60 times or more per second (60 Hz or more), as sketched below.
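The two-phase synchronization of the second embodiment can be sketched in the same style as the first embodiment's schedule; again, all names are illustrative assumptions, not identifiers from this specification:

```python
from itertools import cycle

def run(switch_hz: int = 60, n_steps: int = 6) -> None:
    """Alternate the first period T21 (visible display light) and the
    second period T22 (infrared illumination light) at >= 60 Hz, with
    the light source 220 and the display element 232 switched in sync."""
    schedule = cycle([
        ("T21", "visible light source 222 on", "display image 236"),
        ("T22", "infrared light source 224 on", "illumination image 238"),
    ])
    dt = 1.0 / switch_hz
    for i in range(n_steps):
        period, light, image = next(schedule)
        print(f"t={i * dt:.4f}s  {period}: {light}, display: {image}")

if __name__ == "__main__":
    run()
```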
  • the virtual image 250 can thus be presented to the user 70 while the user 70 is imaged, so that the device also functions as a driver monitor. Since the illumination light of infrared light for illuminating the user 70 is generated with the same optical system used to generate the display light of visible light, adjusting the optical system so that the user 70 can visually recognize the virtual image 250 also ensures that the illumination light is reliably projected onto the face of the user 70. Further, because the illumination light is generated using the optical system for generating the display light, illumination light with a uniform illuminance distribution can be projected onto the face of the user 70, and the face of the user 70 can be imaged more appropriately.
  • since a black image is effectively inserted in the second period T22, "moving image blurring", in which a moving image appears blurry because the display image cannot be switched instantly between adjacent frames due to the response characteristics of the display element 232, can be suppressed, and the virtual image 250 can be presented with excellent visibility.
  • FIG. 17 is a diagram schematically showing the configuration of the head-up display device 310 according to the third embodiment.
  • the third embodiment differs from the first and second embodiments in the arrangement of the camera 317 and in that an imaging axis 356 that is folded back by the windshield 62 is used.
  • the region 86 where the imaging axis 356 is folded back on the windshield 62 is located below the region 84 where the projection axis 54 is folded back on the windshield 62.
  • the camera 317 is arranged behind the projection mirror 16. Also in this embodiment, the same effects as those of the above-described embodiments can be obtained.
  • alternatively, the imaging axes 56 and 356 of the cameras 17 and 317 may be located above the projection axis 54 so as to image the face of the user 70 from above or from the front. In this case, the cameras 17 and 317 may be aimed at the user 70 or at the windshield 62. Even in this case, the face of the user 70 can be imaged more appropriately by shifting the imaging axes 56 and 356 of the cameras 17 and 317 away from the projection axis 54.
  • Aspect 1: a head-up display device comprising: a projection unit that projects display light and illumination light along a projection axis that is reflected by the windshield toward the user, and that presents a virtual image based on the display light to the user; and a camera that images the user, onto whom the illumination light is projected, along an imaging axis different from the projection axis.
  • Aspect 2: the head-up display device according to Aspect 1, wherein the camera is arranged so that the imaging axis is located below the projection axis in the range from the windshield to the user, with reference to the vertical direction of the user's face.
  • Aspect 3: the head-up display device according to Aspect 1 or 2, further comprising a projection mirror that reflects the display light and the illumination light emitted from the projection unit toward the windshield.
  • Aspect 4: the head-up display device according to any one of Aspects 1 to 3, wherein the projection unit includes a light source having a visible light source and an infrared light source, and a display element that modulates the visible light and infrared light output from the light source to generate the display light and the illumination light; and the head-up display device further comprises a light source control unit that alternately lights the visible light source and the infrared light source, and a display control unit that displays a display image on the display element when the visible light source is lit and displays an illumination image on the display element when the infrared light source is lit.
  • the vertical parallax of the virtual image visually recognized by the user can be reduced.
  • the face of a user using a head-up display can be appropriately imaged.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)

Abstract

The present invention relates to a head-up display device (10) comprising: a display unit (14) that modulates a first illumination light to generate left-eye display light and modulates a second illumination light to generate right-eye display light; a camera (17) that captures images of the two eyes of a user (70) positioned in an eye box (76); a binocular position detection unit that detects the difference in the height-direction positions of the two eyes of the user (70) on the basis of the images captured by the camera (17); an image processing unit that generates a left-eye image and a right-eye image corrected so that their display positions are shifted relative to each other in the vertical direction according to the difference in the height-direction positions of the two eyes of the user (70) detected by the binocular position detection unit; and a display control unit that displays the left-eye image on the display unit when the first illumination light is generated and displays the right-eye image on the display unit when the second illumination light is generated.
PCT/JP2020/024947 2019-07-17 2020-06-25 Head-up display device WO2021010123A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2019-131604 2019-07-17
JP2019131604A JP2021017064A (ja) 2019-07-17 2019-07-17 Head-up display device
JP2019-138707 2019-07-29
JP2019138707A JP2021022851A (ja) 2019-07-29 2019-07-29 Head-up display device

Publications (1)

Publication Number Publication Date
WO2021010123A1 true WO2021010123A1 (fr) 2021-01-21

Family

ID=74210685

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/024947 WO2021010123A1 (fr) 2019-07-17 2020-06-25 Dispositif d'affichage tête haute

Country Status (1)

Country Link
WO (1) WO2021010123A1 (fr)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010191404A * (ja) 2009-01-22 2010-09-02 Nippon Seiki Co Ltd Vehicle display device
WO2011036827A1 * (fr) 2009-09-28 2011-03-31 Panasonic Corporation 3D image display device and 3D image display method
WO2012153447A1 * (fr) 2011-05-11 2012-11-15 Panasonic Corporation Image processing device, video image processing method, program, and integrated circuit
JP2014010418A * (ja) 2012-07-03 2014-01-20 Yazaki Corp Stereoscopic display device and stereoscopic display method
JP2015215505A * (ja) 2014-05-12 2015-12-03 Panasonic IP Management Co., Ltd. Display device and display method
JP2019015823A * (ja) 2017-07-05 2019-01-31 Kyocera Corporation Three-dimensional projection device, three-dimensional projection system, and moving body

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI833272B (zh) * 2022-07-06 2024-02-21 賴俊賢 Head-up display box

Similar Documents

Publication Publication Date Title
JP6175589B2 (ja) Projection display device and projection display method
US9269193B2 (en) Head-mount type display device
US10895743B2 (en) Display apparatus for superimposing a virtual image into the field of vision of a user
US8684531B2 (en) Stereoscopic display device projecting parallax image and adjusting amount of parallax
JP7003925B2 (ja) Reflector, information display device, and moving body
US9580015B2 (en) Image display device
JP2010072455A (ja) In-vehicle display device and display method
WO2020261830A1 (fr) Head-up display device
JP2015146012A (ja) Virtual image display system
JP2017026675A (ja) Head-up display
JP2018203245A (ja) Display system, electronic mirror system, and moving body
WO2021010123A1 (fr) Head-up display device
WO2018124299A1 (fr) Virtual image display device and method therefor
US8746889B2 (en) Auto-variable perspective autostereoscopic 3D display
JP2021022851A (ja) Head-up display device
JP2000214408A (ja) Image display device
JP7205704B2 (ja) Head-up display device
WO2020031549A1 (fr) Virtual image display device
JP7111071B2 (ja) Head-up display device
JP7111070B2 (ja) Head-up display device
WO2023228771A1 (fr) Image display device, vehicle, and image display method
WO2024090297A1 (fr) Head-up display system
JP2021017064A (ja) Head-up display device
JP2006113476A (ja) Video display system
JP2020064126A (ja) Display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20840383

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20840383

Country of ref document: EP

Kind code of ref document: A1