WO2023104534A1 - Optical system of augmented reality head-up display - Google Patents

Optical system of augmented reality head-up display

Info

Publication number
WO2023104534A1
WO2023104534A1 (PCT/EP2022/083132)
Authority
WO
WIPO (PCT)
Prior art keywords
optical
virtual image
optical system
combiner
optical element
Prior art date
Application number
PCT/EP2022/083132
Other languages
English (en)
Inventor
Andrey Mikhailovich Belkin
Vitaly Andreevich PONOMAREV
Anton Alekseevich SHEHERBINA
Mikhail Aleksandrovich SVARYCHEUSKI
Kseniia Igorevna Lvova
Original Assignee
Wayray Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/546,388 (US20230161158A1)
Application filed by Wayray Ag filed Critical Wayray Ag
Publication of WO2023104534A1

Links

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0127Head-up displays characterised by optical features comprising devices increasing the depth of field
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0132Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0185Displaying image at variable distance
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/40Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images giving the observer of a single two-dimensional [2D] image a perception of depth

Definitions

  • Optical systems of see-through head-up displays provide the ability to present information and graphics to an observer without requiring the observer to look away from a given viewpoint or otherwise refocus his or her eyes.
  • the observer views an external scene through a combiner.
  • the combiner allows light from the external scene to pass through while also redirecting an image artificially generated by a projector so that the observer can see both the external light as well as the projected image at the same time.
  • the projected image can include one or more virtual objects that augment the observer’s view of the external scene, which is also referred to as augmented reality (AR).
  • FIG. 1 is a block diagram of an AR display environment including an optical system, in accordance with an embodiment of the present disclosure.
  • FIG. 2 is a schematic diagram of the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 3 is a top view of a virtual image surface inclined in the direction of a horizontal field of view and produced by the optical system of FIGS. 1 and 2, in accordance with an embodiment of the present disclosure.
  • FIGS. 4A and 4B are schematic diagrams of a correcting optical unit of the optical system of FIG. 1, in accordance with an example of the present disclosure.
  • FIG. 5 is a front view and a top view of a virtual image surface inclined in the direction of a horizontal field of view and produced by the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 6 shows an example transformation of a local coordinate system in spatial relation to components of the optical system of FIG. 1, in accordance with an embodiment of the present disclosure.
  • FIG. 7 shows estimations of a stereoscopic depth of field for the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 8 shows estimations of a usable size of a virtual object, which is not perceived as inclined, generated by the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIGS. 9A and 9B show example design parameters for the optical system of FIGS. 1, 4A and 4B, in accordance with an embodiment of the present disclosure.
  • FIGS. 10A and 10B show example holographic optical element (HOE) recording parameters for the combiner of the optical system of FIG. 1 including the correcting optical unit of FIGS. 4A and 4B, in accordance with an embodiment of the present disclosure.
  • FIGS. 11A and 11B are schematic diagrams of a correcting optical unit of the optical system of FIG. 1, in accordance with another example of the present disclosure.
  • FIG. 12A shows a surface sag of an example optical element surface in the direction of a vertical field of view of the optical system of FIG. 1 including the correcting optical unit of FIGS. 11A and 11B, in accordance with an embodiment of the present disclosure.
  • FIG. 12B shows a surface sag of an example optical element surface in the direction of a horizontal field of view of the optical system of FIG. 1 including the correcting optical unit of FIGS. 11A and 11B, in accordance with an embodiment of the present disclosure.
  • FIG. 13 shows an example total shape of an asymmetrical freeform surface of an example optical element of the optical system of FIG. 1 including the correcting optical unit of FIGS. 11A and 11B, in accordance with an embodiment of the present disclosure.
  • FIGS. 14A and 14B are schematic diagrams of a correcting optical unit of the optical system of FIG. 1, in accordance with yet another example of the present disclosure.
  • FIG. 15A shows a surface sag of an example optical element surface in the direction of a vertical field of view of the optical system of FIG. 1 including the correcting optical unit of FIGS. 14A and 14B, in accordance with an embodiment of the present disclosure.
  • FIG. 15B shows a surface sag of an example optical element surface in the direction of a horizontal field of view of the optical system of FIG. 1 including the correcting optical unit of FIGS. 14A and 14B, in accordance with an embodiment of the present disclosure.
  • FIG. 16 shows an example total shape of an asymmetrical freeform surface of an example optical element of the optical system of FIG. 1 including the correcting optical unit of FIGS. 14A and 14B, in accordance with an embodiment of the present disclosure.
  • FIG. 17 shows deviation from a best fit sphere of an example optical element surface of the optical system of FIG. 1 including the correcting optical unit of FIGS. 14A and 14B, in accordance with an embodiment of the present disclosure.
  • FIG. 18 shows stereoscopic depth of field with respect to a horizontal field of view in the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 19 shows three-dimensional virtual image sizes and positions within a horizontal field of view for several scenes in a field of view of the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 20 shows three-dimensional virtual image sizes and positions within horizontal and vertical fields of view for several scenes in a field of view of the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 21 is a chart showing a combiner focal length versus a volume of an optical system for each of several virtual image distances of the optical system of FIG. 1, in accordance with some embodiments of the present disclosure.
  • FIG. 22 shows a horizontal field of view for a side-view optical system, in accordance with an embodiment of the present disclosure.
  • FIG. 23 shows an example stereoscopic depth of field relative to angles of inclination of a virtual image surface.
  • FIGS. 24A-B show example scenes of a three-dimensional virtual image size and depth from an observer for the angles of inclination of the virtual image surface in FIG. 23.
  • FIG. 25 shows an example stereoscopic depth of field relative to angles of field of view of a virtual image surface.
  • FIGS. 26A-B show example scenes of a three-dimensional virtual image size and depth from an observer for the fields of view of the virtual image surface in FIG. 25.
  • FIG. 27 shows a limit of a vertical field of view with respect to the size of a combiner.
  • An optical system in accordance with an example of the present disclosure is compact and can produce augmented reality in a head-up display with a relatively high stereoscopic depth of field, overcoming limitations on the usable size of virtual objects that do not exceed the stereo-threshold and therefore are not perceived as inclined by an observer.
  • the optical system includes a picture generation unit, a correcting optical unit, and a combiner.
  • the correcting optical unit is configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit.
  • the combiner is configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box.
  • the optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from an observer, such that a virtual image on a first side of the virtual image surface appears closer to the eye box than a virtual image on a second side of the virtual image surface.
  • the virtual image surface has a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, and the second axis being parallel to a line of sight and extending through an arbitrary intersection point on the virtual image surface.
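  • As a minimal numerical sketch of this definition (the vectors, the 72° example value, and the function name below are illustrative assumptions, not values taken from the claims), the non-zero angle can be computed by projecting the surface normal and the line of sight onto the horizontal plane and measuring the angle between the projections:

```python
import numpy as np

def horizontal_inclination_angle(surface_normal, line_of_sight):
    """Angle between the horizontal-plane (XZ) projections of the
    virtual-image-surface normal and the line of sight, in degrees.
    The Y component is dropped to project onto the horizontal plane."""
    proj = lambda v: np.array([v[0], 0.0, v[2]])   # project onto XZ plane
    n = proj(np.asarray(surface_normal, float))
    s = proj(np.asarray(line_of_sight, float))
    cos_a = np.dot(n, s) / (np.linalg.norm(n) * np.linalg.norm(s))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example: line of sight along +Z, surface normal rotated 72 degrees about
# the vertical (Y) axis -- a horizontally inclined virtual image surface.
los = [0.0, 0.0, 1.0]
normal = [np.sin(np.radians(72)), 0.0, np.cos(np.radians(72))]
print(horizontal_inclination_angle(normal, los))   # ~72.0
```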
  • AR HUD: augmented reality head-up display.
  • an AR HUD can be arranged such that an observer perceives the virtual objects at different distances along a virtual image surface, which provides a sense of depth of those objects within the augmented reality scene.
  • virtual image surface refers to an imaginary surface (planar or non-planar) upon which virtual objects and other virtual images appear to lie when viewed from the eye box inside the field of view area.
  • the visual ergonomics of AR HUD improve as the stereoscopic depth of field increases. For example, a large stereoscopic depth of field increases the number of virtual objects that can simultaneously appear to be at different distances in front of an observer.
  • a standard AR HUD achieves stereoscopic depth of field by inclining a virtual image surface with respect to a road or ground surface (the inclination of the virtual image surface in a direction of a vertical field of view).
  • virtual objects displayed in a lower portion of the field of view appear to be closer to the observer than virtual objects in the upper portion of the field of view.
  • some existing AR HUDs have a relatively small stereoscopic depth of field due, for example, to structural limitations of the HUD on the maximum size of the vertical field of view (FoV) and the spatial orientation of the virtual image surface.
  • a structural limitation on the maximum size of the vertical FoV of the HUD relates to a combiner size.
  • the combiner inclination angle is more than 60°, so the combiner size is at least twice as large as the combiner’s projection on the vertical plane. Consequently, increasing the vertical field of view increases the combiner size and the numerical aperture, and hence rapidly increases aberrations (especially astigmatism).
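  • A simple geometric sketch of this constraint (the symbols β for the combiner tilt from the vertical plane, d for the eye relief, and H for the sizes are introduced here for illustration only):

```latex
H_{\mathrm{combiner}} \;=\; \frac{H_{\mathrm{proj}}}{\cos\beta} \;\ge\; 2\,H_{\mathrm{proj}} \quad (\beta \ge 60^{\circ}),
\qquad
\mathrm{FoV}_{V} \;\approx\; 2\arctan\!\left(\frac{H_{\mathrm{proj}}}{2d}\right),
```

  • so enlarging the vertical field of view at a fixed eye relief requires a proportionally larger combiner aperture, with the attendant growth in numerical aperture and aberrations described above.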
  • Such limitations, as well as limitations of human binocular vision, can also limit the maximum usable size of virtual objects that are not perceived as inclined. For instance, virtual image surfaces inclined in the direction of the vertical field of view enable only a limited stereoscopic depth of field due to the limited size of a vertical field of view of the HUD.
  • the stereoscopic depth of field can be increased by increasing the inclination angle of the virtual image surface.
  • increasing the inclination angle relative to the direction of the vertical field of view reduces the usable size and/or height of the virtual objects (which are not perceived as inclined) displayed on the virtual image surface inclined in the direction of the vertical field of view. This decrease in the usable size of the virtual objects restricts an improvement of the stereoscopic depth of field in existing HUDs.
  • an optical system is provided that is relatively compact and can produce augmented reality in a head-up display with a relatively high stereoscopic depth of field, overcoming the limitations on the usable size of the virtual objects (which are not perceived as inclined) appearing in the field of view area.
  • An example optical system includes a picture generation unit, a correcting optical unit, and a combiner.
  • the correcting optical unit is configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit.
  • the combiner is configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable at the eye box.
  • the optical system thus provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the observer such that one or more virtual images on a first side of the virtual image surface appear closer to the eye box than one or more virtual images on a second side of the virtual image surface.
  • the correcting optical unit includes a specific combination of optical elements.
  • the inclination of the virtual image surface can be achieved by inclining a lens through which the optical image passes, by using an optical surface with an asymmetrical shape forming a wedge with adjacent optical surfaces, or by using a combination of an inclined lens and an optical surface with an asymmetrical shape.
  • the combiner includes a holographic optical element with positive optical power, which in combination with the correcting optical unit further increases the stereoscopic depth of field.
  • FIG. 1 is a block diagram of an augmented reality display environment 100, in accordance with an example of the present disclosure.
  • the environment 100 includes a vehicle 102 with an optical system (such as a head-up display or HUD) 104.
  • the vehicle 102 can be any type of vehicle (e.g., passenger vehicle such as a car, truck, or limousine; a boat; a plane).
  • at least a portion of the optical system 104 is mounted in the passenger vehicle 102 between (or within) a windshield 102a and a driver, although other examples may include a second such optical system 104 mounted between a side window and a passenger as will be discussed in turn.
  • the optical system 104 is configured to generate a virtual image 108 that is visible from an eye box 106 of the driver.
  • the eye box 106 is an area or location within which the virtual image 108 can be seen by either or both eyes of the driver, and thus the driver’s head occupies or is adjacent to at least a portion of the eye box 106 during operation of the vehicle 102.
  • the virtual image 108 includes one or more virtual objects, symbols, characters, or other elements that are optically located ahead of the vehicle 102 such that the virtual image 108 appears to be at a nonzero distance (up to perceptible infinity) away from the optical system 104 (e.g., ahead of the vehicle 102).
  • Such a virtual image 108 is also referred to as augmented reality when combined with light from a real-world environment, such as the area ahead of the vehicle 102.
  • the optical system 104 produces an inclined virtual image surface that is non-perpendicular to a line of sight through the system 104 such that virtual objects on the left side of the virtual image 108 are displayed closer to a viewer than virtual objects on the right side of the virtual image 108, or such that virtual objects on the right side of the field of view of the system 104 are displayed closer to a viewer than virtual objects on the left side of the field of view of the system 104, depending on the angle of inclination of the virtual image surface in the direction of the horizontal field of view.
  • the line of sight is a line extending from the center of the eye box area 106 into the center of the field of view area of the optical system 104.
  • the optical system 104 is designed to occupy a relatively small and compact area (by volume) so as to be easily integrated into the structure of the vehicle 102.
  • Several examples of the optical system 104 are described below with respect to FIGS. 2, 4A-B, 11A-B, and 14A-B.
  • FIG. 2 is a schematic diagram of the optical system 104 of FIG. 1, in accordance with an example of the present disclosure.
  • the optical system 104 can be implemented as at least a portion of the environment 100 of FIG. 1.
  • the optical system 104 includes a picture generation unit (PGU) 202, a correcting optical unit 204, and a combiner 206.
  • the PGU 202 can include, in some examples, a digital micromirror device (DMD) projector, a liquid crystal on silicon (LCoS) projector, a liquid-crystal display (LCD) with laser illumination projector, a micro-electro-mechanical system (MEMS) projector with one dual-axis scanning mirror or with two single-axis scanning mirrors, an array of semiconductor lasers, a thin-film-transistor liquid-crystal display (TFT LCD), an organic light emitting diode (OLED), an Active Matrix Organic Light Emitting Diode (AMOLED), or other suitable illumination device.
  • the PGU 202 can further include a diffusing element or a microlens array.
  • the PGU 202 is configured to generate and output an optical image 210, represented in FIG. 2 by a ray of light.
  • the optical image 210 may include, for instance, one or more features (such as symbols, characters, or other elements) to be projected into, or to otherwise augment, external light 214 from a real-world scene viewable through the combiner 206.
  • the combiner 206 includes a holographic optical element (HOE) with a positive optical power, which can be placed on the inside surface of a windshield of the vehicle 102 or integrated into the windshield in a process of triplex production.
  • the optical system 104 is arranged such that the optical image 210 output by the PGU 202 passes through the correcting optical unit 204, which produces one or more modified optical images 212.
  • the modified optical images 212 are directed to the combiner 206, which redirects them toward an eye box 208 outside of the optical system 104.
  • the combiner 206 is further configured to permit at least some of the external light 214 to pass through the combiner 206 and combine with the redirected optical image to produce an augmented reality scene 216 visible from the eye box 208.
  • the augmented reality scene 216 includes an augmented reality display of the virtual image 108 of FIG. 1, where at least some objects in the virtual image 108 are perceived by the driver or observer to be located on a virtual image surface that is inclined horizontally with respect to the field of view of the AR HUD optical system.
  • the combination of the PGU 202, the correcting optical unit 204, and the combiner 206 are arranged such that one or more virtual objects 302 displayed on a right side 304 of a field of view (FoV) 306 of a horizontally inclined virtual image surface 310 appear to be closer to the eye box 208 than one or more virtual objects 308 displayed on a left side 312 of the FoV 306 of the horizontally inclined virtual image surface 310, where the right and left sides 304, 312 are defined with respect to the horizontal FoV 306 from the eye box 208.
  • the horizontal plane (XZ) is defined with respect to a local gravity direction, where the horizontal plane approximates the surface of the earth, or with respect to a vehicle, such as a motor vehicle, a vessel, or an aircraft.
  • a stereoscopic depth of field is defined as the range of distances of the horizontally inclined virtual image surface 310 within the horizontal field of view 306. For example, the greater the inclination of the virtual image surface 310 in the direction of the horizontal field of view, the greater the stereoscopic depth of field 314.
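  • As a hedged geometric sketch of this relationship (the planar-surface model and the values L0 = 10 m, α = 72°, and a ±5° horizontal half field below are illustrative, not design values from this disclosure), a simple ray-plane intersection gives the viewing distance at each horizontal field angle φ:

```python
import numpy as np

def distance_along_view(L0, alpha_deg, phi_deg):
    """Distance from the eye box to a planar virtual image surface that
    crosses the line of sight at distance L0 and whose normal, projected
    onto the horizontal plane, makes angle alpha with the line of sight,
    evaluated at horizontal field angle phi (ray/plane intersection)."""
    a, p = np.radians(alpha_deg), np.radians(phi_deg)
    return L0 * np.cos(a) / np.cos(a - p)

L0, alpha, half_fov = 10.0, 72.0, 5.0      # illustrative values only
L_near = distance_along_view(L0, alpha,  half_fov)
L_far  = distance_along_view(L0, alpha, -half_fov)
print(L_near, L_far)   # ~7.9 m and ~13.7 m; a larger alpha widens the spread
```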
  • Another technique to improve stereoscopic depth of field without an influence on the usable size of the virtual object is to increase the field of view.
  • a twofold increase in the field of view from 6° to 12° leads to an increase in the stereoscopic depth of field from 5.8 meters to 15.3 meters while keeping the usable size of the virtual object unchanged, such as shown in FIGS. 26A-B, respectively.
  • an AR HUD has vertical field of view limitations.
  • the combiner inclination angle is more than 60°, so the combiner size is at least twice as large as the combiner’s projection on the vertical plane, such as shown in FIG. 27.
  • the virtual image surface 310 is an imaginary surface upon which virtual objects projected from the PGU 202 appear to lie. It will be understood that the virtual image surface 310 can be inclined such as shown in FIG. 3.
  • the virtual image surface 310 can be inclined such that the one or more virtual objects 302 displayed on the left side 312 of the field of view (FoV) 306 appear to be closer to the eye box 208 than the one or more virtual objects 308 displayed on the right side 304 of the FoV 306.
  • FIGS. 4 A and 4B are schematic diagrams of a correcting optical unit 400, in accordance with an example of the present disclosure.
  • the correcting optical unit 400 can be implemented as at least part of the correcting optical unit 204 of FIG. 2.
  • the correcting optical unit 400 includes a telecentric lens 402, an optical element 404 with a cylindrical surface and an aspherical surface, a cylindrical mirror 406, and an output lens 408 with an aspherical surface and a spherical surface.
  • a telecentric lens is a compound lens that has its entrance or exit pupil at infinity, where the chief rays are parallel or substantially parallel to the local optical axis in front of or behind the lens, respectively.
  • the optical element 404 is inclined at an angle θ with respect to a local optical axis 410 in a direction 412 of a horizontal FoV.
  • the optical element 404 can be inclined toward a first side of the horizontal FoV at an angle θ, such as shown in FIG. 4B, or toward a second side of the horizontal FoV.
  • a diffuser 414 can be included as part of the correcting optical unit 400 or the PGU 202.
  • the optical system 104 operates in monochromatic mode at a wavelength of 532 nm. Referring to FIGS. 2, 4A and 4B, the optical system 104 with the correcting optical unit 400 operates as follows.
  • the PGU 202 projects the optical image 210 onto the diffuser 414.
  • Rays of light from the diffuser 414 propagate through the correcting optical unit 400, where the inclined optical element 404 (e.g., inclined with respect to the local coordinate axes FoVx, FoVy, z) creates the optical path length monotonic variation in the direction of the horizontal field of view 412, producing the modified optical images 212.
  • the modified optical images 212 rays reach the combiner 206, which redirects the modified optical images 212 toward the eye box 208.
  • An inclined virtual image surface, such as shown in FIG. 3, is thereby produced.
  • FIG. 6 shows a transformation of a local coordinate system (FoV x , FoV y , Z) in spatial relation to components of the optical system 104 of FIG. 1, including the PGU 202, the correcting optical unit 204, and the combiner 206, in accordance with an example of the present disclosure.
  • the direction in which the correcting optical unit 400 creates an optical path length monotonic variation relates to a local coordinate system (FoVx, FoVy, Z) for defining the FoV, where the Z- axis coincides with the local optical axis defined as the chief ray passing through the center of the field of view area (FoV x , FoV y ).
  • estimations of a stereoscopic depth of field for the optical system 104 are as follows.
  • the stereoscopic depth of field in a linear measure is the difference between the distance from the observer to the nearest point, L_near, and the distance from the observer to the farthest point, L_far.
  • the stereoscopic depth of field in an angular measure is the angle Γ in milliradians (mrad), defined as the difference between the angles ω and θ at which the eyes converge on the nearest point and on the farthest point from the viewer.
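  • The two measures can be related through standard binocular-parallax geometry (a sketch assuming an interocular base b, e.g. about 65 mm, which is not specified in this disclosure):

```latex
\Delta L = L_{\mathrm{far}} - L_{\mathrm{near}},
\qquad
\Gamma = \omega - \theta
       = 2\arctan\!\frac{b}{2L_{\mathrm{near}}} - 2\arctan\!\frac{b}{2L_{\mathrm{far}}}
       \;\approx\; b\left(\frac{1}{L_{\mathrm{near}}} - \frac{1}{L_{\mathrm{far}}}\right).
```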
  • the stereoscopic depth of field can be estimated in a number of scenes (e.g., Scene 1, Scene 2, etc.) of a 3D virtual image placed between the nearest point to a viewer and the farthest point to a viewer.
  • the size of each scene in a 3D virtual image is an area at a predetermined distance from a viewer where the displayed virtual objects (e.g., the letters “A” and “B” in FIG. 7) are not perceived as inclined.
  • Each scene size is defined by the usable size of the virtual object (which is not perceived as inclined), the field of view, and the viewing distance L. Dividing the virtual image surface into several scenes demonstrates the usable size of the virtual object at a predetermined distance, the aspect ratio of the virtual object, and the position of the virtual object within the field of view.
  • the usable size of the virtual object, which isn’t perceived as inclined, is limited by the stereo-threshold of human vision, such as shown in FIG. 8.
  • the stereo-threshold is the smallest stereoscopic depth of field that can be reliably discriminated by the observer and is based on human vision physiological properties.
  • the stereo-threshold can be approximately 150 arc-seconds.
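  • As a worked example of how this threshold bounds the usable object size (the interocular base b = 65 mm, viewing distance L = 10 m, and inclination α = 72° are illustrative assumptions, not values prescribed by the disclosure):

```latex
\Gamma_{\mathrm{thr}} = 150'' \approx 7.3\times10^{-4}\ \mathrm{rad},
\qquad
\Delta L_{\max} \approx \frac{\Gamma_{\mathrm{thr}}\,L^{2}}{b}
  = \frac{7.3\times10^{-4}\,(10\ \mathrm{m})^{2}}{0.065\ \mathrm{m}} \approx 1.1\ \mathrm{m},
\qquad
w_{\mathrm{usable}} \approx \frac{\Delta L_{\max}}{\tan\alpha} \approx 0.36\ \mathrm{m}
\;(\approx 2^{\circ}\ \mathrm{at}\ 10\ \mathrm{m}),
```

  • which is consistent with the roughly 2° usable object width quoted for the 72° inclination example later in this description.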
  • FIGS. 9A and 9B show example design parameters for the optical system 104 including the correcting optical unit 400 of FIGS. 4A and 4B for an operating wavelength of 532 nanometers, in accordance with an embodiment of the present disclosure.
  • FIGS. 10A and 10B show example HOE recording parameters for the combiner 206 of the optical system 104 including the correcting optical unit 400 of FIGS. 4A and 4B for an operating wavelength of 532 nanometers, in accordance with an embodiment of the present disclosure.
  • FIG. 10B lists example Zernike fringe coefficients of the HOE combiner.
  • FIGS. 11A and 11B are schematic diagrams of a correcting optical unit 1100, in accordance with another example of the present disclosure.
  • the correcting optical unit 1100 can be implemented as at least part of the correcting optical unit 204 of FIG. 2.
  • the correcting optical unit 1100 includes a telecentric lens 1102, an optical element 1104 with a cylindrical surface and an aspherical surface, a freeform mirror 1106 (a mirror with a freeform surface shape), and an output lens 1108 with an aspherical surface and a spherical surface.
  • the optical element 1104 is inclined at an angle θ with respect to a local optical axis 1110 in the direction of a horizontal field 1112 of view.
  • a diffuser 1114 can be included as part of the correcting optical unit 1100 or the PGU 202.
  • the freeform mirror 1106 has an asymmetrical surface profile forming a wedge with adjacent optical surfaces of the optical element 1104 and the output lens 1108.
  • the cross-sectional shape of the freeform mirror 1106 in the direction of a vertical field of view can, in some examples, be close to a parabolic cylinder surface, such as shown in FIG. 12A.
  • the cross-sectional shape of the freeform mirror 1106 in the direction of a horizontal field of view can, in some examples, have an asymmetrical profile with a maximum sag of about 60 micrometers (µm), such as shown in FIG. 12B.
  • FIG. 13 shows an example of the total shape of the surface of the freeform mirror 1106.
  • the optical system 104 with the correcting optical unit 1100 operates as follows.
  • the PGU 202 projects the optical image 210 onto the diffuser 1114. Rays of light from the diffuser 1114 propagate through the correcting optical unit 1100, where the inclined optical element 1104 and the freeform mirror 1106 create an optical path length monotonic variation in the direction of the horizontal field of view 1112, producing the modified optical images 212.
  • the modified optical images 212 reach the combiner 206, which redirects the modified optical images 212 toward the eye box 208.
  • An inclined virtual image surface, such as shown in FIG. 3, is thereby produced.
  • FIGS. 14A and 14B are schematic diagrams of a correcting optical unit 1400, in accordance with yet another example of the present disclosure.
  • the correcting optical unit 1400 can be implemented as at least part of the correcting optical unit 204 of FIG. 2.
  • the correcting optical unit 1400 includes a telecentric lens 1402, an optical element 1404 with a cylindrical surface and an aspherical surface, a cylindrical mirror 1406, and an output lens 1408 with a freeform surface and a spherical surface.
  • the optical element 1404 is inclined at an angle θ with respect to a local optical axis 1410 in the direction of a horizontal field 1412 of view.
  • a diffuser 1414 can be included as part of the correcting optical unit 1400 or the PGU 202.
  • the freeform surface of the output lens 1408 in the direction of ray propagation has an asymmetrical freeform profile and forms a wedge with adjacent optical surfaces of the output lens 1408 and the mirror 1406 in the direction of the horizontal field of view.
  • the cross-sectional shape of the freeform surface of the output lens 1408 in the direction of a vertical field of view can, in some examples, have a symmetrical shape close to a sphere with a radius of about -280 mm, such as shown in FIG. 15A.
  • the cross-sectional shape of the freeform surface of the output lens 1408 in the direction of a horizontal field of view can, in some examples, have an asymmetrical shape close to a sphere with a radius of about -400 mm, such as shown in FIG. 15B.
  • FIG. 16 shows an example total shape of the freeform surface of the output lens 1408, which is similar to a biconic surface.
  • the best fit sphere for the first surface of the output lens 1408 can be calculated as a sphere minimizing the sum of the squared residuals.
  • the maximum deviation of the freeform surface of the output lens 1408 from the best fit sphere with a radius of -426.6 millimeters (mm) is approximately 4 mm, such as shown in FIG. 17.
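  • A minimal sketch of such a best-fit-sphere calculation (the sag formula, sample grid, and toy radii of -400 mm and -280 mm are illustrative stand-ins, not the actual design data of the output lens 1408):

```python
import numpy as np
from scipy.optimize import least_squares

def sphere_sag(r2, R):
    """Sag of a sphere with signed vertex radius R at squared radial
    coordinate r2 (standard conic sag formula with conic constant k = 0)."""
    c = 1.0 / R
    return c * r2 / (1.0 + np.sqrt(1.0 - c * c * r2))

def best_fit_sphere_radius(x, y, z, R0=-400.0):
    """Radius of the sphere minimizing the sum of squared sag residuals
    against sampled surface sag data z(x, y)."""
    r2 = x ** 2 + y ** 2
    fit = least_squares(lambda R: sphere_sag(r2, R[0]) - z, x0=[R0])
    return fit.x[0]

# Illustrative use on a synthetic, biconic-like sag map (toy radii).
x, y = np.meshgrid(np.linspace(-40, 40, 81), np.linspace(-30, 30, 61))
z = sphere_sag(x ** 2, -400.0) + sphere_sag(y ** 2, -280.0)
R_bfs = best_fit_sphere_radius(x.ravel(), y.ravel(), z.ravel())
residual = z.ravel() - sphere_sag(x.ravel() ** 2 + y.ravel() ** 2, R_bfs)
print(R_bfs, np.abs(residual).max())   # best-fit radius and peak deviation
```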
  • the optical system 104 with the correcting optical unit 1400 operates as follows.
  • the PGU 202 projects an intermediate image onto the diffuser 1414. Rays of light from the diffuser 1414 propagate through the correcting optical unit, wherein the inclined optical element 1404 and the output lens 1408 with the freeform surface create an optical path length monotonic variation in the direction of the horizontal field of view 1412, producing the modified optical images 212.
  • the modified optical images 212 reach the combiner 206, which redirects the modified optical images 212 toward the eye box 208.
  • An inclined virtual image surface, such as shown in FIG. 3, is thereby produced.
  • the optical system 104 can operate in monochromatic mode or in full-color mode with chromatism correction.
  • Table 1 shows example geometrical characteristics of the optical system 104.
  • FIG. 18 is a top down view of a horizontal field of view 1800 of the optical system 104, in accordance with an example of the present disclosure.
  • virtual objects on the right side of the field of view 1800 are displayed closer to a viewer than virtual objects on the left side of the field of view 1800.
  • the angle between the projection of a perpendicular through an arbitrary point of the virtual image surface onto the horizontal plane and a line-of-sight projection onto the horizontal plane is 72°, which corresponds to 4.84 milliradians stereoscopic depth of field, with a 2° usable size and/or width of the virtual object (which isn’t perceived as inclined) and seven (7) scenes in a three-dimensional virtual image, such as listed in Table 2 below and in FIG. 19.
  • FIG. 20 shows scenes in the field of view defining several areas at predetermined distances from a viewer, where the displayed virtual objects aren’t perceived as inclined.
  • the size of each scene is limited by the usable size of the virtual object and the vertical field of view FoV(V).
  • Table 2 shows the stereoscopic depth of field parameters of the optical system 104, according to an example of the present disclosure.
  • the usable size of the virtual object is listed for a stereo-threshold of 150 arc sec.
  • the combiner 206 includes a HOE.
  • An advantage of the holographic combiner in AR HUD optical systems is the ability to provide a wide field of view while maintaining the compactness of AR HUD.
  • the smaller the combiner focal distance (i.e., the higher the combiner optical power), the smaller the AR HUD volume needed to create a virtual image at distances above 3 m from a viewer.
  • a combiner with low optical power can, for example, be incorporated into a windshield reflecting area and have focal lengths of more than about 1000 mm.
  • a holographic combiner such as the combiner 206 of the optical system 104, in some examples, can have a small focal length (a high optical power of about 1.1–6.6 diopters).
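  • Converting those powers to focal lengths (using only the definition of a diopter, f = 1/P):

```latex
P = 1.1\ \mathrm{D} \;\Rightarrow\; f = \tfrac{1}{1.1} \approx 0.91\ \mathrm{m},
\qquad
P = 6.6\ \mathrm{D} \;\Rightarrow\; f = \tfrac{1}{6.6} \approx 0.15\ \mathrm{m},
```

  • compared with focal lengths above about 1 m (P < 1 D) for the low-optical-power windshield combiner mentioned above.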
  • an AR HUD with a holographic combiner provides a wider field of view than an AR HUD with a combiner having low optical power.
  • The result of using the holographic combiner is an increase in the field of view and in the stereoscopic depth of field.
  • the optical system 104 is suitable for integration into side-view AR HUDs, where a viewer observes the real world surrounding the vehicle 102 at an angle to the direction of travel, such as shown in FIG. 22.
  • a passenger sitting near the side window observes the real-world surroundings at some angle γ relative to the travel direction, and it is natural that the passenger’s eyes follow objects 2200 located along the road.
  • the angle α (between a projection of a perpendicular through an arbitrary point of the virtual image surface onto the horizontal plane and the line-of-sight projection onto the horizontal plane) can be matched to the angle γ between the line of sight and the travel direction.
  • Example 1 provides an optical system for an augmented reality head-up display.
  • the optical system includes a picture generation unit; a correcting optical unit configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit; and a combiner configured to redirect light rays propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box; wherein the optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the eye box, the virtual image surface having a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, the second axis being parallel to a line of sight and extending through the arbitrary intersection point on the virtual image surface, such that a virtual image on a first side of the virtual image surface appears closer to the eye box than a virtual image on a second side of the virtual image surface.
  • Example 2 includes the subject matter of Example 1, wherein the correcting optical unit includes at least one optical element having at least one optical surface inclined in the direction of the horizontal field of view.
  • Example 3 includes the subject matter of any one of Examples 1 and 2, wherein the correcting optical unit includes at least one optical element having at least one optical surface with an asymmetrical cross-sectional profile.
  • Example 4 includes the subject matter of Example 1, wherein the correcting optical unit includes a combination of at least one optical element inclined in the direction of the horizontal field of view and at least one optical surface with an asymmetrical cross-sectional profile.
  • Example 5 includes the subject matter of any one of Examples 1-4, wherein the combiner includes a holographic optical element with a positive optical power.
  • Example 6 includes the subject matter of any one of Examples 1 -5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
  • Example 7 includes the subject matter of any one of Examples 1-5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including the at least one optical surface, the at least one optical surface having a freeform shape with an asymmetrical cross-sectional profile; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
  • Example 8 includes the subject matter of any one of Examples 1 -5, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having a freeform shape with an asymmetrical cross-sectional profile in the direction of the horizontal field of view, the second surface having a spherical cross-sectional profile.
  • Example 9 includes the subject matter of any one of Examples 1-8, wherein the inclined virtual image surface is approximately planar.
  • Example 10 includes the subject matter of any one of Examples 1-9, wherein the correcting optical unit is implemented for a side-view perception functionality and provides the inclined virtual image surface being aligned with a direction of travel of a vehicle.
  • Example 11 provides an optical system for an augmented reality head-up display.
  • the optical system includes a picture generation unit configured to generate an optical image; a correcting optical unit configured to create, in a direction of a horizontal field of view, a monotonic variation of a plurality of optical path lengths of light rays in the optical image propagating from the picture generation unit, thereby producing a plurality of modified optical images; and a combiner configured to redirect the modified optical images propagating from the correcting optical unit toward an eye box, thereby producing one or more virtual images observable from the eye box; wherein the optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the one or more virtual images at different distances from the eye box, the virtual image surface having a non-zero angle between projections on a horizontal plane defined by a first axis and a second axis, the first axis being perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, the second axis being parallel to a line of sight and extending through the arbitrary intersection point on the virtual image surface.
  • Example 12 includes the subject matter of Example 11, wherein the correcting optical unit includes at least one optical element having at least one optical surface inclined in the direction of the horizontal field of view.
  • Example 13 includes the subject matter of any one of Examples 11 and 12, wherein the correcting optical unit includes at least one optical element having at least one optical surface with an asymmetrical cross-sectional profile.
  • Example 14 includes the subject matter of Example 11, wherein the correcting optical unit includes a combination of at least one optical element inclined in the direction of the horizontal field of view and at least one optical surface with an asymmetrical cross-sectional profile.
  • Example 15 includes the subject matter of any one of Examples 11-14, wherein the combiner includes a holographic optical element with a positive optical power.
  • Example 16 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross- sectional profile, the second surface having a spherical cross-sectional profile.
  • Example 17 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including the at least one optical surface, the at least one optical surface having a freeform shape with an asymmetrical cross-sectional profile; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having an aspherical cross-sectional profile, the second surface having a spherical cross-sectional profile.
  • Example 18 includes the subject matter of any one of Examples 11-15, wherein the correcting optical unit includes a telecentric lens located between the picture generation unit and the optical element, the optical element including a lens with a cylindrical surface and an aspherical surface; a mirror located between the optical element and the combiner, the mirror including a cylindrical surface; and an output lens located between the mirror and the combiner, the output lens including a first surface and a second surface, the first surface having a freeform shape with an asymmetrical cross-sectional profile in the direction of the horizontal field of view, the second surface having a spherical cross-sectional profile.
  • Example 19 includes the subject matter of any one of Examples 11-18, wherein the inclined virtual image surface is approximately planar.
  • Example 20 includes the subject matter of any one of Examples 11-19, wherein the correcting optical unit is implemented for a side-view perception functionality and provides the inclined virtual image surface being aligned with a direction of travel of a vehicle.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Lenses (AREA)

Abstract

An optical system is disclosed that includes a picture generation unit, a correcting optical unit configured to create, in a direction of a horizontal field of view, a monotonic variation of an optical path length of light rays propagating from the picture generation unit, and a combiner configured to redirect light rays propagating from the correcting optical unit toward an eye box, producing one or more virtual images observable from the eye box. The optical system provides a virtual image surface inclined in the direction of the horizontal field of view for displaying the virtual images. The virtual image surface has a non-zero angle between projections onto a horizontal plane defined by a first axis perpendicular to the virtual image surface and extending through an arbitrary intersection point on the virtual image surface, and a second axis parallel to a line of sight and extending from the eye box through the intersection point.
PCT/EP2022/083132 2021-12-09 2022-11-24 Optical system of augmented reality head-up display WO2023104534A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US17/546,388 2021-12-09
US17/546,388 US20230161158A1 (en) 2021-11-22 2021-12-09 Optical system of augmented reality head-up display

Publications (1)

Publication Number Publication Date
WO2023104534A1 true WO2023104534A1 (fr) 2023-06-15

Family

ID=84541324

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/083132 WO2023104534A1 (fr) Optical system of augmented reality head-up display

Country Status (1)

Country Link
WO (1) WO2023104534A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180157036A1 (en) * 2016-12-02 2018-06-07 Lg Electronics Inc. Head-up display for vehicle
US20180299672A1 (en) * 2015-10-09 2018-10-18 Maxell, Ltd. Projection optical system and head-up display device
US20190236999A1 (en) * 2016-07-12 2019-08-01 Audi Ag Method for operating a display device of a motor vehicle
US20210070175A1 (en) * 2018-05-24 2021-03-11 Mitsubishi Electric Corporation Vehicular display control device and vehicular display control method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180299672A1 (en) * 2015-10-09 2018-10-18 Maxell, Ltd. Projection optical system and head-up display device
US20190236999A1 (en) * 2016-07-12 2019-08-01 Audi Ag Method for operating a display device of a motor vehicle
US20180157036A1 (en) * 2016-12-02 2018-06-07 Lg Electronics Inc. Head-up display for vehicle
US20210070175A1 (en) * 2018-05-24 2021-03-11 Mitsubishi Electric Corporation Vehicular display control device and vehicular display control method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LEE BYOUNGHO ET AL: "Holographic optical elements for head-up displays and near-eye displays", SPIE SMART STRUCTURES AND MATERIALS + NONDESTRUCTIVE EVALUATION AND HEALTH MONITORING, 2005, SAN DIEGO, CALIFORNIA, UNITED STATES, SPIE, US, vol. 11708, 5 March 2021 (2021-03-05), pages 1170803 - 1170803, XP060140373, ISSN: 0277-786X, ISBN: 978-1-5106-4548-6, DOI: 10.1117/12.2582484 *

Similar Documents

Publication Publication Date Title
US8441733B2 (en) Pupil-expanded volumetric display
US10302936B2 (en) Vehicle display device
US9740004B2 (en) Pupil-expanded biocular volumetric display
US10831021B2 (en) Display device and including a screen concave mirror and flat mirror
  • CN107111142B (zh) Head-mounted imaging device with curved microlens array
  • JP7114146B2 (ja) Display device and automobile head-up display system using the same
  • CN107430274B (zh) Projection optical system and head-up display device using the same
  • JP7003925B2 (ja) Reflector, information display device, and moving body
US20110012874A1 (en) Scanning image display apparatus, goggle-shaped head-mounted display, and automobile
  • JP2021121535A (ja) Information display device
  • JP2015534124A (ja) Field-of-view display for a vehicle
  • JP6940361B2 (ja) Information display device
  • JP2022189851A (ja) Imaging optical system and moving body equipped with the imaging optical system
  • JPWO2018066675A1 (ja) Variable-magnification projection optical system and image display device
  • CN111427152B (zh) Virtual window display
US20230161158A1 (en) Optical system of augmented reality head-up display
  • EP2180364A1 (fr) Display and method of operating a display
  • CN219676374U (zh) Display device, head-up display device, and vehicle
Shih et al. Dual-eyebox head-up display
  • WO2023104534A1 (fr) Optical system of augmented reality head-up display
  • WO2018199245A1 (fr) Virtual image display device and moving body display system
  • CN112526748A (zh) Head-up display device, imaging system, and vehicle
US10852539B2 (en) Projection optical system, head-up display device, and vehicle
US20230096336A1 (en) Optical system of augmented reality head-up display
  • KR20200017832A (ko) Head-up display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22826665

Country of ref document: EP

Kind code of ref document: A1