WO2023007230A1 - Compact holographic head-up display device - Google Patents

Compact holographic head-up display device

Info

Publication number
WO2023007230A1
WO2023007230A1 (PCT/IB2021/056977)
Authority
WO
WIPO (PCT)
Prior art keywords
hhud
compact
optical element
optical
refractive
Prior art date
Application number
PCT/IB2021/056977
Other languages
French (fr)
Inventor
Vitaly PONOMAREV
Andrey BELKIN
Oleg BUSHMANOV
Anton SHCHERBINA
Mikhail SVARYCHEUSKI
Original Assignee
Wayray Ag
Priority date
Filing date
Publication date
Application filed by Wayray Ag filed Critical Wayray Ag
Priority to PCT/IB2021/056977 priority Critical patent/WO2023007230A1/en
Publication of WO2023007230A1 publication Critical patent/WO2023007230A1/en


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B27/0103Head-up displays characterised by optical features comprising holographic elements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B27/0103Head-up displays characterised by optical features comprising holographic elements
    • G02B2027/0105Holograms with particular structures
    • G02B2027/0107Holograms with particular structures with optical power
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/011Head-up displays characterised by optical features comprising device for correcting geometrical aberrations, distortion
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0149Head-up displays characterised by mechanical features
    • G02B2027/015Head-up displays characterised by mechanical features involving arrangement aiming to get less bulky devices

Definitions

  • Embodiments discussed herein are generally related to optical devices and head-up displays (HUDs), and in particular, to configurations and arrangements of optical elements to provide compact holographic HUDs.
  • a Head-Up Display is a transparent display that presents information without requiring a viewer to look away from their viewpoint.
  • Typical HUDs include a combiner, a light projection device (referred to as a “projector” or “projection unit”), and a video/image generation computer device.
  • the combiner is usually a piece of glass located directly in front of the viewer that redirects the projected image from the projector in such a way that the viewer sees the field of view and the image projected at infinity at the same time.
  • the projector is often an optical collimator including a lens or mirror with a cathode-ray tube, light emitting diode (LED) display, or liquid crystal display (LCD) that produces an image where the light is collimated (i.e., where the focal point is perceived to be at infinity).
  • Holographic HUDs typically include a laser projector and a holographic optical element (HOE).
  • Some hHUDs place the HOE inside a display screen, such as a windscreen or windshield of an automobile or aircraft.
  • most hHUDs with HOEs inside the display screen cannot produce large, high quality images without introducing optical aberrations, in a similar manner to classical HUDs.
  • typical hHUDs with HOEs inside the display screen require several corrective optical devices to correct these aberrations.
  • classical HUDs and typical hHUDs require optical elements with large dimensions and substantial optical power to provide high quality images.
  • Figures 1, 2, and 3 illustrate a holographic head-up display (hHUD) system and demonstrate operation of the hHUD system.
  • Figures 4, 5, and 6 illustrate various configurations and arrangements of the hHUD system.
  • Figures 7 and 8 illustrate example surface patterns.
  • Figure 9 illustrates simulation results of using the various holographic optical systems discussed herein.
  • Figure 10 illustrates an example HUD display system for a vehicle, configurable to interface with an on-board vehicle operating system.
  • Figure 11 illustrates an example implementation of a vehicle embedded computer device according to various embodiments.
  • an hHUD includes a holographic optical element (HOE) inside a display screen and has a certain arrangement of refractive and reflective optical elements that work together with the HOE to function as a HUD, showing images far ahead of a viewer.
  • the HOE may also have various working geometries, including off-axis geometries, and may be formed by complex aberrated wavefronts optimized, together with a compact corrector, to minimize residual aberrations, achieving high image quality despite its small size.
  • the configurations and arrangements discussed herein allow smaller optical elements with lower optical power than conventional HUDs and hHUDs to be used. Additionally, the configurations and arrangements discussed herein loosen display screen tolerance requirements due to the transfer of the combiner’s optical functions from the display screen to the hologram and/or HOE. Furthermore, the configurations and arrangements discussed herein do not require the use of complex wedge-shaped films for flare elimination by the divergence of the 0 and 1 diffraction orders. Moreover, the size reduction of the hHUDs discussed herein reduces the amount of energy absorbed by the hHUD components, for example, reducing sensor heat caused by the narrowness of the hologram’s working spectral region.
  • Figures 1 and 2 illustrate a holographic head-up display (hHUD) system 100 (or simply “hHUD 100”).
  • the hHUD 100 includes a picture generation unit (PGU) 101, correction optics assembly 102 (also referred to as “correction optics 102”, “corrective optics 102”, and/or the like), and a combiner 103.
  • the correction optics 102 includes various optical elements including a scattering surface 120, optical element 121, optical element 122, optical element 123, and a corrector 124 (for the sake of clarity, many of the optical elements of the correction optics 102 are not labeled in Figure 1).
  • the PGU 101 projects laser light 110 (or otherwise generates light 110) through various optical elements 121-124 of the correction optics 102 including illumination of the scattering surface 120.
  • the PGU 101 creates an intermediate image at the scattering surface 120 by projecting light representative of a virtual image onto the scattering surface 120, which is then magnified and/or filtered by optical elements 121 and/or 122, and redirected to corrector 124 by reflection surface 123.
  • the correction optics 102 redirects 111 the light 110 (i.e., as light rays 111) onto an HOE that is on or in the combiner 103 (hereinafter referred to as “HOE 103” or the like), which are then reflected 112 towards a viewer 115.
  • the HOE 103 with optical power generates a virtual image of the previously generated intermediate image (at the scattering surface 120) at the given distance 215 from the viewer 115 (see Figure 2).
  • the hHUD 100 allows the viewer to see the projected/generated image at approximately 10 or 15 meters ahead of the viewer (through the combiner 103), which matches real world objects that can be seen through the combiner 103.
  • the PGU 101 may be realized by the use of a suitable projector and is the same or similar to the PGU 1130 discussed infra with respect to Figure 11.
  • the combiner 103 in this example is a (semi-)transparent display surface located directly in front of a viewer that redirects a projected virtual image from the PGU 101 in such a way as to allow the viewer to view a field of view (FoV) and the virtual image at the same time, thereby facilitating augmented reality (AR).
  • the size of the virtual image is defined by the largest optical element of a HUD, which is usually a combiner element such as combiner 103.
  • the combiner is typically a large mirror.
  • the largest optical element is the HOE (e.g., HOE 1131 of Figure 11 discussed infra) that is in or on the combiner 103.
  • Various aspects of the combiner 103, HOE, and PGU 101 are discussed infra with respect to Figure 11.
  • the correction optics 102 (also referred to as “corrective optics 102” or “auxiliary optics 102”) works together with the HOE 103 to display the virtual images.
  • the correction optics 102 comprises one or more optical elements 121-124, which may include, for example, lenses, prisms, mirrors, HOEs, and/or other optical elements, and/or combinations thereof.
  • the corrective optics 102 includes, inter alia, scattering surface 120 and corrector 124.
  • the scattering surface 120 may be a diffusion screen, a diffuser plate, and/or an array of microlenses with selected parameters of scattering on a plane close to the focal plane.
  • the corrector 124 is an optical element that primarily corrects aberrations caused by the HOE 103. All of the optical elements of correction optics 102 work together to correct aberrations.
  • the corrector 124 may comprise one or more of a prism, a lens, a mirror, prismatic lens, and/or various combinations of such optical elements.
  • the properties of the corrector 124 are dependent on the particular arrangement and configuration of various optical elements of the hHUD 100 within a particular environment (e.g., within an automobile cabin, aircraft cockpit, head-mounted display, etc.).
  • the corrector 124 may have a first set of properties when the hHUD 100 is configured or deployed within an automobile and may have a second set of properties when the hHUD 100 is configured or deployed within an aircraft cockpit.
  • the corrector 124 may have a first set of properties when the hHUD 100 is configured or deployed within a first automobile of a first make and model, and may have a second set of properties when the hHUD 100 is configured or deployed within a second automobile of a second make and model.
  • the set of properties include, for example, the surface types or patterns of the corrector 124, a shape formed by the surfaces of the corrector 124, a size of the corrector 124, a position of the corrector 124 with respect to at least one other optical element, an orientation of the corrector 124 with respect to at least one other optical element, materials or substances used to make the corrector 124, and/or other properties.
  • Other aspects of the corrector 124 are discussed in more detail infra.
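  • As an illustration of how such deployment-dependent property sets could be captured in software (a hypothetical sketch; none of the keys, values, or units below come from this application), a per-deployment lookup might look like the following:

```python
# Hedged sketch: a per-deployment corrector configuration lookup. All keys,
# values, and units are illustrative assumptions, not data from the patent.

CORRECTOR_CONFIGS = {
    ("automobile", "make_a_model_x"): {
        "surface_types": ["aspherical", "freeform"],
        "size_mm": (60, 40, 25),          # width, height, thickness
        "position_mm": (0, -120, 350),    # relative to a PGU reference frame
        "tilt_deg": 12.5,
        "material": "PMMA",
    },
    ("aircraft", "cockpit_type_b"): {
        "surface_types": ["spherical", "anamorphic"],
        "size_mm": (80, 55, 30),
        "position_mm": (0, -90, 280),
        "tilt_deg": 8.0,
        "material": "optical_glass",
    },
}

def corrector_properties(environment: str, platform: str) -> dict:
    """Select the corrector property set for a given deployment environment."""
    return CORRECTOR_CONFIGS[(environment, platform)]

print(corrector_properties("automobile", "make_a_model_x")["tilt_deg"])
```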
  • the dimensions (e.g., size, volume, etc.) of the various optical elements of the hHUD 100 can be reduced such that the optical elements are much smaller than the optical elements used in existing hHUDs and classical HUDs. This is because the light rays 110 coming from the PGU 101 to the corrective optics 102 are converging; because the light rays 110 are converging, smaller optical elements can be used to cover all of the light rays 110.
  • the dimensional reduction of the optical elements can be very significant; for example, in some implementations the volume and/or size of the optical elements can be reduced by a factor of 10 (i.e., the volume and/or size of the optical elements can be up to 10 times smaller than the optical elements of existing HUDs and hHUDs). Additionally, in some implementations, the use of smaller optical elements may allow the PGU 101 and the correction optics 102 to be positioned closer (in distance) to one another, and/or positioned closer (in distance) to the combiner 103, which may provide further enhancements to the virtual image (e.g., sharpness, clarity, etc.).
  • optical aberrations may also be increased or exaggerated when operating the hHUD 100 and/or arranging the optical elements of the hHUD 100 in this manner.
  • Aberrations can cause the virtual image formed on the combiner 103 to be blurred or distorted, with the nature of the distortion depending on the type of aberration.
  • an aberration occurs when light from one point of an object does not converge into (or does not diverge from) a single point after transmission through the hHUD 100.
  • the correction optics 102 having a configuration or arrangement as discussed infra is capable of correcting or compensating for aberrations.
  • the optical power of an optical system or optical element is the degree to which an optical element or optical system converges or diverges light.
  • the correction optics 102 having a configuration or arrangement as discussed infra compensates for this type of image degradation.
  • introduction of the correction optics 102 allows the HOE in the combiner 103 to have higher optical power and/or more focused light without experiencing the image degradation that a similar optical power would produce in existing hHUDs and/or classical HUDs.
  • This also allows for the reduction in the size of the optical system 100 because the rays 110, 111 can converge faster while being covered by smaller optical elements.
  • the HOE 103 has an optical power in the range of 1.1 to 6.6 diopters.
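  • For reference, a standard relation not specific to this application: optical power P in diopters is the reciprocal of the focal length f in meters, so the stated range corresponds to HOE focal lengths of roughly 0.15 m to 0.91 m:

    $$ f = \frac{1}{P}, \qquad \frac{1}{6.6\ \text{D}} \approx 0.15\ \text{m}, \qquad \frac{1}{1.1\ \text{D}} \approx 0.91\ \text{m}. $$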
  • some conventional/classical HUD systems utilize a wedge-shaped film in or on a combiner in order to avoid ghosting (where a replica of the transmitted image (i.e., a “ghost” image), which is offset in position, is superimposed on top of the main image).
  • the correction optics 102 having a configuration or arrangement as discussed infra does not need an HOE or combiner 103 to have a wedge-shaped film or layer, which reduces complexity in producing/manufacturing combiners 103 and/or allows combiners 103 to be produced/manufactured using less materials and complexity.
  • the HOE 103 may be formed by the registration of the interference between two wavefronts inside a suitable HOE material (e.g., a photopolymer and/or other material such as those discussed herein).
  • the wavefronts may be spherical diverging 307 and plane 308, geometries of which correspond to the spatial scheme of the hHUD 100 in a vehicle (e.g., an automobile, truck, watercraft, aircraft, etc.).
  • the recording may be performed using preliminarily aberrated wavefronts due to the existence of special optical elements in an optical scheme 309, which may resemble simple spherical and cylindrical elements as well as aspherical and/or freeform elements.
  • correction of aberrations is made by the compact correction optics 102 (including corrector 124), parameters of which may be calculated together with parameters of elements participating in the formation of the HOE’s 103 wavefronts.
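  • For intuition about the recorded structure (a textbook relation for the simplified case of two plane waves, not a formula recited in this application): beams of vacuum wavelength λ₀ crossing at an angle θ inside a recording material of refractive index n produce interference fringes with spacing

    $$ \Lambda = \frac{\lambda_0}{2\,n \sin(\theta/2)}. $$

    For example, λ₀ = 532 nm, n ≈ 1.5, and θ = 40° give Λ ≈ 0.52 µm; the spherical/plane and pre-aberrated wavefronts described above produce a spatially varying fringe pattern rather than this uniform spacing.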
  • the correction optics 102 may be used whilst working with an HOE 103 without an off-axis angle (or with an on-axis angle).
  • the optical axis 130 is an imaginary line that defines the path along which light propagates through the system (i.e., hHUD 100).
  • the optical axis passes through the center of curvature of each surface, and coincides with the axis of rotational symmetry.
  • the optical axis may be coincident with a mechanical axis of the system (e.g., mechanical axis of the hHUD 100, mechanical axis of the correction optics 102, or mechanical axis of the corrector 124).
  • the object (e.g., the correction optics 102 and/or the corrector 124) is located on the optical axis 130 coinciding with the HOE’s 103 normal.
  • the optical elements of the correction optics 102 comprise at least two refractive surfaces (or at least two optical elements, each having a refractive surface) and may in addition comprise at least one reflective surface (or at least one optical element with a reflective surface).
  • the at least two refractive elements include optical elements 121 and 122 as well as the surface of corrector 124 facing the HOE 103, and the at least one reflective element includes reflective element 123.
  • the refractive elements 121, 122 and reflective element 123 can be formed into any type of shapes such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • the shapes of the refractive elements 121, 122 and/or the reflective element 123 may predominantly have rotational symmetry.
  • the optical element 121 is a rectangular polyhedron lens or prism
  • optical element 122 is a cylindrical polyhedron lens or prism or a rectangular polyhedral lens or prism having convex sides
  • optical element 123 is a flat or substantially planar reflective surface.
  • the optical elements 121, 122, and 123 may have surfaces of various types and/or patterns including, for example, spherical, aspherical, toroidal, and/or freeform surfaces.
  • the corrector 124 is formed from at least two refractive surfaces. Individual surfaces of the corrector 124 can be spherical, aspherical, anamorphic, and/or freeform surfaces. The surfaces of corrector 124 can form any type of shape that predominantly has rotational symmetry. For example, the shape of the corrector 124 can be flat or planar, a sphere, prismatic (prism or prism-like), pyramidal, an ellipsis shape, conical, cylindrical, toroidal, and/or some other three-dimensional (3D) shape.
  • FIG 4 shows a hHUD 400 with correction optics assembly 402 including an arrangement of optical element 421, optical element 422, optical element 423, and corrector 424.
  • the hHUD 400 is used whilst working with an HOE 103 with an off-axis angle (or without an on-axis angle).
  • the off-axis angle is an angle b between the HOE’s 103 normal 430 and an axis 420, connecting centers of the HOE 103 and the object (e.g., corrector 424).
  • An off-axis optical system is an optical system in which the optical axis of an optical element is not coincident with the mechanical center of the optical element, and/or where the optical axis is not coincident with the mechanical axis of the system.
  • the axis 420 may be the normal of the system (e.g., correction optics 402 and/or corrector 424) or the mechanical axis of the system (e.g., correction optics 402 and/or corrector 424).
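  • As a small numerical illustration of the off-axis angle described above (a hedged sketch; the vectors and coordinates are placeholders, not values from this application):

```python
import numpy as np

# Hedged sketch: compute the off-axis angle between the HOE's surface normal
# and the axis connecting the HOE center to the corrector (object) center.
# The coordinates below are illustrative placeholders, not values from the patent.

hoe_center = np.array([0.0, 0.0, 0.0])            # center of the HOE (e.g., on the combiner)
corrector_center = np.array([0.10, -0.35, 0.80])  # center of the corrector/object
hoe_normal = np.array([0.0, 0.0, 1.0])            # unit normal of the HOE at its center

axis = corrector_center - hoe_center              # axis connecting the two centers
axis_unit = axis / np.linalg.norm(axis)

cos_b = np.clip(np.dot(hoe_normal, axis_unit), -1.0, 1.0)
off_axis_angle_deg = np.degrees(np.arccos(cos_b))

print(f"off-axis angle b ≈ {off_axis_angle_deg:.1f} degrees")  # 0° would mean on-axis
```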
  • the optical elements of the correction optics 402 comprise at least two refractive elements (or at least two optical elements with a refractive surface) and at least one reflective element (or at least one optical element with a reflective surface).
  • the at least two refractive elements include optical elements 421 and 422 and corrector 424
  • the at least one reflective element includes reflective element 423.
  • the refractive elements 421, 422, 424 and reflective element 423 can be formed into any type of shapes such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • the shapes of the refractive elements 421, 422 and/or the reflective element 423 may have rotational symmetry and/or may be formed into an aspherical shape/form.
  • the optical element 421 is an oblique pyramidal or tetrahedral lens or prism
  • optical element 422 is a rectangular polyhedron lens or prism
  • optical element 423 is a flat or substantially planar reflective surface.
  • the optical elements 421, 422, and 423 may have surfaces of various types and/or patterns including, for example, spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the corrector 424 is formed from at least two refractive surfaces.
  • the corrector 424 can be formed to have more than two refractive surfaces and/or more than one reflective surface.
  • the refractive and reflective surfaces of the corrector 424 can be spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the surfaces of corrector 424 can form any type of shape that predominantly has a decentered (e.g., off-axis) aspherical shape/form, as shown in Figure 4, in order to correct the optical path difference that occurs in the vertical plane.
  • the shape of the corrector 424 can be flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • Figure 5 shows a hHUD 500 with correction optics assembly 502 including an arrangement of optical element 522 and corrector 524.
  • the hHUD 500 is used whilst working with an HOE 103 with an off-axis angle or an on-axis angle.
  • the optical elements of the correction optics 502 comprise at least two refractive elements (or at least two optical elements with a refractive surface).
  • the at least two refractive elements include optical element 522 and corrector 524.
  • the optical elements 522, 524 can be formed into any type of shapes such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • the optical element 522 can have spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the optical element 522 is a rectangular polyhedron lens or prism with a concave surface/side oriented to face the corrector 524.
  • the corrector 524 is formed from at least two refractive surfaces.
  • the corrector 524 can be formed to have more than two refractive surfaces and/or more than one reflective surface.
  • Individual surfaces of the corrector 524 can be spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the surfaces of corrector 524 can form any type of shape that predominantly has a decentered (e.g., off-axis) prism-shaped (or prismatic) form, as shown in Figure 5, in order to correct the optical path difference that occurs in the vertical plane.
  • the shape of the corrector 524 can be flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • Figure 6 shows a hHUD 600 with correction optics assembly 602 including an arrangement of optical element 621 and corrector 624.
  • the hHUD 600 is used whilst working with an HOE 103 with an off-axis angle or an on-axis angle.
  • the optical element 621 is a refractive element that can be formed into any type of shape such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsis, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
  • the optical element 621 can have spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the optical element 621 is a rectangular polyhedron lens or prism with a concave surface/side oriented to face the scattering surface 120.
  • the corrector 624 is formed from at least two refractive surfaces 624a and 624c and at least one reflective surface 624b.
  • a first refractive surface 624a of the corrector 624 faces the optical element 621 and a second refractive surface 624c of the corrector 624 faces the HOE 103.
  • the reflective surface 624b of the corrector 624 is within the corrector 624 and is oriented at an angle with respect to the optical element 621 and/or the HOE 103. Light rays from the scattering surface 120 are guided to the corrector 624 by the optical element 621, and enter the corrector 624 through the first refractive surface 624a.
  • the light rays are then reflected off of the reflective surface 624b onto the HOE 103 via the second refractive surface 624c.
  • Individual surfaces of the corrector 624 can be spherical, aspherical, anamorphic, and/or freeform surfaces.
  • the surfaces of corrector 624 can form any type of shape that predominantly has a prismatic form.
  • the corrector 624 in this example is formed as an integral reflective-prismatic element with a freeform prismatic shape for the correction of the occurring optical path difference.
  • Figure 7 shows example surface shapes or patterns that may be used for the various optical elements discussed with respect to Figures 1-6.
  • the example surfaces include a spherical surface 701, an aspherical surface 702, an anamorphic surface, and freeform surfaces 703a, 703b, and 703c.
  • Freeform surfaces 703a and 703b are 3D plots where darker shadings indicate greater surface height, and freeform surface 703c is an interferogram.
  • Figure 8 shows additional example surface shapes or patterns that may be used for the various optical elements discussed with respect to Figures 1-6.
  • the example surfaces include a spherical surface 801, an off-axis spherical surface 802, an aspherical surface 803, an off-axis aspherical surface 804, and a freeform surface 805.
  • the freeform surfaces discussed with respect to Figures 1-8 may be modeled and/or formed based on mathematical descriptions. Examples of such mathematical descriptions include radial basis functions, basis splines, wavelets, non-uniform rational basis splines, orthogonal polynomials (e.g., Zernike polynomials, 2D-Q polynomials, φ-polynomials, etc.), non-orthogonal bases (e.g., X-Y polynomials), hybrid stitched representations, and/or combinations thereof.
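  • As a minimal numerical sketch of one such representation (a base conic plus an X-Y polynomial departure; all coefficient values below are illustrative assumptions, not data from this application):

```python
import numpy as np

# Hedged sketch of one way a freeform surface could be described numerically:
# a base conic (sphere/asphere) plus an X-Y polynomial departure, one of the
# representations listed above. Coefficient values are illustrative only.

def conic_sag(x, y, c, k):
    """Sag of a base conic with curvature c (1/radius) and conic constant k."""
    r2 = x**2 + y**2
    return c * r2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r2))

def freeform_sag(x, y, c=1.0 / 200.0, k=0.0, xy_coeffs=None):
    """Total sag = base conic + sum of C_ij * x**i * y**j departure terms."""
    if xy_coeffs is None:
        # {(i, j): C_ij} -- hypothetical, low-order departure terms (mm units)
        xy_coeffs = {(2, 0): 1e-4, (0, 2): -8e-5, (2, 1): 2e-6, (0, 3): 5e-7}
    z = conic_sag(x, y, c, k)
    for (i, j), cij in xy_coeffs.items():
        z = z + cij * x**i * y**j
    return z

# Evaluate the surface on a small grid (e.g., a 40 mm x 40 mm aperture).
x, y = np.meshgrid(np.linspace(-20, 20, 5), np.linspace(-20, 20, 5))
print(np.round(freeform_sag(x, y), 4))
```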
  • the hHUD device of Figures 1-6 can be implemented in various ways.
  • the hHUD device can include a single PGU 101 and a periscopic system to provide virtual images at different distances.
  • multiple HOEs may be provided in or on a single substrate (e.g., combiner 103) to provide several virtual images at intersecting eyebox areas from different PGUs 101 disposed at different positions/locations. This implementation may provide viewer convenience and increased AR capabilities.
  • the multiple HOEs can also be placed in such a way as to provide several different eyeboxes, for example, for a driver and a passenger, from different PGUs 101, in order to create augmented reality without FoV losses when observation of the surrounding world is done through the same transparent surface from different zones.
  • some or all of the components of the hHUDs of Figures 1-6 can be included in a single housing or frame.
  • the correction optics 102, 402, 502, 602 may be disposed in a single housing or frame.
  • the PGU 101 (or multiple PGUs 101) may be disposed in the same housing/frame as the correction optics 102, 402, 502, 602. Any of the aforementioned implementations may be combined or rearranged depending on the specific use cases involved and/or the environment in which the hHUD system is deployed/disposed.
  • Figure 9 shows a graph 900 including curves of volume growth for a corrective optics system located between a combiner 103 and a PGU 101 depending on its optical power for different distances to a virtual image.
  • focal lengths, rather than optical power values, are used in graph 900.
  • the values taken as initial data include a distance to the eyebox of 700 mm, a circular eyebox radius of 71 mm, and an FoV (radius) of 6.5 degrees.
  • Graph 900 shows the approximate volume of an hHUD system in liters depending on the combiner’s focal length in millimeters (mm) for different distances (given in mm) to a virtual image. It demonstrates that the growth of an hHUD system’s volume needed to form a reasonable distance to the virtual image (>3 m) is very fast. Only those hHUD systems which form an image located below the vehicle’s hood (<2 m) may fit within reasonable volumes, even with a small optical power of the combiner. The optical power or focal length of traditional systems with a combiner resembling the windshield area may be evaluated by the minimal nearest radius of curvature in this area.
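  • As a rough guide to that last evaluation (the standard mirror relation, not a formula given in this application): a reflective combiner region with local radius of curvature R has focal length f ≈ R/2 and optical power P ≈ 2/R, so a windshield area with a minimal radius of curvature of, say, 4 m would behave as a combiner with f ≈ 2 m (≈ 0.5 diopters):

    $$ f \approx \frac{R}{2}, \qquad P \approx \frac{2}{R}, \qquad R = 4\ \text{m} \;\Rightarrow\; f \approx 2\ \text{m},\; P \approx 0.5\ \text{D}. $$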
  • each of the elements/components shown and described herein may be manufactured or formed using any suitable fabrication means, such as those discussed herein. Additionally, each of the elements/components shown and described herein may be coupled to other elements/components and/or coupled to a portion/section of the vehicle by way of any suitable fastening means, such as those discussed herein. Furthermore, the geometry (shape), position, and/or orientation of the elements/components shown and described herein may be different from the depicted shapes, positions, and/or orientations in the example embodiments of Figures 1-6 depending on the shape, size, and/or other features of the vehicle in which the hHUD is disposed.
  • FIG. 10 illustrates an example display system 1000 configurable to interface with an onboard unit (OBU) 1020.
  • the display system 1000 may be configured, arranged, or otherwise compatible with a number of different types of vehicles (e.g., makes, models, etc.), which may be associated with different operator positions, including the height of the operator’s eyes or the distance from the operator to display device 1060.
  • the OBU 1020 comprises one or more vehicle processors or onboard computers, memory circuitry (with instructions stored in the memory), and interface circuitry that interfaces with the HUD processor 1010.
  • the display system 1000 connects to the OBU 1020 via an onboard diagnostics (OBD) port of a vehicle.
  • HUD processor 1010 controls or otherwise operates a projection device 1030 that, in turn, generates and/or projects light representative of at least one virtual image onto an imaging matrix 1050.
  • Imaging matrix 1050 selectively distributes and/or propagates the virtual image received as light from the projection device 1030 and/or optical devices 1040 as one or more wave fronts to a display device 1060.
  • the HUD processor 1010 is a computing device that determines virtual graphics to display on display device 1060 according to one or more HUD apps, and provides indications/signals of the virtual graphics to projection device 1030 that, in turn, generates and/or projects light representative of the virtual graphic to the imaging matrix 1050.
  • the projection device 1030 may be the same or similar as the PGU 101 discussed previously and/or the PGU(s) 1130 discussed infra with respect to Figure 11.
  • the HUD apps may cause the generation of virtual graphics based on, for example, predetermined operational parameters including vehicle parameters (e.g., speed, location, travel direction, destination, windshield location, traffic, and the like), road parameters (e.g., location or presence of real world objects, roads, and the like), vehicle observer parameters (e.g., operator location within vehicle, observer eye tracking, eye location, position of system, and the like), and/or a combination thereof.
  • Operational parameters may further include any input received from any of a plurality of sources including vehicle systems or settings including, for example, sensor circuitry 1121, I/O devices 1186, actuators 1122, ECUs 1123, positioning circuitry 1145, or a combination thereof as shown by Figure 11.
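  • As an illustrative data-structure sketch of how the operational parameters above might be grouped before being handed to a HUD app (all names and fields are hypothetical, not taken from this application):

```python
from dataclasses import dataclass, field

# Hedged sketch: one way the operational parameters described above could be
# grouped before being passed to a HUD app. All field names are hypothetical
# illustrations, not identifiers from the application.

@dataclass
class VehicleParams:
    speed_kmh: float = 0.0
    location: tuple = (0.0, 0.0)        # latitude, longitude
    heading_deg: float = 0.0

@dataclass
class ObserverParams:
    eye_position_mm: tuple = (0.0, 0.0, 0.0)  # relative to the eyebox center

@dataclass
class OperationalParams:
    vehicle: VehicleParams = field(default_factory=VehicleParams)
    observer: ObserverParams = field(default_factory=ObserverParams)
    road_objects: list = field(default_factory=list)  # e.g., detected lanes/objects

def build_virtual_graphics(params: OperationalParams) -> list:
    """Placeholder for a HUD app deciding what to draw from the parameters."""
    graphics = []
    if params.vehicle.speed_kmh > 0:
        graphics.append({"type": "speed_label", "value": params.vehicle.speed_kmh})
    return graphics

print(build_virtual_graphics(OperationalParams(vehicle=VehicleParams(speed_kmh=60.0))))
```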
  • the one or more optical devices 1040 or lenses are configured to correct aberrations, filter, and/or to improve light utilization efficiencies.
  • Optical devices 1040 may include any type of optical device (e.g., filters) such as those discussed herein.
  • the optical devices 1040 may be the same or similar to the correction optics 102 (or portions thereof) discussed previously.
  • display device 1060 comprises a windscreen or windshield of a terrestrial vehicle, watercraft, or aircraft, a holographic film placed adjacent to the windshield/windscreen, or a combination thereof. Additionally or alternatively, the display device 1060 comprises a head-mounted display (HMD) screen (e.g., an augmented reality (AR) or virtual reality (VR) headset), a transparent (or semi-transparent) eyepiece such as those used for optical HMDs, helmet-mounted displays, and/or the like.
  • the display system 1000 is configurable or operable to generate one or more virtual graphics on image plane 1070.
  • the image plane 1070 is associated with a focal distance 1075.
  • although image plane 1070 is illustrated as being located on an opposite side of display device 1060 from imaging matrix 1050, in some implementations, the display device 1060 is configured to reflect light of the wave front propagated by imaging matrix 1050 so that the resulting image is reflected back to an observer (e.g., viewer 115 of Figure 1). While the image may be reflected back from display device 1060 to the observer, the image plane 1070 may nevertheless appear to the observer to be located on the opposite side of the display device 1060 (e.g., on the same side of the display device 1060 as the real-world objects, outside of the vehicle).
  • the display system 1000 may comprise a translation device or motor 1080 that can vary the focal distance 1075 of image plane 1070 such as by moving the imaging matrix 1050 relative to the display device 1060 in any direction (e.g., vertical or horizontal) and/or vice versa, as well as change the incline angle of the imaging device 1050.
  • display system 1000 may include multiple projection devices 1030, optical devices 1040, imaging matrices 1050, display devices 1060, and motors 1080 that may be disposed in a multitude of arrangements.
  • FIG 11 illustrates an example computing system 1100, in accordance with various embodiments.
  • the system 1100 may include any combinations of the components as shown, which may be implemented as integrated circuits (ICs) or portions thereof, discrete electronic devices, or other modules, logic, hardware, software, firmware, middleware or a combination thereof adapted in the system 1100, or as components otherwise incorporated within a chassis of a larger system, such as a HUD system 1000 and/or vehicle 1005. Additionally or alternatively, some or all of the components of system 1100 may be combined and implemented as a suitable System-on-Chip (SoC), System-in-Package (SiP), multi-chip package (MCP), or some other like package.
  • the system 1100 is an embedded system or any other type of computer device discussed herein.
  • the system 1100 may be a separate, dedicated, and/or special-purpose computer device designed specifically to carry out the embodiments discussed herein.
  • the processor circuitry 1102 comprises one or more processing elements/devices configurable to perform basic arithmetical, logical, and input/output operations by carrying out and/or executing instructions. According to various embodiments, processor circuitry 1102 is configurable to perform some or all of the calculations associated with the preparation and/or generation of virtual graphics and/or other types of information that are to be projected by HUD system 1000 for display, in real time. Additionally, processor circuitry 1102 is configurable to gather information from sensor circuitry 1120 (e.g., process a video feed from a camera system or image capture devices), obtain user input from one or more I/O devices 1186, and obtain vehicle input substantially in real time. Some or all of the inputs may be received and/or transmitted via communication circuitry 1109.
  • the processor circuitry 1102 may execute instructions 1180, and/or may be loaded with an appropriate bit stream or logic blocks to generate virtual graphics based, at least in part, on any number of parameters, including, for example, input from sensor circuitry 1120, input from I/O devices 1186, input from actuators 1122, input from ECUs 1123, input from positioning circuitry 1145, and/or the like. Additionally, processor circuitry 1102 may be configurable to receive audio input, or to output audio, over an audio device 1120. For example, processor circuitry 1102 may be configurable to provide signals/commands to an audio output device 1186 to provide audible instructions to accompany the displayed navigational route information or to provide audible alerts.
  • the processor circuitry 1102 includes circuitry such as, but not limited to, one or more processor cores and one or more of cache memory, low drop-out voltage regulators (LDOs), interrupt controllers, serial interfaces such as serial peripheral interface (SPI), inter-integrated circuit (I2C) or universal programmable serial interface circuit, real time clock (RTC), timer-counters including interval and watchdog timers, general purpose input-output (GPIO), memory card controllers, interconnect (IX) controllers and/or interfaces, universal serial bus (USB) interfaces, mobile industry processor interface (MIPI) interfaces, Joint Test Access Group (JTAG) test access ports, and the like.
  • the processor circuitry 1102 may include on-chip memory circuitry or cache memory circuitry, which may include any suitable volatile and/or non-volatile memory, such as DRAM, SRAM, EPROM, EEPROM, Flash memory, solid-state memory, and/or any other type of memory device technology, such as those discussed herein.
  • the processor(s) of processor circuitry 1102 may be, for example, one or more application processors or central processing units (CPUs), one or more graphics processing units (GPUs), one or more reduced instruction set computing (RISC) processors, one or more Acorn RISC Machine (ARM) processors, one or more complex instruction set computing (CISC) processors, one or more DSPs, one or more microprocessors without interlocked pipeline stages (MIPS), one or more programmable logic devices (PLDs) and/or hardware accelerators such as field-programmable gate arrays (FPGAs), structured/programmable Application Specific Integrated Circuits (ASICs), programmable SoCs (PSoCs), etc., one or more microprocessors or controllers, or any suitable combination thereof.
  • the processor circuitry 1102 may be implemented as a standalone system/device/package or as part of an existing system/device/package (e.g., an ECU/ECM, EEMS, etc.) of the vehicle 1000.
  • the processor circuitry 1102 may include special-purpose processor/controller to operate according to the various embodiments herein.
  • Individual processors (or individual processor cores) of the processor circuitry 1102 may be coupled with or may include memory/storage and may be configurable to execute instructions stored in the memory/storage to enable various applications or operating systems to run on the system 1100.
  • one or more processors (or cores) of the processor circuitry 1102 may correspond to the processor 1012 of Figure 10 and is/are configurable to operate application software (e.g., HUD app) to provide specific services to a user of the system 1100.
  • one or more processors (or cores) of the processor circuitry 1102, such as one or more GPUs or GPU cores may correspond to the HUD processor 1010 and is/are configurable to generate and render graphics as discussed previously.
  • the processor circuitry 1102 may include an Intel® Architecture CoreTM based processor, such as a QuarkTM, an AtomTM, an i3, an i5, an i7, or an MCU-class processor, Pentium® processor(s), Xeon® processor(s), or another such processor available from Intel® Corporation, Santa Clara, California.
  • any number other processors may be used, such as one or more of Advanced Micro Devices (AMD) Zen® Core Architecture, such as Ryzen® or EPYC® processor(s), Accelerated Processing Units (APUs), MxGPUs, Epyc® processor(s), or the like; A5-A12 and/or S1-S4 processor(s) from Apple® Inc., QualcommTM or CentriqTM processor(s) from Qualcomm® Technologies, Inc., Texas Instruments, Inc.® Open Multimedia Applications Platform (OMAP)TM processor(s); a MIPS-based design from MIPS Technologies, Inc.
  • the processor circuitry 1102 may include a sensor hub, which acts as a coprocessor by processing data obtained from the sensor circuitry 1120.
  • the sensor hub may include circuitry configurable to integrate data obtained from each of the sensor circuitry 1120 by performing arithmetical, logical, and input/output operations.
  • the sensor hub may be capable of timestamping obtained sensor data, providing sensor data to the processor circuitry 1102 in response to a query for such data, buffering sensor data, continuously streaming sensor data to the processor circuitry 1102 including independent streams for each sensor circuitry 1120, reporting sensor data based upon predefined thresholds or conditions/triggers, and/or other like data processing functions.
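  • A minimal sketch of the sensor-hub behaviors listed above (timestamping, buffering, and threshold-based reporting), assuming hypothetical names and threshold values:

```python
import time
from collections import deque
from typing import Optional

# Hedged sketch of sensor-hub-style timestamping, buffering, and threshold-based
# reporting as described above. Class name, fields, and thresholds are
# illustrative assumptions, not identifiers from the application.

class SensorHub:
    def __init__(self, threshold: float, buffer_size: int = 64):
        self.threshold = threshold
        self.buffer = deque(maxlen=buffer_size)  # bounded buffer of recent samples

    def ingest(self, value: float) -> Optional[dict]:
        """Timestamp a sample, buffer it, and report it only when it crosses the threshold."""
        sample = {"t": time.time(), "value": value}
        self.buffer.append(sample)
        if value >= self.threshold:
            return sample  # would be forwarded to the processor circuitry
        return None

hub = SensorHub(threshold=50.0)
for v in (12.0, 48.0, 63.0):
    report = hub.ingest(v)
    if report:
        print("report:", report)
```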
  • the memory circuitry 1104 comprises any number of memory devices arranged to provide primary storage from which the processor circuitry 1102 continuously reads instructions 1182 stored therein for execution.
  • the memory circuitry 1104 includes on-die memory or registers associated with the processor circuitry 1102.
  • the memory circuitry 1104 may include volatile memory such as random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), etc.
  • the memory circuitry 1104 may also include non-volatile memory (NVM) such as read-only memory (ROM), high speed electrically erasable memory (commonly referred to as “flash memory”), and non-volatile RAM such as phase change memory, resistive memory such as magnetoresistive random access memory (MRAM), etc.
  • the processor circuitry 1102 and memory circuitry 1104 may comprise logic blocks or logic fabric, memory cells, input/output (I/O) blocks, and other interconnected resources that may be programmed to perform various functions of the example embodiments discussed herein.
  • the memory cells may be used to store data in lookup-tables (LUTs) that are used by the processor circuitry 1102 to implement various logic functions.
  • the memory cells may include any combination of various levels of memory/storage including, but not limited to, EPROM, EEPROM, flash memory, SRAM, anti-fuses, etc.
  • the memory circuitry 1104 may also comprise persistent storage devices, which may be temporal and/or persistent storage of any type, including, but not limited to, non-volatile memory, optical, magnetic, and/or solid state mass storage, and so forth.
  • Storage circuitry 1108 is arranged to provide (with shared or respective controllers) persistent storage of information such as data, applications, operating systems, and so forth.
  • the storage circuitry 1108 may be implemented as hard disk drive (HDD), a micro HDD, a solid-state disk drive (SSDD), flash memory, flash memory cards (e.g., SD cards, microSD cards, xD picture cards, and the like), USB flash drives, resistance change memories, phase change memories, holographic memories, or chemical memories, and the like.
  • the storage circuitry 1108 may be or may include memory devices that use chalcogenide glass, multi-threshold level NAND flash memory, NOR flash memory, single or multi-level Phase Change Memory (PCM), a resistive memory, nanowire memory, ferroelectric transistor random access memory (FeTRAM), anti-ferroelectric memory, magnetoresistive random access memory (MRAM) memory that incorporates memristor technology, phase change RAM (PRAM), resistive memory including the metal oxide base, the oxygen vacancy base and the conductive bridge Random Access Memory (CB-RAM), or spin transfer torque (STT)-MRAM, a spintronic magnetic junction memory based device, a magnetic tunneling junction (MTJ) based device, a Domain Wall (DW) and Spin Orbit Transfer (SOT) based device, a thyristor based memory device, or a combination of any of the above, or other memory. As shown, the storage circuitry 1108 is included in the system 1100; however, in other embodiments, storage circuitry 11
  • the storage circuitry 1108 is configurable to store computational logic 1183 (or “modules 1183”) in the form of software, firmware, microcode, or hardware-level instructions to implement the techniques described herein.
  • the computational logic 1183 may be employed to store working copies and/or permanent copies of programming instructions for the operation of various components of system 1100 (e.g., drivers, libraries, application programming interfaces (APIs), etc.), an OS of system 1100, one or more applications, and/or for carrying out the embodiments discussed herein.
  • the computational logic 1183 may include one or more program code or other sequence of instructions for controlling the various components of the system 1000 as discussed previously.
  • the permanent copy of the programming instructions may be placed into persistent storage devices of storage circuitry 1108 in the factory or in the field through, for example, a distribution medium (not shown), through a communication interface (e.g., from a distribution server (not shown)), or over-the-air (OTA).
  • the computational logic 1183 may be stored or loaded into memory circuitry 1104 as instructions 1182, which are then accessed for execution by the processor circuitry 1102 to carry out the functions described herein.
  • the instructions 1182 direct the processor circuitry 1102 to perform a specific sequence or flow of actions, for example, as described with respect to the flowchart(s) and block diagram(s) of operations and functionality depicted herein.
  • the modules/logic 1183 and/or instructions 1180 may be implemented by assembler instructions supported by processor circuitry 1102 or high-level languages that may be compiled into instructions 1180 to be executed by the processor circuitry 1102.
  • the computer program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Python, Ruby, Scala, Smalltalk, Java™, C++, C#, or the like; procedural programming languages, such as the “C” programming language, the Go (or “Golang”) programming language, or the like; a scripting language such as JavaScript, Server-Side JavaScript (SSJS), PHP, Perl, Python, Ruby or Ruby on Rails, Accelerated Mobile Pages Script (AMPscript), VBScript, and/or the like; a markup language such as HTML, XML, wiki markup or Wikitext, Wireless Markup Language (WML), etc.; a data interchange format/definition such as JavaScript Object Notation (JSON), Apache® MessagePack™, etc.; a stylesheet language such as Cascading Stylesheets (CSS), extensible stylesheet language (XSL), and/or the like.
  • the computer program code for carrying out operations of the present disclosure may also be written in any combination of the programming languages discussed herein.
  • the program code may execute entirely on the system 1100, partly on the system 1100 as a stand-alone software package, partly on the system 1100 and partly on a remote computer, or entirely on the remote computer.
  • the remote computer may be connected to the system 1100 through any type of network (e.g., network 1117).
  • the OS of system 1100 manages computer hardware and software resources, and provides common services for various applications (apps) (e.g., HUD apps, mapping applications, turn-by-turn navigation apps, AR apps, gaming apps, on-board diagnostics apps, and/or the like).
  • the OS may include one or more drivers or APIs that operate to control particular devices that are embedded in the system 1100, attached to the system 1100, or otherwise communicatively coupled with the system 1100.
  • the drivers may include individual drivers allowing other components of the system 1100 to interact or control various I/O devices that may be present within, or connected to, the system 1100.
  • the drivers may include a display driver (or HUD system driver) to control and allow access to the HUD system 1000, a touchscreen driver to control and allow access to a touchscreen interface of the system 1100, sensor drivers to obtain sensor readings of sensor circuitry 1120 and control and allow access to sensor circuitry 1120, actuator drivers to obtain actuator positions of the actuators 1122 and/or control and allow access to the actuators 1122, ECU drivers to obtain control system information from one or more of the ECUs 1123, and audio drivers to control and allow access to one or more audio devices.
  • the OSs may also include one or more libraries, drivers, APIs, firmware, middleware, software glue, etc., which provide program code and/or software components for one or more applications to obtain and use the data from other applications operated by the system 1100.
  • the OS may be a general purpose OS, while in other embodiments, the OS is specifically written for and tailored to the system 1100.
  • the OS may be Unix or a Unix-like OS such as Linux (e.g., as provided by Red Hat Enterprise), Windows 10™ provided by Microsoft Corp.®, macOS provided by Apple Inc.®, or the like.
  • the OS may be a mobile OS, such as Android® provided by Google Inc ®, iOS® provided by Apple Inc.®, Windows 10 Mobile® provided by Microsoft Corp.®, KaiOS provided by KaiOS Technologies Inc., or the like.
  • the OS may be an embedded OS or a real-time OS (RTOS), such as Windows Embedded Automotive provided by Microsoft Corp.®, Windows 10 For IoT® provided by Microsoft Corp.®, Apache Mynewt provided by the Apache Software Foundation®, Micro-Controller Operating Systems (“MicroC/OS” or “µC/OS”) provided by Micrium®, Inc., FreeRTOS, VxWorks® provided by Wind River Systems, Inc.®, PikeOS provided by Sysgo AG®, Android Things® provided by Google Inc.®, QNX® RTOS provided by BlackBerry Ltd., or any other suitable embedded OS or RTOS, such as those discussed herein.
  • the OS may be a robotics middleware framework, such as Robot Operating System (ROS), Robotics Technology (RT) middleware provided by Object Management Group®, Yet Another Robot Platform (YARP), and/or the like.
  • processor circuitry 1102 and memory circuitry 1104 may include hardware accelerators in addition to, or as an alternative to, processor cores.
  • the hardware accelerators may be pre-configured (e.g., with appropriate bit streams, logic blocks/fabric, etc.) with the logic to perform some functions of the embodiments herein (in lieu of employment of programming instructions to be executed by the processor core(s)).
  • the processor circuitry 1102, memory circuitry 1104, and/or storage circuitry 1108 may be packaged together in a suitable SoC or the like.
  • IX 1106 is a controller area network (CAN) bus system, a Time-Trigger Protocol (TTP) system, or a FlexRay system, which may allow various devices (e.g., ECUs 1123, sensor circuitry 1120, actuators 1122, etc.) to communicate with one another using messages or frames.
  • the IX 1106 may include any number of other IX technologies, such as a Local Interconnect Network (LIN), industry standard architecture (ISA), extended ISA (EISA), inter-integrated circuit (I2C), a serial peripheral interface (SPI), point-to-point interfaces, power management bus (PMBus), peripheral component interconnect (PCI), PCI express (PCIe), Ultra Path Interface (UPI), Accelerator Link (IAL), Common Application Programming Interface (CAPI), QuickPath Interconnect (QPI), Omni-Path Architecture (OPA) IX, RapidIO™ system interconnects, Ethernet, Cache Coherent Interconnect for Accelerators (CCIA), Gen-Z Consortium IXs, Open Coherent Accelerator Processor Interface (OpenCAPI), and/or any number of other IX technologies.
  • the IX 1106 may be a proprietary bus, for example, used in a SoC based system.
  • the communication circuitry 1109 is a hardware element, or collection of hardware elements, used to communicate over one or more networks (e.g., network 1117) and/or with other devices.
  • the communication circuitry 1109 includes modem 1110 and transceiver circuitry (“TRx”) 1112.
  • the modem 1110 includes one or more processing devices (e.g., baseband processors) to carry out various protocol and radio control functions.
  • Modem 1110 interfaces with application circuitry of system 1100 (e.g., a combination of processor circuitry 1102 and memory 1104) for generation and processing of baseband signals and for controlling operations of the TRx 1112.
  • the modem 1110 handles various radio control functions that enable communication with one or more radio networks 1117 via the TRx 1112 according to one or more wireless communication protocols, such as those discussed herein.
  • the modem 1110 may include circuitry such as, but not limited to, one or more single-core or multi-core processors (e.g., one or more baseband processors) or control logic to process baseband signals received from a receive signal path of the TRx 1112, and to generate baseband signals to be provided to the TRx 1112 via a transmit signal path.
  • the modem 1110 may implement a real-time OS (RTOS) to manage resources of the modem 1110, schedule tasks, etc.
  • the communication circuitry 1109 also includes TRx 1112 to enable communication with wireless networks 1117 using modulated electromagnetic radiation through a non-solid medium.
  • TRx 1112 includes a receive signal path, which comprises circuitry to convert analog RF signals (e.g., an existing or received modulated waveform) into digital baseband signals to be provided to the modem 1110.
  • the TRx 1112 also includes a transmit signal path, which comprises circuitry configurable to convert digital baseband signals provided by the modem 1110 into analog RF signals (e.g., a modulated waveform) that are amplified and transmitted via an antenna array including one or more antenna elements (not shown).
  • the antenna array is coupled with the TRx 1112 using metal transmission lines or the like.
  • the antenna array may be one or more microstrip antennas or printed antennas fabricated on the surface of one or more printed circuit boards; a patch antenna array formed as a patch of metal foil in a variety of shapes; a glass-mounted (or “on-glass”) antenna array; or some other known antenna or antenna elements.
  • the TRx 1112 may include one or more radios that are compatible with, and/or may operate according to one or more radio access technologies, protocols, and/or standards including those discussed herein.
  • Network interface circuitry/controller (NIC) 1116 may be included to provide wired communication to the network 1117 or to other devices using a standard network interface protocol. In most cases, the NIC 1116 may be used to transfer data over a network (e.g., network 1117) via a wired connection while the vehicle is stationary (e.g., in a garage, testing facility, or the like).
  • the standard network interface protocol may include Ethernet, Ethernet over GRE Tunnels, Ethernet over Multiprotocol Label Switching (MPLS), Ethernet over USB, or may be based on other types of network protocols, such as Controller Area Network (CAN), Local Interconnect Network (LIN), DeviceNet, ControlNet, Data Highway+, PROFIBUS, or PROFINET, among many others.
  • Network connectivity may be provided to/from the system 1100 via NIC 1116 using a physical connection, which may be electrical (e.g., a “copper interconnect”) or optical.
  • the physical connection also includes suitable input connectors (e.g., ports, receptacles, sockets, etc.) and output connectors (e.g., plugs, pins, etc.).
  • the NIC 1116 may include one or more dedicated processors and/or FPGAs to communicate using one or more of the aforementioned network interface protocols.
  • the NIC 1116 may include multiple controllers to provide connectivity to other networks using the same or different protocols.
  • the system 1100 may include a first NIC 1116 providing communications to the cloud over Ethernet and a second NIC 1116 providing communications to other devices over another type of network.
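  • As a minimal sketch of the stationary, wired data-transfer use case described for the NIC 1116 above, the following example streams a log file to a bench host over a plain TCP socket. The host address, port, and file path are hypothetical placeholders, and plain TCP is only one of many protocols the NIC 1116 could use.

```python
# Minimal sketch, assuming a plain TCP transfer is acceptable for offloading
# data over the wired NIC 1116 while the vehicle is stationary. The host,
# port, and log path below are hypothetical placeholders.
import socket
from pathlib import Path

def upload_log(path: str = "/var/log/hud/session.log",
               host: str = "192.0.2.10", port: int = 5000) -> int:
    """Send a log file to a test-bench host and return the number of bytes sent."""
    data = Path(path).read_bytes()
    with socket.create_connection((host, port), timeout=10.0) as sock:
        sock.sendall(data)
    return len(data)
```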
  • the input/output (I/O) interface 1118 is configurable to connect or couple the system 1100 with external devices or subsystems.
  • the external interface 1118 may include any suitable interface controllers and connectors, such as an external expansion bus (e.g., Universal Serial Bus (USB), FireWire, PCIe, Thunderbolt, Lightning™, etc.), used to connect the system 1100 with external components/devices, such as sensor circuitry 1120, actuators 1122, electronic control units (ECUs) 1123, positioning system 1145, input device(s) 1186, and picture generation units (PGUs) 1130.
  • the I/O interface circuitry 1118 may be used to transfer data between the system 1100 and another computer device (e.g., a laptop, a smartphone, or some other user device) via a wired connection.
  • I/O interface circuitry 1118 may include any suitable interface controllers and connectors to interconnect one or more of the processor circuitry 1102, memory circuitry 1104, storage circuitry 1108, communication circuitry 1109, and the other components of system 1100.
  • the interface controllers may include, but are not limited to, memory controllers, storage controllers (e.g., redundant array of independent disk (RAID) controllers, baseboard management controllers (BMCs), input/output controllers, host controllers, etc.).
  • the connectors may include, for example, busses (e.g., IX 1106), ports, slots, jumpers, interconnect modules, receptacles, modular connectors, etc.
  • the I/O interface circuitry 1118 may also include peripheral component interfaces including, but not limited to, non-volatile memory ports, USB ports, audio jacks, power supply interfaces, on-board diagnostic (OBD) ports, etc.
  • the sensor circuitry 1120 includes devices, modules, or subsystems whose purpose is to detect events or changes in its environment and send the information (sensor data) about the detected events to some other device, module, subsystem, etc.
  • sensors 1120 include, inter alia, inertial measurement units (IMUs) comprising accelerometers, gyroscopes, and/or magnetometers; microelectromechanical systems (MEMS) or nanoelectromechanical systems (NEMS) comprising 3-axis accelerometers, 3-axis gyroscopes, and/or magnetometers; level sensors; flow sensors; temperature sensors (e.g., thermistors); pressure sensors; barometric pressure sensors; gravimeters; altimeters; image capture devices (e.g., cameras); light detection and ranging (LiDAR) sensors; proximity sensors (e.g., infrared radiation detectors and the like); depth sensors; ambient light sensors; ultrasonic transceivers; microphones; etc.
  • Some of the sensor circuitry 1120 may be sensors used for various vehicle control systems, and may include, inter alia, exhaust sensors including exhaust oxygen sensors to obtain oxygen data and manifold absolute pressure (MAP) sensors to obtain manifold pressure data; mass air flow (MAF) sensors to obtain intake air flow data; intake air temperature (IAT) sensors to obtain IAT data; ambient air temperature (AAT) sensors to obtain AAT data; ambient air pressure (AAP) sensors to obtain AAP data; catalytic converter sensors including catalytic converter temperature (CCT) sensors to obtain CCT data and catalytic converter oxygen (CCO) sensors to obtain CCO data; vehicle speed sensors (VSS) to obtain VSS data; exhaust gas recirculation (EGR) sensors including EGR pressure sensors to obtain EGR pressure data and EGR position sensors to obtain position/orientation data of an EGR valve pintle; Throttle Position Sensors (TPS) to obtain throttle position/orientation/angle data; crank/cam position sensors to obtain crank/cam/piston position/orientation/angle data; coolant temperature sensors; and the like.
  • the positioning circuitry 1145 includes circuitry to receive and decode signals transmitted/broadcasted by a positioning network of a global navigation satellite system (GNSS).
  • Examples of navigation satellite constellations (or GNSS) include United States’ Global Positioning System (GPS), Russia’s Global Navigation System (GLONASS), the European Union’s Galileo system, China’s BeiDou Navigation Satellite System, a regional navigation system or GNSS augmentation system (e.g., Navigation with Indian Constellation (NAVIC), Japan’s Quasi-Zenith Satellite System (QZSS), France’s Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS), etc.), or the like.
  • the positioning circuitry 1145 comprises various hardware elements (e.g., including hardware devices such as switches, filters, amplifiers, antenna elements, and the like to facilitate OTA communications) to communicate with components of a positioning network, such as navigation satellite constellation nodes.
  • the positioning circuitry 1145 may include a Micro-Technology for Positioning, Navigation, and Timing (Micro-PNT) IC that uses a master timing clock to perform position tracking/estimation without GNSS assistance.
  • the positioning circuitry 1145 may also be part of, or interact with, the communication circuitry 1109 to communicate with the nodes and components of the positioning network.
  • the positioning circuitry 1145 may also provide position data and/or time data to the application circuitry, which may use the data to synchronize operations with various infrastructure (e.g., radio base stations), for turn-by-turn navigation, or the like. Additionally or alternatively, the positioning circuitry 1145 may be incorporated in, or work in conjunction with, the communication circuitry to determine the position or location of the vehicle 10 by, for example, implementing the LTE Positioning Protocol (LPP), Wi-Fi positioning system (WiPS or WPS) methods, triangulation, signal strength calculations, and/or some other suitable localization technique(s).
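  • As an illustrative (non-limiting) sketch of the kind of decoded fix the positioning circuitry 1145 might hand to the application circuitry, the following example parses a single NMEA 0183 “GGA” sentence using only the standard library; the example sentence and coordinates are fabricated for illustration.

```python
# Sketch of decoding a GNSS fix such as positioning circuitry 1145 might
# provide. Parses one NMEA 0183 GGA sentence; the sample sentence below is
# fabricated for illustration only.

def _dm_to_decimal(dm: str, hemisphere: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm notation to signed decimal degrees."""
    value = float(dm)
    degrees = int(value // 100)
    minutes = value - degrees * 100
    decimal = degrees + minutes / 60.0
    return -decimal if hemisphere in ("S", "W") else decimal

def parse_gga(sentence: str) -> dict:
    """Extract UTC time, latitude, longitude, and satellite count from $GxGGA."""
    body = sentence.split("*")[0].lstrip("$")   # drop checksum and leading '$'
    fields = body.split(",")
    return {
        "utc": fields[1],
        "lat": _dm_to_decimal(fields[2], fields[3]),
        "lon": _dm_to_decimal(fields[4], fields[5]),
        "num_satellites": int(fields[7]),
    }

# Example (fabricated fix):
# parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
# -> {'utc': '123519', 'lat': 48.1173, 'lon': 11.5166..., 'num_satellites': 8}
```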
  • Individual ECUs 1123 may be embedded systems or other like computer devices that control a corresponding system of the vehicle 10.
  • individual ECUs 1123 may each have the same or similar components as the system 1100, such as a microcontroller or other like processor device, memory device(s), communications interfaces, and the like.
  • the ECUs 1123 may include, inter alia, a Drivetrain Control Unit (DCU), an Engine Control Unit (ECU), an Engine Control Module (ECM), EEMS, a Powertrain Control Module (PCM), a Transmission Control Module (TCM), a Brake Control Module (BCM) including an anti-lock brake system (ABS) module and/or an electronic stability control (ESC) system, a Central Control Module (CCM), a Central Timing Module (CTM), a General Electronic Module (GEM), a Body Control Module (BCM), a Suspension Control Module (SCM), a Door Control Unit (DCU), a Speed Control Unit (SCU), a Human-Machine Interface (HMI) unit, a Telematic Control Unit (TCU), a Battery Management System (which may be the same or similar as battery monitor 1126), and/or any other entity or node in a vehicle system.
  • one or more of the ECUs 1123 and/or the system 1100 may be part of, or included in, a vehicle system.
  • the actuators 1122 are devices that allow system 1100 to change a state, position, orientation, move, and/or control a mechanism or system in the vehicle 10.
  • the actuators 1122 comprise electrical and/or mechanical devices for moving or controlling a mechanism or system, and convert energy (e.g., electric current or moving air and/or liquid) into some kind of motion.
  • the actuators 1122 may include one or more electronic (or electrochemical) devices, such as piezoelectric bimorphs, solid state actuators, solid state relays (SSRs), shape-memory alloy-based actuators, electroactive polymer-based actuators, relay driver integrated circuits (ICs), and/or the like.
  • the actuators 1122 may include one or more electromechanical devices such as pneumatic actuators, hydraulic actuators, electromechanical switches including electromechanical relays (EMRs), motors (e.g., linear motors, DC motors, brushless motors, stepper motors, servomechanisms, ultrasonic piezo motors with optional position feedback, screw-type motors, etc.), mechanical gears, magnetic switches, valve actuators, fuel injectors, ignition coils, wheels, thrusters, propellers, claws, clamps, hooks, an audible sound generator, and/or other like electromechanical components.
  • the system 1100 may be configurable to operate one or more actuators 1122 based on one or more captured events and/or instructions or control signals received from various ECUs 1123 or system 1100.
  • the system 1100 may transmit instructions to various actuators 1122 (or controllers that control one or more actuators 1122) to reconfigure an electrical network as discussed herein.
  • the processor circuitry 1102 and/or the ECUs 1123 are configurable to operate one or more actuators 1122 by transmitting/sending instructions or control signals to one or more actuators 1122 based on detected events.
  • Individual ECUs 1123 may be capable of reading or otherwise obtaining sensor data from the sensor circuitry 1120, processing the sensor data to generate control system data, and providing the control system data to the system 1100 for processing.
  • the control system information may be a type of state information discussed previously.
  • an ECU 1123 may provide engine revolutions per minute (RPM) of an engine of the vehicle 10, fuel injector activation timing data of one or more cylinders and/or one or more injectors of the engine, ignition spark timing data of the one or more cylinders (e.g., an indication of spark events relative to crank angle of the one or more cylinders), transmission gear ratio data and/or transmission state data (which may be supplied to the ECU 1123 by the TCU), real-time calculated engine load values from the ECM, etc.; a TCU may provide transmission gear ratio data, transmission state data, etc.; and the like.
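  • As a worked, non-limiting illustration of one item of control system data mentioned above (engine RPM), the following sketch decodes RPM from a standard OBD-II (SAE J1979) Mode 01 / PID 0x0C response using the well-known scaling ((A·256)+B)/4; the response bytes are fabricated for illustration.

```python
# Sketch of decoding engine RPM from an OBD-II (SAE J1979) Mode 01, PID 0x0C
# response, as one example of data an ECU 1123 could supply. The response
# bytes below are fabricated for illustration.

def decode_rpm(response: bytes) -> float:
    """Decode RPM from a '41 0C A B' response: RPM = ((A*256)+B)/4."""
    if len(response) < 4 or response[0] != 0x41 or response[1] != 0x0C:
        raise ValueError("not a Mode 01 / PID 0x0C response")
    a, b = response[2], response[3]
    return ((a << 8) + b) / 4.0

# Example: bytes 41 0C 1A F8 -> ((0x1A*256)+0xF8)/4 = 1726.0 RPM
assert decode_rpm(bytes([0x41, 0x0C, 0x1A, 0xF8])) == 1726.0
```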
  • the I/O devices 1186 may be present within, or connected to, the system 1100.
  • the I/O devices 1186 include input devices and output devices including one or more user interfaces designed to enable user interaction with the system 1100 and/or peripheral component interaction with the system 1100 via peripheral component interfaces.
  • the input devices include any physical or virtual means for accepting an input including, inter alia, one or more physical or virtual buttons (e.g., a reset button), a physical keyboard, keypad, mouse, touchpad, touchscreen, microphones, scanner, headset, and/or the like.
  • user input may comprise voice commands, control input (e.g., via buttons, knobs, switches, etc.), an interface with a smartphone, or any combination thereof.
  • the output devices are used to show or convey information, such as sensor readings, actuator position(s), or other like information. Data and/or graphics may be displayed on one or more user interface components of the output devices.
  • the output devices may include any number and/or combinations of audio or visual display, including, inter alia, one or more simple visual outputs/indicators (e.g., binary status indicators such as light emitting diodes (LEDs)) and multi-character visual outputs, or more complex outputs such as display devices or touchscreens (e.g., Liquid Crystal Displays (LCDs), LED displays, quantum dot displays, projectors, Head-Up Display (HUD) devices, etc.), with the output of characters, graphics, multimedia objects, and the like being generated or produced from the operation of the system 1100.
  • the output devices may also include speakers or other audio emitting devices, printer(s), and/or the like.
  • the sensor circuitry 1120 may be used as an input device (e.g., an image capture device, motion capture device, or the like) and one or more actuators 1122 may be used as an output device (e.g., an actuator to provide haptic feedback or the like).
  • the output devices include the HUD system 1000.
  • the HUD system 1000 is also included in the vehicle 1005.
  • the HUD system 1000 comprises one or more PGUs 1130 (e.g., PGU 101 of Figure 1) and one or more optical elements (e.g., corrector 102 (or components thereof) and/or combiner 103 of Figure 1) where at least one of the optical elements is a display element (e.g., combiner 103 of Figure 1).
  • the PGU 1130 includes a projection unit (or “projector”) and a computer device.
  • the computer device comprises one or more electronic elements that create/generate digital content to be displayed by the projection unit.
  • the computer device may be the processor circuitry 1102, HUD processor 1010, OBU 1020, and/or a similar processing device as discussed herein.
  • the digital content may be any type of content stored by the storage circuitry 1108, streamed from a remote system/service (e.g., one or more vehicles, one or more roadside units (RSUs), app server(s), cloud computing service, edge network/computing service, etc.) via the network 1117 and/or the communication circuitry 1109, and/or based on outputs from various sensors 1120, ECUs 1123, and/or actuators 1122.
  • the content to be displayed may include, for example, safety messages (e.g., collision warnings, emergency warnings, pre-crash warnings, traffic warnings, and the like), Short Message Service (SMS)/Multimedia Messaging Service (MMS) messages, navigation system information (e.g., maps, turn-by-turn indicator arrows), movies, television shows, video game images, and the like.
  • the projector may be the same or similar as the projection device 1030 discussed previously with respect to Figure 10, and in some implementations, the projection unit includes the imaging matrix 1050 discussed previously with respect to Figure 10.
  • the projection unit (or “projector”) is a device or system that projects still or moving images onto the surface(s) of a display surface such as display device 1060 and/or virtual image planes 1070, and/or the surface(s) of an HOE 1131.
  • the projected light may be projected onto the surface(s) via one or more reflection surfaces (e.g., mirrors) based on signals received from the computer device.
  • the projection unit includes a light generator (or light source) to generate light based on the digital content, which is focused or (re)directed to one or more HOEs (e.g., display surface(s)).
  • the projection unit also includes various electronic elements (or an electronic system) that convert digital content or signals obtained from the computer device into signals for controlling the light source to generate/output light of different colors and intensities.
  • the projector is a light emitting diode (LED) projector, a laser diode projector, a liquid crystal display (LCD) projector and/or an LCD with laser illumination, a digital light processing (DLP) projector, a digital micro-mirror device (DMD), a liquid crystal on silicon (LCoS) matrix/projector, micromirror(s), a microelectromechanical (MEMS) and/or microoptoelectromechanical system (MOEMS) scanner, MEMS and/or MOEMS laser scanner, MEMS and/or MOEMS mirror(s), and/or any other suitable projection device, including those discussed elsewhere herein.
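  • As a simplified, non-limiting sketch of the conversion of digital content into light-source control signals described above, the following example maps one RGB frame to 8-bit per-channel drive levels; the gamma value, frame shape, and 8-bit quantization are assumptions made only for this illustration.

```python
# Illustrative sketch: converting one RGB frame of digital content into
# per-channel drive levels of the kind the PGU 1130 electronics might send to
# its light source. The gamma value and 8-bit quantization are assumptions.
import numpy as np

def frame_to_drive_levels(frame: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Map an (H, W, 3) float frame in [0, 1] to 8-bit drive levels per channel."""
    linear = np.clip(frame, 0.0, 1.0) ** gamma       # approximate gamma -> linear light
    return np.round(linear * 255).astype(np.uint8)   # quantize to 8-bit intensities

# Example: a 2x2 test frame with one full-red pixel and one mid-grey pixel.
test = np.zeros((2, 2, 3))
test[0, 0] = [1.0, 0.0, 0.0]
test[1, 1] = [0.5, 0.5, 0.5]
levels = frame_to_drive_levels(test)   # shape (2, 2, 3), dtype uint8
```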
  • the PGU 1130 may also comprise scanning mirrors that copy the image pixel-by-pixel and then project the image for display.
  • the PGU 1130 performs scanning using one dual-axis mirror or two single-axis mirrors (sometimes referred to as “tip-tilt”), and/or by the use of some other matrix-type amplitude or phase modulation.
  • scanning may be performed using an array of semiconductor lasers.
  • Various drive forces and driving principles may be used to operate such micromirror and/or MEMS/MOEMS devices, such as electromagnetic, electrostatic, thermo-electric, and/or piezo-electric effects.
  • the one or more optical elements of the HUD system 1000 include optical components that manipulate light such as lenses, filters, prisms, mirrors, beam splitters, diffraction gratings, multivariate optical elements (MOEs), and/or the like, including the various optical elements discussed previously with respect to corrector 102 of Figure 1. Additionally or alternatively, the one or more optical elements include a collimator (e.g., one or more lenses, apertures, curved mirrors, etc.) to narrow the beam of projected light from the projector.
  • the one or more optical elements include a relay lens assembly and another combiner element (which is different than the combiner used for displaying the projected image).
  • the relay lens assembly comprises one or more relay lenses, which re-image images from the projector into an intermediate image that then reaches the HOE 1131 in or on the display element through a reflector.
  • the one or more optical elements include a holographic optical element (HOE) 1131.
  • the HOE 1131 is an optical element that is used to produce holographic images.
  • the at least one optical element that is the display element is the combiner 103 of Figure 1.
  • the combiner 103 combines the generated/projected light with external (e.g., natural) light and/or combines different light paths into one light path to define a palette of colors.
  • the combiner 103 may be a beam splitter or semi-transparent display surface located directly in front of a viewer (e.g., operator of vehicle 1005) that redirects the projected image from the projector in such a way as to see the FoV and the projected image at the same time.
  • In addition to reflecting the projected light from the projection unit, the combiner 103 also allows other wavelengths of light to pass through the combiner 103. In this way, the combiner 103 (as well as the HOE 1131) mixes the digital images output by the projector with the viewed real world to facilitate augmented reality.
  • the combiner (e.g., combiner 103) may have a flat surface or a curved surface (e.g., concave or convex) to aid in focusing the projected image.
  • the HOE 1131 may be a transmissive optical element, where the transmitted beam (reference beam) hits the HOE 1131 and the diffracted beam(s) go through the HOE 1131.
  • the HOE 1131 may be a reflective optical element, where the transmitted beam (reference beam) hits the HOE 1131 and the diffracted beam(s) reflects off of the HOE 1131 (e.g., the reference beam and diffracted beams are on the same side of the HOE 1131).
  • the combiner 103 may be formed or made of a suitable material and includes an HOE 1131 that enables the combiner 103 to reflect the projected light while allowing external (natural) light to pass through the combiner 103.
  • the combiner 103 may be a windshield or windscreen of the vehicle 1005, a separate semi-reflective surface mounted to a dashboard of the vehicle 1005, a switchable projection screen that switches between high contrast mode (e.g., a frosted or matte) and a transparent (e.g., holographic) mode, an HMD, helmet-mounted display, and/or the like.
  • the HOE 1131 is disposed on an inner part of the combiner 103.
  • the combiner 103 may be formed or made of a suitable material with a holographic film that enables the combiner 103 to reflect the projected light while allowing external (natural) light to pass through the combiner 103.
  • the HOE 1131 is the holographic film.
  • the holographic film may cover an entirety of the combiner 103 or a selected portion of the combiner 103.
  • the holographic film is a film composite including one or more substrate films, one or more photopolymer films, one or more protective films, and/or other films/substrates and/or combinations thereof. These films may be arranged in any desired arrangement or configuration.
  • the HOE 1131 is disposed inside the combiner 103.
  • the combiner 103 may be formed or made of one or more suitable materials that surround the HOE 1131 using a duplex production process to form an A-B duplex (where A and B are materials) or a triplex production process to form an A-B-A triplex or an A-B-C triplex (where A, B, and C are materials).
  • the materials or material composites (including materials A, B, and C) of the HOE 1131 are based on polycarbonate (PC), polyethylene terephthalate (PET), polybutylene terephthalate, polyethylene, polypropylene, cellulose acetate, cellulose hydrate, cellulose nitrate, cyclo olefin polymers (also referred to as cyclic olefin polymers), polystyrene, polyepoxides, polysulphone, cellulose triacetate (CTA), polyamide, polymethyl methacrylate, polyvinyl chloride, polyvinyl butyral (PVB), polydicyclopentadiene, thermoplastic polyurethane (TPU), and/or combinations thereof.
  • composites of the aforementioned materials may be formed as film laminates, coextrudates, and/or transparent films (e.g., with little or no haze as described in ASTM International, “Standard Test Method for Haze and Luminous Transmittance of Transparent Plastics”, ASTM D1003, West Conshohocken, PA (2021) (“ASTM D1003-21”)).
  • the suitable material of the combiner 103 may be formed from one or more of glass, plastic(s), polymer(s), and/or other similar material including any one or more of the aforementioned HOE 1131 materials.
  • the battery 1124a and/or power block 1124b may power the system 1100.
  • the battery 1124a may be a typical lead-acid automotive battery, although in some embodiments, such as when vehicle 1005 is a hybrid vehicle, the battery 1124a may be a lithium ion battery, a metal-air battery, such as a zinc-air battery, an aluminum-air battery, a lithium-air battery, a lithium polymer battery, and the like.
  • the battery monitor 1126 may be included in the system 1100 to track/monitor various parameters of the battery 1124a, such as its state of charge (SoCh), state of health (SoH), and state of function (SoF).
  • the battery monitor 1126 may include a battery monitoring IC, which may communicate battery information to the processor circuitry 1102 over the IX 1106.
  • I/O devices such as a display, a touchscreen, or keypad may be connected to the system 1100 via IX 1106 to accept input and display outputs.
  • GNSS and/or GPS circuitry and associated applications may be included in or connected with system 1100 to determine a geolocation of the vehicle 1005.
  • the communication circuitry 1109 may include a Universal Integrated Circuit Card (UICC), embedded UICC (eUICC), and/or other elements/components that may be used to communicate over one or more wireless networks 1117.
  • Example A01 includes a correction optics subassembly of a holographic head-up display (hHUD) system, comprising: one or more optical elements disposed between a picture generation unit (PGU) of the hHUD system and a display element of the hHUD system, wherein a combination of surfaces of the one or more optical elements includes at least one reflective surface and at least two refractive surfaces.
  • Example A02 includes the correction optics subassembly of example A01 and/or some other example(s) herein, wherein a first refractive surface of the at least two refractive surfaces is a surface of a first optical element of the one or more optical elements, a second refractive surface of the at least two refractive surfaces is a surface of a second optical element of the one or more optical elements, and the at least one reflective surface is a surface of a third optical element of the one or more optical elements, wherein the first, second, and third optical elements are different from one another.
  • Example A03 includes the correction optics subassembly of example A02 and/or some other example(s) herein, wherein the third optical element is disposed between the first and second optical elements.
  • Example A04 includes the correction optics subassembly of examples A02-A03 and/or some other example(s) herein, wherein the first optical element is configured to guide light generated by the PGU to the third optical element, the third optical element is configured to reflect the guided light from the first optical element to the second optical element, and the second optical element is configured to guide the reflected light to the display element.
  • Example A05 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a shape substantially having rotational symmetry.
  • Example A06 includes the correction optics subassembly of example A05 and/or some other example(s) herein, wherein the hHUD system is an on-axis optical system.
  • Example A07 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a substantially aspherical shape.
  • Example A08 includes the correction optics subassembly of example A07 and/or some other example(s) herein, wherein the hHUD system is an off-axis optical system.
  • Example A09 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a substantially prismatic shape.
  • Example A10 includes the correction optics subassembly of example A09 and/or some other example(s) herein, wherein the hHUD system is an on-axis optical system or an off-axis optical system.
  • Example A11 includes the correction optics subassembly of examples A01-A10 and/or some other example(s) herein, wherein two refractive surfaces of the at least two refractive surfaces and one reflective surface of the at least one reflective surface are part of a single optical element of the one or more optical elements.
  • Example A12 includes the correction optics subassembly of example A11 and/or some other example(s) herein, wherein a first refractive surface of the two refractive surfaces is configured to guide light generated by the PGU to the one reflective surface, the one reflective surface is configured to reflect the guided light from the first refractive surface to a second refractive surface of the two refractive surfaces, and the second refractive surface is configured to guide the reflected light to the display element.
  • Example A13 includes the correction optics subassembly of example A12 and/or some other example(s) herein, wherein a shape of the second optical element is a freeform shape without rotational symmetry.
  • Example A14 includes the correction optics subassembly of examples A02-A13 and/or some other example(s) herein, wherein the second refractive surface is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
  • Example A15 includes the correction optics subassembly of examples A02-A14 and/or some other example(s) herein, wherein the first refractive surface is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
  • Example A16 includes the correction optics subassembly of examples A14-A15 and/or some other example(s) herein, wherein the freeform surface is formed based on a function selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial, hybrid stitched representations based on a combination of two or more functions selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial.
  • Example A17 includes the correction optics subassembly of examples A01-A16 and/or some other example(s) herein, wherein the one or more optical elements are arranged with respect to one another and with respect to the PGU and the display element to correct aberrations in images created by projected light from the PGU.
  • Example A18 includes the correction optics subassembly of examples A01-A17 and/or some other example(s) herein, wherein each optical element of the one or more optical elements is an optical element selected from a group consisting of a lens, prism, prismatic lens, mirror, and holographic optical element.
  • Example A19 includes the correction optics subassembly of examples A01-A18 and/or some other example(s) herein, wherein each optical element of the one or more optical elements is formed into a three-dimensional shape selected from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid, or a combination of any two or more shapes from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid.
  • Example A20 includes the correction optics subassembly of examples A01-A19 and/or some other example(s) herein, wherein the one or more optical elements includes a scattering surface on which light representative of a virtual image is projected by the PGU.
  • Example A21 includes the correction optics subassembly of example A20 and/or some other example(s) herein, wherein the scattering surface comprises a diffusion screen, a diffuser plate, or an array of microlenses.
  • Example A22 includes the correction optics subassembly of examples A01-A21 and/or some other example(s) herein, wherein the display element comprises a holographic optical element disposed on or inside a windshield of a vehicle.
  • Example A23 includes a holographic head-up display (hHUD) system, comprising: a picture generation unit (PGU); a display element; and the correction optics assembly of any one of examples A01-A22 and/or some other example(s) herein disposed between the PGU and the display element.
  • Example B01 includes a compact holographic head-up display (hHUD) device, comprising: a picture generation unit (PGU); a combiner comprising a holographic optical element (HOE) with an optical power between 1.1 and 6.6 diopters; and a correction optics assembly, disposed between the PGU and the combiner, the correction optics assembly comprising at least one optical element with at least two refractive surfaces.
  • Example B02 includes the compact hHUD device of example B01 and/or some other example(s) herein, wherein the at least one optical element further comprises at least one reflective surface disposed between the at least two refractive surfaces.
  • Example B03a includes the compact hHUD device of example B02 and/or some other example(s) herein, further comprising: a refractive surface of the at least two refractive surfaces is positioned and/or oriented between the HOE and the at least one reflective surface; and another refractive surface of the at least two refractive surfaces is positioned and/or oriented between the PGU and the at least one reflective surface.
  • Example B03b includes the compact hHUD device of examples B02-B03a and/or some other example(s) herein, further comprising: at least one refractive optical element disposed between the HOE and the at least one reflective surface; and at least one refractive optical element disposed between the PGU and the at least one reflective surface.
  • Example B04 includes the compact hHUD device of examples B03a-B03b and/or some other example(s) herein, wherein a form of the at least one optical element is a form substantially having rotational symmetry.
  • Example B05 includes the compact hHUD device of examples B03a-B04 and/or some other example(s) herein, wherein the compact hHUD device is an on-axis optical system.
  • Example B06 includes the compact hHUD device of examples B03a-B03b and/or some other example(s) herein, wherein a shape of a surface of the at least one optical element is a substantially aspherical shape.
  • Example B07 includes the compact hHUD device of examples B03a-B03b, B06, and/or some other example(s) herein, wherein the compact hHUD device is an off-axis optical system.
  • Example B08 includes the compact hHUD device of examples B03a-B03b and/or some other example(s) herein, wherein a form of the at least one optical element is a substantially prismatic form.
  • Example B09 includes the compact hHUD device of examples B03a-B03b, B08, and/or some other example(s) herein, wherein the compact hHUD device is an on-axis optical system or an off-axis optical system.
  • Example B10 includes the compact hHUD device of example B02 and/or some other example(s) herein, wherein two refractive surfaces of the at least two refractive surfaces and one reflective surface of the at least one reflective surface are part of a single optical element of the at least one optical element.
  • Example B11 includes the compact hHUD device of example B02 and/or some other example(s) herein, wherein a shape of a surface of the at least one optical element is a freeform shape without rotational symmetry.
  • Example B12 includes the compact hHUD device of examples B02-B11 and/or some other example(s) herein, wherein at least one refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
  • Example B13 includes the compact hHUD device of example B12 and/or some other example(s) herein, wherein at least one other refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
  • Example B14 includes the compact hHUD device of examples B12-B13 and/or some other example(s) herein, wherein the freeform surface is formed based on a function selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial, hybrid stitched representations based on a combination of two or more functions selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial.
  • Example B15 includes the compact hHUD device of examples B02-B14 and/or some other example(s) herein, wherein the correction optics assembly comprises a plurality of optical elements, the plurality of optical elements including the at least one optical element.
  • Example B16 includes the compact hHUD device of example B15 and/or some other example(s) herein, wherein the plurality of optical elements are arranged with respect to one another and with respect to the PGU and the combiner to correct aberrations in images created by projected light from the PGU.
  • Example B17 includes the compact hHUD device of examples B02-B16 and/or some other example(s) herein, wherein the at least one optical element comprises one or more of a lens, prism, prismatic lens, mirror, and a holographic optical element.
  • Example B18 includes the compact hHUD device of examples B02-B17 and/or some other example(s) herein, wherein the at least one optical element is formed into a three-dimensional shape selected from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid, or a combination of any two or more shapes from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid.
  • Example B19 includes the compact hHUD device of examples B02-B18 and/or some other example(s) herein, wherein the correction optics assembly further comprises a scattering surface onto which light representative of a virtual image is projected by the PGU.
  • Example B20 includes the compact hHUD device of example B19 and/or some other example(s) herein, wherein the scattering surface comprises a diffusion screen, a diffuser plate, or an array of microlenses.
  • Example C01 includes a vehicle comprising: one or more control components for controlling operation of the vehicle; a windshield; and the holographic head-up display (hHUD) system of example A23 and/or the hHUD device of any one or more of examples B01-B20.
  • the phrase “A or B” means (A), (B), or (A and B).
  • the phrases “A/B” and “A or B” mean (A), (B), or (A and B), similar to the phrase “A and/or B.”
  • the phrase “at least one of A and B” means (A), (B), or (A and B).
  • the description may use the terms “embodiment” or “embodiments,” which may each refer to one or more of the same or different embodiments.
  • the terms “comprising,” “including,” “having,” and the like, as used with respect to one or more embodiments, are synonymous, are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.), and specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
  • Coupled may mean two or more elements are in direct physical or electrical contact with one another, may mean that two or more elements indirectly contact each other but still cooperate or interact with each other, and/or may mean that one or more other elements are coupled or connected between the elements that are said to be coupled with each other.
  • directly coupled may mean that two or more elements are in direct contact with one another.
  • communicatively coupled may mean that two or more elements may be in contact with one another by a means of communication including through a wire or other interconnect connection, through a wireless communication channel or link, and/or the like.
  • Fabrication refers to the creation of a metal structure using fabrication means.
  • the term “fabrication means” as used herein refers to any suitable tool or machine that is used during a fabrication process and may involve tools or machines for cutting (e.g., using manual or powered saws, shears, chisels, routers, torches including handheld torches such as oxy-fuel torches or plasma torches, and/or computer numerical control (CNC) cutters including lasers, mill bits, torches, water jets, routers, etc.), bending (e.g., manual, powered, or CNC hammers, pan brakes, press brakes, tube benders, roll benders, specialized machine presses, etc.), assembling (e.g., by welding, soldering, brazing, crimping, coupling with adhesives, riveting, using fasteners, etc.), molding or casting (e.g., die casting, centrifugal casting, injection molding, extrusion molding, matrix molding, three-dimensional (3D) printing, etc.).
  • fastener refers to a device that mechanically joins or affixes two or more objects together, and may include threaded fasteners (e.g., bolts, screws, nuts, threaded rods, etc.), pins, linchpins, r-clips, clips, pegs, clamps, dowels, cam locks, latches, catches, ties, hooks, magnets, molded or assembled joineries, and/or the like.
  • lateral refers to directions or positions relative to an object spanning the width of a body of the object, relating to the sides of the object, and/or moving in a sideways direction with respect to the object.
  • longitudinal refers to directions or positions relative to an object spanning the length of a body of the object; relating to the top or bottom of the object, and/or moving in an upwards and/or downwards direction with respect to the object.
  • linear refers to directions or positions relative to an object following a straight line with respect to the object, and/or refers to a movement or force that occurs in a straight line rather than in a curve.
  • linear refers to directions or positions relative to an object following along a given path with respect to the object, wherein the shape of the path is straight or not straight.
  • the terms “flexible,” “flexibility,” and/or “pliability” refer to the ability of an object or material to bend or deform in response to an applied force; the term “flexible” is complementary to “stiffness.”
  • the term “stiffness” and/or “rigidity” refers to the ability of an object to resist deformation in response to an applied force.
  • elasticity refers to the ability of an object or material to resist a distorting influence or stress and to return to its original size and shape when the stress is removed. Elastic modulus (a measure of elasticity) is a property of a material, whereas flexibility or stiffness is a property of a structure or component of a structure and is dependent upon various physical dimensions that describe that structure or component.
  • the term “wear” refers to the phenomenon of the gradual removal, damaging, and/or displacement of material at solid surfaces due to mechanical processes (e.g., erosion) and/or chemical processes (e.g., corrosion). Wear causes functional surfaces to degrade, eventually leading to material failure or loss of functionality.
  • the term “wear” as used herein may also include other processes such as fatigue (e.g., the weakening of a material caused by cyclic loading that results in progressive and localized structural damage and the growth of cracks) and creep (e.g., the tendency of a solid material to move slowly or deform permanently under the influence of persistent mechanical stresses).
  • Mechanical wear may occur as a result of relative motion occurring between two contact surfaces. Wear that occurs in machinery components has the potential to cause degradation of the functional surface and ultimately loss of functionality.
  • Various factors, such as the type of loading, type of motion, temperature, lubrication, and the like may affect the rate of wear.
  • circuitry refers to a circuit or system of multiple circuits configurable to perform a particular function in an electronic device.
  • the circuit or system of circuits may be part of, or include one or more hardware components, such as a logic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA), programmable logic device (PLD), System-on-Chip (SoC), System-in-Package (SiP), Multi-Chip Package (MCP), digital signal processor (DSP), etc., that are configurable to provide the described functionality.
  • circuitry may also refer to a combination of one or more hardware elements with the program code used to carry out the functionality of that program code. Some types of circuitry may execute one or more software or firmware programs to provide at least some of the described functionality. Such a combination of hardware elements and program code may be referred to as a particular type of circuitry.
  • the term “element” may refer to a unit that is indivisible at a given level of abstraction and has a clearly defined boundary, wherein an element may be any type of entity.
  • entity may refer to (1) a distinct component of an architecture or device, or (2) information transferred as a payload.
  • device may refer to a physical entity embedded inside, or attached to, another physical entity in its vicinity, with capabilities to convey digital information from or to that physical entity.
  • controller may refer to an element or entity that has the capability to affect a physical entity, such as by changing its state or causing the physical entity to move.
  • the term “computer device” may describe any physical hardware device capable of sequentially and automatically carrying out a sequence of arithmetic or logical operations, equipped to record/store data on a machine readable medium, and transmit and receive data from one or more other devices in a communications network.
  • a computer device may be considered synonymous to, and may hereafter be occasionally referred to, as a computer, computing platform, computing device, etc.
  • the term “computer system” may include any type of interconnected electronic devices, computer devices, or components thereof. Additionally, the term “computer system” and/or “system” may refer to various components of a computer that are communicatively coupled with one another.
  • computer system and/or “system” may refer to multiple computer devices and/or multiple computing systems that are communicatively coupled with one another and configurable to share computing and/or networking resources.
  • Examples of “computer devices,” “computer systems,” etc. may include cellular phones or smart phones, feature phones, tablet personal computers, wearable computing devices, autonomous sensors, laptop computers, desktop personal computers, video game consoles, digital media players, handheld messaging devices, personal data assistants, electronic book readers, augmented reality devices, server computer devices (e.g., stand-alone, rack-mounted, blade, etc.), cloud computing services/systems, network elements, in-vehicle infotainment (IVI), in-car entertainment (ICE) devices, an Instrument Cluster (IC), head-up display (HUD) devices, onboard diagnostic (OBD) devices, dashtop mobile equipment (DME), mobile data terminals (MDTs), Electronic Engine Management System (EEMS), electronic/engine control units (ECUs), electronic/engine control modules (ECMs), embedded systems, microcontrollers, and/or the like.
  • network element refers to a physical computing device of a wired or wireless communication network and may be configurable to host a virtual machine.
  • network element may describe equipment that provides radio baseband functions for data and/or voice connectivity between a network and one or more users.
  • the term “network element” may be considered synonymous to and/or referred to as a networked computer, networking hardware, network equipment, network appliance, router, switch, hub, bridge, radio network controller, firewall, radio access network (RAN) node, base station, gateway, server, and/or any other like device.
  • the term “network element” may be considered synonymous to and/or referred to as a “base station.”
  • the term “base station” may be considered synonymous to and/or referred to as a node B, an enhanced or evolved node B (eNB), next generation nodeB (gNB), base transceiver station (BTS), access point (AP), roadside unit (RSU), etc., and may describe equipment that provides the radio baseband functions for data and/or voice connectivity between a network and one or more users.
  • the terms “vehicle-to-vehicle” and “V2V” may refer to any communication involving a vehicle as a source or destination of a message.
  • Related terms include “vehicle-to-infrastructure” (V2I), “vehicle-to-network” (V2N), “vehicle-to-pedestrian” (V2P), and “vehicle-to-everything” (V2X) communications, which refer to communications involving infrastructure, a network, a pedestrian, or any other entity, respectively.
  • channel refers to any transmission medium, either tangible or intangible, which is used to communicate data or a data stream.
  • the term “channel” may be synonymous with and/or equivalent to “communications channel,” “data communications channel,” “transmission channel,” “data transmission channel,” “access channel,” “data access channel,” “link,” “data link,” “carrier,” “radiofrequency carrier,” and/or any other like term denoting a pathway or medium through which data is communicated.
  • the term “link” may refer to a connection between two devices through a Radio Access Technology (RAT) for the purpose of transmitting and receiving information.
  • optical element refers to any component, object, substance, and/or material used for, or otherwise related to the genesis and propagation of light, the changes that light undergoes and produces, and/or other phenomena associated with the principles that govern the image-forming properties of various devices that make use of light and/or the nature and properties of light itself.
  • lens refers to a transparent substance or material (usually glass) that is used to form an image of an object by focusing rays of light from the object.
  • a lens is usually circular in shape, with two polished surfaces, either or both of which is/are curved and may be either convex (bulging) or concave (depressed). The curves are almost always spherical; i.e., the radius of curvature is constant.
  • mirror refers to a surface of a material or substance that diverts a ray of light according to the law of reflection.
  • the term “prism” refers to a transparent optical element with flat, polished surface(s) that refract light. Additionally or alternatively, the term “prism” refers to a polyhedron comprising an n-sided polygon base, a second base that is a translated copy (rigidly moved without rotation) of the first base, and n other faces joining corresponding sides of the two bases.
  • holographic optical element refers to an optical component (e.g., mirrors, lenses, filters, beam splitters, directional diffusers, diffraction gratings, etc.) that produces holographic images using holographic imaging processes or principles, such as the principles of diffraction.
  • the shape and structure of an HOE is dependent on the piece of hardware it is needed for, and the coupled wave theory is a common tool used to calculate the diffraction efficiency or grating volume that helps with the design of an HOE.
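  • As a commonly cited (and here simplified) illustration of that coupled wave analysis, Kogelnik’s lossless, on-Bragg results give the diffraction efficiency η of a volume phase hologram of thickness d, refractive-index modulation n₁, replay wavelength λ, and internal incidence angle θ approximately as

\[
\nu = \frac{\pi\, n_{1}\, d}{\lambda \cos\theta}, \qquad
\eta_{\text{transmission}} = \sin^{2}\nu, \qquad
\eta_{\text{reflection}} = \tanh^{2}\nu ,
\]

  where the reflection-hologram expression assumes a symmetric geometry; these formulas are background illustrations and are not taken from the original disclosure.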
  • the term “focus” refers to a point where light rays originating from a point on the object converge. Sometimes, the term “focus” may be referred to as an “image point”.
  • the term “optical power” refers to the degree to which an optical element or optical system converges or diverges light. The optical power of an optical element is equal to the reciprocal of the focal length of the device. High optical power corresponds to short focal length.
  • the SI unit for optical power is the inverse meter (m⁻¹), which is commonly referred to as a Diopter (or “Dioptre”).
  • the term “optical power” is sometimes referred to as dioptric power, refractive power, focusing power, or convergence power.
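  • As a worked illustration (not part of the original disclosure), optical power P and focal length f (in metres) are related by

\[
P = \frac{1}{f}\ \big[\mathrm{m}^{-1},\ \text{i.e., diopters}\big], \qquad
f(1.1\ \mathrm{D}) = \tfrac{1}{1.1} \approx 0.91\ \mathrm{m}, \qquad
f(6.6\ \mathrm{D}) = \tfrac{1}{6.6} \approx 0.15\ \mathrm{m},
\]

  so the 1.1 to 6.6 diopter HOE power range cited in example B01 corresponds approximately to focal lengths between about 0.91 m and 0.15 m.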
  • the term “vergence” refers to the angle formed by rays of light that are not perfectly parallel to one another. Additionally or alternatively, the term “vergence” refers to the curvature of optical wavefronts.
  • the terms “convergence”, “convergent”, and “converging” refer to light rays that move closer to the optical axis as they propagate. Additionally or alternatively, the terms “convergence”, “convergent”, and “converging” refer to wavefronts propagating toward a single point and/or wavefronts that yield a positive vergence.
  • the terms “divergence”, “divergent”, and “diverging” refer to light rays that move away from the optical axis as they propagate.
  • the terms “divergence”, “divergent”, and “diverging” refer to wavefronts propagating away from a single source point and/or wavefronts that yield a negative vergence.
  • convex lenses and concave mirrors cause parallel rays to converge
  • concave lenses and convex mirrors cause parallel rays to diverge.
  • wavefront refers to a set (locus) of all points where a wave has the same phase and/or a surface or medium over which an optical wave has a constant phase.
  • coincidence in the context of optics refers to an instance of rays of light striking a surface at the same point and at the same time.
  • normal refers to a line, ray, or vector that is perpendicular to a given object.
  • normal ray is the outward-pointing light ray perpendicular to the surface of an optical medium and/or optical element at a given point.
  • optical axis refers to a line along which there is some degree of rotational symmetry in an optical system. Additionally or alternatively, the term “optical axis” refers to a straight line passing through the geometrical center of an optical element. The path of light ray(s) along the optical axis is perpendicular to the surface(s) of the optical element.
  • optical axis may also be referred to as a “principal axis”. All other ray paths passing through the optical element and its optical center (the geometrical center of the optical element) may be referred to as “secondary axes”.
  • the optical axis of a lens is a straight line passing through the geometrical center of the lens and joining the two centers of curvature of its surfaces.
  • the optical axis of a curved mirror passes through the geometric center of the mirror and its center of curvature.
  • off-axis optical system refers to an optical system in which the optical axis of the aperture is not coincident with the mechanical center of the aperture.
  • mechanical axis refers to an axis that passes through the physical center of an optical element and/or is perpendicular to the outside edges of the optical element.
  • aperture refers to a hole or an opening through which light travels. Additionally or alternatively, the “aperture” and focal length of an optical system determine the cone angle of a bundle of rays that come to a focus in the image plane.
  • curvature refers to a rate of change of direction of a curve with respect to distance along the curve.
  • the term “spherical” refers to an object having a shape that is or is substantially similar to a sphere.
  • a “sphere” is a set of all points in three-dimensional space lying the same distance (the radius) from a given point (the center), or the result of rotating a circle about one of its diameters.
  • the term “toroidal” refers to an object having a shape that is or is substantially similar to a torus.
  • a “torus” is a surface of revolution generated by revolving a circle in three-dimensional space about an axis that is coplanar with the circle.
  • anamorphic surface refers to a non-symmetric surface with bi-axial symmetry.
  • anamorphic element and/or “anamorphic optical element” refer to an optical element with at least one anamorphic surface and/or an optical element with a combination of spherical, aspherical, and toroidal surfaces.
  • the term “freeform surface” refers to a geometric element that does not have rigid radial dimensions. Additionally or alternatively, the term “freeform surface” refers to a surface with no axis of rotational invariance. Additionally or alternatively, the term “freeform surface” refers to a non-symmetric surface whose asymmetry goes beyond bi-axial symmetry, spheres, rotationally symmetric aspheres, off-axis conics, and toroids. Additionally or alternatively, a freeform surface may be identified by a comatic-shape component or higher-order rotationally variant terms of the orthogonal polynomial pyramids (or equivalents thereof).
  • the term “freeform surface” refers to a specially shaped surface that refracts an incident light beam in a predetermined way. Freeform surfaces have more degrees of freedom in comparison with rotationally symmetric surfaces.
  • the term “freeform optical element” and/or “FOE” refers to an optical element with at least one freeform surface. Additionally or alternatively, the term “freeform optical element” and/or “FOE” refers to an optical element that has no translational or rotational symmetry about axes normal to the mean plane. Additionally or alternatively, the term “freeform optical element” and/or “FOE” refers to an optical element with specially shaped surface(s) that refract an incident light beam in a predetermined way.
  • FOE surface structure: In contrast to diffractive optical elements (DOEs), the FOE surface structure is smooth, without abrupt height jumps or high-frequency modulations. Similar to classical lenses, FOEs affect a light beam by refraction at their curved surface structures. FOE refraction behavior is determined by geometrical optics (e.g., ray tracing), in contrast to DOEs, which are described by a wave optical model. Various aspects of freeform optics are discussed in Rolland et al., "Freeform optics for imaging," Optica, vol. 8, pp. 161-176 (2021), which is hereby incorporated by reference in its entirety.
  • rotational symmetry and “radial symmetry” refer to a property of a shape or surface that looks the same after some rotation by a partial turn.
  • An object's degree of rotational symmetry is the number of distinct orientations in which it looks exactly the same for each rotation.
  • the terms “biaxial symmetry” or “bi-axial symmetry” refer to a property of a shape or surface that contains symmetrical designs on both horizontal and vertical axes.
  • optical aberration and/or “aberration” refers to a property of optical systems and/or optical elements that causes light to be spread out over some region of space rather than focused to a point. An aberration can be defined as a departure of the performance of an optical system from a predicted level of performance (or the predictions of paraxial optics).
  • the term “laser” refers to light amplification by stimulated emission of radiation. Additionally or alternatively, the term “laser” refers to a device that emits light through a process of optical amplification based on stimulated emission of electromagnetic radiation. The term “laser” as used herein may refer to the device that emits laser light, the light produced by such a device, or both.
  • speckle noise refers to a granular pattern of bright and dark regions of intensity that occurs when laser light is scattered (or reflected) from a rough surface.
  • the term “diffuser” refers to any device or material that diffuses or scatters light in some manner.
  • a “diffuser” may include materials that reflect light, translucent materials (e.g., glass, ground glass, reflon/reflow, opal glass, greyed glass, etc.), and/or other materials.
  • the term “diffractive diffuser” refers to a diffuser or diffractive optical element (DOE) that exploits the principles of diffraction and refraction.
  • the term “speckle diffuser device” (also referred to as “speckle diffuser”) refers to a device used in optics to destroy the spatial coherence (or coherence interference) of laser light prior to reflection from a surface.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)

Abstract

Disclosed embodiments are related to a compact holographic head-up display (hHUD) system comprising a holographic optical element (HOE) with optical power and corrective optical elements that allow the optical elements of the compact hHUD system to be reduced in size and/or volume in comparison to optical elements used in conventional head-up display and hHUD systems.

Description

COMPACT HOLOGRAPHIC HEAD-UP DISPLAY DEVICE
FIELD
[0001] Embodiments discussed herein are generally related to optical devices and head-up displays (HUDs), and in particular, to configurations and arrangements of optical elements to provide compact holographic HUDs.
BACKGROUND
[0002] A Head-Up Display (HUD) is a transparent display that presents information without requiring a viewer to look away from their viewpoint. Typical HUDs include a combiner, a light projection device (referred to as a “projector” or “projection unit”), and a video/image generation computer device. The combiner is usually a piece of glass located directly in front of the viewer, that redirects the projected image from projector in such a way as to see the field of view and the projected infinity image at the same time. The projector is often an optical collimator including a lens or mirror with a cathode-ray tube, light emitting diode (LED) display, or liquid crystal display (LCD) that produces an image where the light is collimated (i.e., where the focal point is perceived to be at infinity). However, these classical HUDs often produce optical aberrations, and multiple mirrors are required to correct for these aberrations.
[0002] Holographic HUDs (hHUDs) typically include a laser projector and a holographic optical element (HOE). Some hHUDs place the HOE inside a display screen, such as a windscreen or windshield of an automobile or aircraft. However, most hHUDs with HOEs inside the display screen cannot produce large, high quality images without producing optical aberrations in a manner similar to the classical HUDs, and typical hHUDs with HOEs inside the display screen require several corrective optical devices to correct these aberrations. Furthermore, classical HUDs and typical hHUDs require optical elements with large dimensions and substantial optical power to provide high quality images.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Embodiments will be readily understood by the following detailed description in conjunction with the accompanying drawings and the appended claims. Embodiments are illustrated by way of example and not by way of limitation in the figures of the accompanying drawings:
[0005] Figures 1, 2, and 3 illustrate a holographic head-up display (hHUD) system and demonstrate operation of the hHUD system. Figures 4, 5, and 6 illustrate various configurations and arrangements of the hHUD system. Figures 7 and 8 illustrate example surface patterns. Figure 9 illustrates simulation results of using the various holographic optical systems discussed herein.
[0006] Figure 10 illustrates an example HUD display system for a vehicle configurable to interface with an on-board vehicle operating system. Figure 11 illustrates an example implementation of a vehicle embedded computer device according to various embodiments.
DETAILED DESCRIPTION
[0007] The present disclosure describes various configurations and arrangements of corrective optics for holographic head-up displays (hHUDs). As will be described in more detail infra, an hHUD includes a holographic optical element (HOE) inside a display screen and has a certain arrangement of refractive and reflective optical elements that work together with the HOE to act as a HUD and show images far ahead of a viewer. The HOE may also have various working geometries, including off-axis geometries, and may be formed by complex aberrated wavefronts that are optimized, together with a compact corrector, to minimize residual aberrations, achieving high image quality despite its small size. In these ways, the configurations and arrangements discussed herein allow smaller optical elements with lower optical power than conventional HUDs and hHUDs to be used. Additionally, the configurations and arrangements discussed herein loosen display screen tolerance requirements due to the transfer of the combiner’s optical functions from the display screen to the hologram and/or HOE. Furthermore, the configurations and arrangements discussed herein do not require the use of complex wedge-shaped films for flare elimination by the divergence of the 0th and 1st diffraction orders. Moreover, the size reduction of the hHUDs discussed herein reduces the amount of energy absorbed by the hHUD components, for example, reducing sensor heat caused by the narrowness of the hologram’s working spectral region.
1. HOLOGRAPHIC HEAD-UP DISPLAY ARRANGEMENTS AND CONFIGURATIONS
[0008] Figures 1 and 2 illustrate a holographic head-up display (hHUD) system 100 (or simply “hHUD 100”). The hHUD 100 includes a picture generation unit (PGU) 101, correction optics assembly 102 (also referred to as “correction optics 102”, “corrective optics 102”, and/or the like), and a combiner 103. The correction optics 102 includes various optical elements including a scattering surface 120, optical element 121, optical element 122, optical element 123, and a corrector 124 (for the sake of clarity, many of the optical elements of the correction optics 102 are not labeled in Figure 1). During operation, the PGU 101 projects laser light 110 (or otherwise generates light 110) through various optical elements 121-124 of the correction optics 102 including illumination of the scattering surface 120. In particular, the PGU 101 creates an intermediate image at the scattering surface 120 by projecting light representative of a virtual image on to the scattering surface 120, which is then magnified and/or filtered by optical elements 121 and/or 122, and redirected to corrector 124 by reflection surface 123. The correction optics
102 redirects 111 the light 110 (i.e., as light rays 111) onto an HOE that is on or in the combiner
103 (hereinafter referred to as “HOE 103” or the like), which are then reflected 112 towards a viewer 115. In particular, the HOE 103 with optical power generates a virtual image of the previously generated intermediate image (at the scattering surface 120) at the given distance 215 from the viewer 115 (see Figure 2). At the same time, because there is no limitation on the distance at which the virtual image may be generated, it becomes possible to create augmented reality (AR) in which all the virtual objects are perceived as located at the same place as real objects of the surrounding world. In these ways, the hHUD 100 allows the viewer to see the projected/generated image at approximately 10 or 15 meters ahead of the viewer (through the combiner 103), which matches real world objects that can be seen through the combiner 103. [0009] The PGU 101 may be realized by the use of a suitable projector and is the same or similar to the PGU 1130 discussed infra with respect to Figure 11. The combiner 103 in this example is a (semi-)transparent display surface located directly in front of a viewer that redirects a projected virtual image from the PGU 101 in such a way as to allow the viewer to view a field of view (FoV) and the virtual image at the same time, thereby facilitating augmented reality (AR). Usually, the size of the virtual image is defined by the largest optical element of a HUD, which is usually a combiner element such as combiner 103. In classical HUDs, the combiner is typically a large mirror. In this example, the largest optical element is the HOE (e.g., HOE 1131 of Figure 11 discussed infra) that is in or on the combiner 103. Various aspects of the combiner 103, HOE, and PGU 101 are discussed infra with respect to Figure 11.
[0010] The correction optics 102 (also referred to as “corrective optics 102” or “auxiliary optics 102”) works together with the HOE 103 to display the virtual images. The correction optics 102 comprises one or more optical elements 121-124, which may include, for example, lenses, prisms, mirrors, HOEs, and/or other optical elements, and/or combinations thereof. In this example, the corrective optics 102 includes, inter alia, scattering surface 120 and corrector 124. The scattering surface 120 may be a diffusion screen, a diffuser plate, and/or an array of microlenses with selected parameters of scattering on a plane close to the focal plane.
[0011] The corrector 124 is an optical element that primarily corrects aberrations caused by the HOE 103. All of the optical elements of correction optics 102 work together to correct aberrations. The corrector 124 may comprise one or more of a prism, a lens, a mirror, a prismatic lens, and/or various combinations of such optical elements. The properties of the corrector 124 are dependent on the particular arrangement and configuration of various optical elements of the hHUD 100 within a particular environment (e.g., within an automobile cabin, aircraft cockpit, head-mounted display, etc.). For example, the corrector 124 may have a first set of properties when the hHUD 100 is configured or deployed within an automobile and may have a second set of properties when the hHUD 100 is configured or deployed within an aircraft cockpit. In another example, the corrector 124 may have a first set of properties when the hHUD 100 is configured or deployed within a first automobile of a first make and model, and may have a second set of properties when the hHUD 100 is configured or deployed within a second automobile of a second make and model. The set of properties include, for example, the surface types or patterns of the corrector 124, a shape formed by the surfaces of the corrector 124, a size of the corrector 124, a position of the corrector 124 with respect to at least one other optical element, an orientation of the corrector 124 with respect to at least one other optical element, materials or substances used to make the corrector 124, and/or other properties. Other aspects of the corrector 124 are discussed in more detail infra.
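Because the corrector’s property set depends on the deployment environment, one practical way to manage it is as configuration data keyed by the vehicle make and model. The following Python sketch illustrates that idea only; all of the field names and values are hypothetical and are not taken from this disclosure.

# Hypothetical per-vehicle corrector property sets (illustrative only).
from dataclasses import dataclass

@dataclass
class CorrectorProperties:
    surface_types: tuple      # e.g., ("aspherical", "freeform")
    shape: str                # overall form of the corrector
    size_mm: tuple            # bounding dimensions (assumed units)
    position_mm: tuple        # offset relative to a reference optical element
    tilt_deg: float           # orientation relative to that element
    material: str

CORRECTOR_CONFIGS = {
    ("make_a", "model_1"): CorrectorProperties(
        ("aspherical", "freeform"), "prismatic", (40, 30, 15), (0, 12, 85), 7.5, "PMMA"),
    ("make_b", "model_2"): CorrectorProperties(
        ("spherical", "anamorphic"), "toroidal", (55, 35, 20), (5, 10, 90), 9.0, "glass"),
}

def corrector_for(make: str, model: str) -> CorrectorProperties:
    """Look up the corrector property set for a given vehicle make and model."""
    return CORRECTOR_CONFIGS[(make, model)]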
[0012] The dimensions (e.g., size, volume, etc.) of the various optical elements of the hHUD 100 (e.g., the optics in correction optics 102, the HOE 103, etc.) can be reduced such that the optical elements are much smaller than optical elements used in existing hHUDs and classical HUDs. This is because the light rays 110 coming from the PGU 101 to the corrective optics 102 are converging, and because the light rays 110 are converging, smaller optical elements can be used to cover all of the light rays 110. The dimensional reduction of the optical elements can be very significant; for example, in some implementations the volume and/or size of the optical elements can be reduced by a factor of 10 (i.e., the volume and/or size of the optical elements can be up to 10 times smaller than the optical elements of existing HUDs and hHUDs). Additionally, in some implementations, the use of smaller optical elements may allow the PGU 101 and the correction optics 102 to be positioned closer (in distance) to one another, and/or to be positioned closer (in distance) to the combiner 103, which may provide further enhancements to the virtual image (e.g., sharpness, clarity, etc.).
[0013] However, optical aberrations (or simply “aberrations”) may also be increased or exaggerated when operating the hHUD 100 and/or arranging the optical elements of the hHUD 100 in this manner. Aberrations can cause the virtual image formed on the combiner 103 to be blurred or distorted, with the nature of the distortion depending on the type of aberration. In some cases, an aberration occurs when light from one point of an object does not converge into (or does not diverge from) a single point after transmission through the hHUD 100. In any event, the correction optics 102 having a configuration or arrangement as discussed infra is capable of correcting or compensating for aberrations.
[0014] The optical power of an optical system or optical element is the degree to which it converges or diverges light. However, as the optical power of an optical element increases, the optical quality of the image degrades. The correction optics 102 having a configuration or arrangement as discussed infra compensates for this type of image degradation. In other words, introduction of the correction optics 102 allows the HOE in the combiner 103 to have higher optical power and/or more focused light without experiencing the image degradation that a similar optical power would produce in existing hHUDs and/or classical HUDs. This also allows for the reduction in the size of the optical system 100 because the rays 110, 111 can converge faster while being covered by smaller optical elements. In some implementations, the HOE 103 has an optical power in the range of 1.1 to 6.6 diopters.
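As a brief illustration of the relationship between optical power and focal length referenced above, the following Python sketch converts the stated diopter range into focal lengths using P = 1/f; the conversion itself is standard, and the printout is only a worked example.

def focal_length_mm(power_diopters: float) -> float:
    """Focal length in millimeters for a given optical power in diopters (m^-1)."""
    return 1000.0 / power_diopters

for power in (1.1, 6.6):
    print(f"{power:.1f} D -> f = {focal_length_mm(power):.0f} mm")
# 1.1 D corresponds to roughly 909 mm and 6.6 D to roughly 152 mm,
# i.e., higher optical power means a shorter focal length.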
[0015] Furthermore, some conventional/classical HUD systems utilize a wedge-shaped film in or on a combiner in order to avoid ghosting (where a replica of the transmitted image (i.e., a “ghost” image), which is offset in position, is superimposed on top of the main image). However, with the correction optics 102 having a configuration or arrangement as discussed infra, the HOE or combiner 103 does not need to have a wedge-shaped film or layer, which reduces complexity in producing/manufacturing combiners 103 and/or allows combiners 103 to be produced/manufactured using fewer materials and with less complexity.
[0016] Referring now to Figure 3, the HOE 103 may be formed by the registration of the interference between two wavefronts inside a suitable HOE material (e.g., a photopolymer and/or other material such as those discussed herein). Here, the wavefronts may be a spherical diverging wavefront 307 and a plane wavefront 308, the geometries of which correspond to the spatial scheme of the hHUD 100 in a vehicle (e.g., an automobile, truck, watercraft, aircraft, etc.). On the other hand, the recording may be performed using preliminarily aberrated wavefronts due to the existence of special optical elements in an optical scheme 309, which may resemble simple spherical and cylindrical elements as well as aspherical and/or freeform elements. At the same time, correction of aberrations is made by the compact correction optics 102 (including corrector 124), the parameters of which may be calculated together with the parameters of the elements participating in the formation of the HOE’s 103 wavefronts.
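To make the recording step concrete, the following Python sketch computes the intensity pattern produced by interfering an off-axis plane reference wave with a diverging spherical wave over a small patch of the recording plane; this is the generic mechanism by which an HOE with optical power is formed. The wavelength, angle, and geometry values are arbitrary assumptions, not parameters of the disclosed system.

import numpy as np

wavelength = 532e-9               # assumed recording wavelength (m)
k = 2 * np.pi / wavelength
z_source = 0.2                    # assumed distance to the spherical-wave point source (m)

# Sample a small patch of the recording plane.
x = np.linspace(-2e-3, 2e-3, 1024)
y = np.linspace(-2e-3, 2e-3, 1024)
X, Y = np.meshgrid(x, y)

plane_wave = np.exp(1j * k * X * np.sin(np.deg2rad(30.0)))   # off-axis plane reference wave
r = np.sqrt(X**2 + Y**2 + z_source**2)
spherical_wave = np.exp(1j * k * r) / r                      # diverging spherical object wave

intensity = np.abs(plane_wave + spherical_wave) ** 2         # fringe pattern stored in the HOE material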
[0017] Referring back to Figure 1, the correction optics 102 may be used whilst working with an HOE 103 without an off-axis angle (or with an on-axis angle). The optical axis 130 is an imaginary line that defines the path along which light propagates through the system (i.e., hHUD 100). For a system composed of simple lenses and mirrors, the optical axis passes through the center of curvature of each surface, and coincides with the axis of rotational symmetry. For an on-axis system, the optical axis may be coincident with a mechanical axis of the system (e.g., mechanical axis of the hHUD 100, mechanical axis of the correction optics 102, or mechanical axis of the corrector 124). Here, the object (e.g., the correction optics 102 and/or the corrector 124) is located on the optical axis 130 coinciding with the HOE’s 103 normal.
[0018] The optical elements of the correction optics 102 comprise at least two refractive surfaces (or at least two optical elements, each having a refractive surface) and may in addition comprise at least one reflective surface (or at least one optical element with a reflective surface). In this example, the at least two refractive elements include optical elements 121 and 122 as well as the surface of corrector 124 facing the HOE 103, and the at least one reflective element includes reflective element 123. The refractive elements 121, 122 and reflective element 123 can be formed into any type of shape such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes. The shapes of the refractive elements 121, 122 and/or the reflective element 123 may predominantly have rotational symmetry. In the example of Figure 1, the optical element 121 is a rectangular polyhedron lens or prism, optical element 122 is a cylindrical polyhedron lens or prism or a rectangular polyhedral lens or prism having convex sides, and optical element 123 is a flat or substantially planar reflective surface. Additionally, the optical elements 121, 122, and 123 may have surfaces of various types and/or patterns including, for example, spherical, aspherical, toroidal, and/or freeform surfaces.
[0019] Additionally or alternatively, the corrector 124 is formed from at least two refractive surfaces. Individual surfaces of the corrector 124 can be spherical, aspherical, anamorphic, and/or freeform surfaces. The surfaces of the corrector 124 can form any type of shape that predominantly has rotational symmetry. For example, the shape of the corrector 124 can be flat or planar, a sphere, prismatic (prism or prism-like), pyramidal, ellipsoidal, conical, cylindrical, toroidal, and/or some other three-dimensional (3D) shape.
[0020] Figure 4 shows an hHUD 400 with correction optics assembly 402 including an arrangement of optical element 421, optical element 422, optical element 423, and corrector 424. The hHUD 400 is used whilst working with an HOE 103 with an off-axis angle (or without an on-axis angle). Here, the off-axis angle is an angle b between the HOE’s 103 normal 430 and an axis 420 connecting the centers of the HOE 103 and the object (e.g., corrector 424). An off-axis optical system is an optical system in which the optical axis of an optical element is not coincident with the mechanical center of the optical element, and/or where the optical axis is not coincident with the mechanical axis of the system. Here, the axis 420 may be the normal of the system (e.g., correction optics 402 and/or corrector 424) or the mechanical axis of the system (e.g., correction optics 402 and/or corrector 424).
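A minimal Python sketch of the off-axis angle described above follows: the angle between the HOE’s surface normal and the axis connecting the centers of the HOE and the corrector. The coordinates are placeholder values chosen for illustration, not geometry taken from this disclosure.

import numpy as np

hoe_center = np.array([0.0, 0.0, 0.0])
hoe_normal = np.array([0.0, 0.0, 1.0])            # unit normal of the HOE
corrector_center = np.array([0.0, -0.15, 0.45])   # assumed corrector position (m)

axis = corrector_center - hoe_center              # axis connecting the two centers
axis /= np.linalg.norm(axis)

cos_b = np.clip(np.dot(axis, hoe_normal), -1.0, 1.0)
off_axis_angle_deg = np.degrees(np.arccos(cos_b))
print(f"off-axis angle b = {off_axis_angle_deg:.1f} degrees")   # ~18.4 degrees for these placeholder values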
[0021] The optical elements of the correction optics 402 comprise at least two refractive elements (or at least two optical elements with a refractive surface) and at least one reflective element (or at least one optical element with a reflective surface). In this example, the at least two refractive elements include optical elements 421 and 422 and corrector 424, and the at least one reflective element includes reflective element 423. The refractive elements 421, 422, 424 and reflective element 423 can be formed into any type of shape such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes. The shapes of the refractive elements 421, 422 and/or the reflective element 423 may have rotational symmetry and/or may be formed into an aspherical shape/form. In the example of Figure 4, the optical element 421 is an oblique pyramidal or tetrahedral lens or prism, optical element 422 is a rectangular polyhedron lens or prism, and optical element 423 is a flat or substantially planar reflective surface. Additionally, the optical elements 421, 422, and 423 may have surfaces of various types and/or patterns including, for example, spherical, aspherical, anamorphic, and/or freeform surfaces.
[0022] Additionally or alternatively, the corrector 424 is formed from at least two refractive surfaces. In other implementations, the corrector 424 can be formed to have more than two refractive surfaces and/or more than one reflective surface. The refractive and reflective surfaces of the corrector 424 can be spherical, aspherical, anamorphic, and/or freeform surfaces. The surfaces of the corrector 424 can form any type of shape that predominantly has a decentered (e.g., off-axis) aspherical shape/form, as is shown in Figure 4, in order to correct the difference that occurs between optical paths in the vertical plane. Additionally or alternatively, the shape of the corrector 424 can be flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
[0023] Figure 5 shows an hHUD 500 with correction optics assembly 502 including an arrangement of optical element 522 and corrector 524. The hHUD 500 is used whilst working with an HOE 103 with an off-axis angle or an on-axis angle. The optical elements of the correction optics 502 comprise at least two refractive elements (or at least two optical elements with a refractive surface). In this example, the at least two refractive elements include optical element 522 and corrector 524. The optical elements 522, 524 can be formed into any type of shape such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes. The optical element 522 can have spherical, aspherical, anamorphic, and/or freeform surfaces. In the example of Figure 5, the optical element 522 is a rectangular polyhedron lens or prism with a concave surface/side oriented to face the corrector 524.
[0024] Additionally or alternatively, the corrector 524 is formed from at least two refractive surfaces. In other implementations, the corrector 524 can be formed to have more than two refractive surfaces and/or more than one reflective surface. Individual surfaces of the corrector 524 can be spherical, aspherical, anamorphic, and/or freeform surfaces. The surfaces of the corrector 524 can form any type of shape that predominantly has a decentered (e.g., off-axis) prism-shaped (or prismatic) form, as is shown in Figure 5, in order to correct the difference that occurs between optical paths in the vertical plane. Additionally or alternatively, the shape of the corrector 524 can be flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes.
[0025] Figure 6 shows an hHUD 600 with correction optics assembly 602 including an arrangement of optical element 621 and corrector 624. The hHUD 600 is used whilst working with an HOE 103 with an off-axis angle or an on-axis angle. The optical element 621 is a refractive element that can be formed into any type of shape such as, for example, flat or planar, sphere, asphere, prismatic (prism or prism-like), pyramid, ellipsoid, conical, cylindrical, toroidal and/or toroidal polyhedrons, and/or some other like shape or combination of shapes. The optical element 621 can have spherical, aspherical, anamorphic, and/or freeform surfaces. In the example of Figure 6, the optical element 621 is a rectangular polyhedron lens or prism with a concave surface/side oriented to face the scattering surface 120.
[0026] The corrector 624 is formed from at least two refractive surfaces 624a and 624c and at least one reflective surface 624b. A first refractive surface 624a of the corrector 624 faces the optical element 621 and a second refractive surface 624c of the corrector 624 faces the HOE 103. The reflective surface 624b of the corrector 624 is within the corrector 624 and is oriented at an angle with respect to the optical element 621 and/or the HOE 103. Light rays from the scattering surface 120 are guided to the corrector 624 by the optical element 621, and enter the corrector 624 through the first refractive surface 624a. The light rays are then reflected off of the reflective surface 624b onto the HOE 103 via the second refractive surface 624c. Individual surfaces of the corrector 624 can be spherical, aspherical, anamorphic, and/or freeform surfaces. The surfaces of corrector 624 can form any type of shape that predominantly has a prismatic form. In order to minimize the overall size of the system, the corrector 624 in this example is formed as an integral reflective-prismatic element with a freeform prismatic shape for the correction of the optical path difference that occurs.
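The path through the corrector 624 (refraction at an entrance surface, reflection at an internal surface, refraction at an exit surface) can be illustrated with a simple two-dimensional ray trace. The sketch below uses the standard vector forms of Snell’s law and the law of reflection; the surface normals, orientations, and refractive index are assumed for illustration and are not design values from this disclosure.

import numpy as np

def refract(d, n, n1, n2):
    """Snell's law in vector form: direction d entering a surface with unit normal n."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    sin2_t = (n1 / n2) ** 2 * (1.0 - cos_i ** 2)
    return (n1 / n2) * d + (n1 / n2 * cos_i - np.sqrt(1.0 - sin2_t)) * n

def reflect(d, n):
    """Law of reflection: mirror direction d about unit normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

ray = np.array([0.0, -1.0])                                           # ray arriving from optical element 621
ray = refract(ray, np.array([0.0, 1.0]), 1.0, 1.5)                    # enter through refractive surface 624a
ray = reflect(ray, np.array([np.sin(np.pi / 4), np.cos(np.pi / 4)]))  # fold at reflective surface 624b
ray = refract(ray, np.array([-1.0, 0.0]), 1.5, 1.0)                   # exit toward the HOE through surface 624c
print(ray)                                                            # the folded ray now travels horizontally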
[0027] Figure 7 shows example surface shapes or patterns that may be used for the various optical elements discussed with respect to Figures 1-6. The example surfaces include a spherical surface 701, an aspherical surface 702, an anamorphic surface, and freeform surfaces 703a, 703b, and 703c. Freeform surfaces 703a and 703b are 3D plots where darker shadings indicate greater surface height, and freeform surface 703c is an interferogram.
[0028] Figure 8 shows additional example surface shapes or patterns that may be used for the various optical elements discussed with respect to Figures 1-6. The example surfaces include a spherical surface 801, an off-axis spherical surface 802, an aspherical surface 803, an off-axis aspherical surface 804, and a freeform surface 805.
[0029] The freeform surfaces discussed with respect to Figures 1-8 may be modeled and/or formed based on mathematical descriptions. Examples of such mathematical descriptions include radial basis functions, basis splines, wavelets, non-uniform rational basis splines, orthogonal polynomials (e.g., Zernike polynomials, 2D-Q polynomials, φ-polynomials, etc.), non-orthogonal bases (e.g., X-Y polynomials), hybrid stitched representations, and/or combinations thereof.
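As one concrete illustration of such a description, the following Python sketch evaluates a freeform surface sag as a base conic plus an X-Y polynomial departure, one of the representations listed above. The coefficients are arbitrary demonstration values; in practice they would come from an optical design optimization.

import numpy as np

def freeform_sag(x, y, c, k, coeffs):
    """Base conic plus X-Y polynomial departure.
    c: base curvature (1/radius), k: conic constant,
    coeffs: {(i, j): a_ij} giving terms a_ij * x**i * y**j."""
    r2 = x ** 2 + y ** 2
    conic = c * r2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c ** 2 * r2))
    poly = sum(a * x ** i * y ** j for (i, j), a in coeffs.items())
    return conic + poly

# Hypothetical coefficients; the odd y-terms break rotational symmetry (freeform behavior).
coeffs = {(2, 0): 1e-4, (0, 2): 2e-4, (0, 3): 5e-6, (2, 1): 3e-6}
x, y = np.meshgrid(np.linspace(-20.0, 20.0, 5), np.linspace(-20.0, 20.0, 5))
z = freeform_sag(x, y, c=1.0 / 500.0, k=0.0, coeffs=coeffs)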
[0030] The hHUD device of Figures 1-6 can be implemented in various ways. In one implementation, multiple PGUs 101 are placed in different positions/locations to provide virtual images at different distances. Additionally or alternatively, the hHUD device can include a single PGU 101 and a periscopic system to provide virtual images at different distances. In another implementation, multiple HOEs may be provided in or on a single substrate (e.g., combiner 103) to provide several virtual images at intersecting eyebox areas from different PGUs 101 disposed at different positions/locations. This implementation may provide viewer convenience and increased AR capabilities. The multiple HOEs can also be placed in such a way as to provide several different eyeboxes, for example, for a driver and a passenger, from different PGUs 101, in order to create augmented reality without FoV losses when observation of the surrounding world is done through the same transparent surface from different zones. Additionally or alternatively, some or all of the components of the hHUDs of Figures 1-6 can be included in a single housing or frame. For example, in one implementation, the correction optics 102, 402, 502, 602 may be disposed in a single housing or frame. Additionally or alternatively, the PGU 101 (or multiple PGUs 101) may be disposed in the same housing/frame as the correction optics 102, 402, 502, 602. Any of the aforementioned implementations may be combined or rearranged depending on the specific use cases involved and/or the environment in which the hHUD system is deployed/disposed.
[0031] Figure 9 shows a graph 900 including curves of volume growth for a corrective optics system located between a combiner 103 and a PGU 101, depending on its optical power, for different distances to a virtual image. For convenience, focal lengths are used in graph 900. The values taken as initial data include a distance to the eyebox of 700 mm, a circular eyebox radius of 71 mm, and an FoV (radius) of 6.5 degrees.
[0032] Graph 900 shows the approximate volume of an hHUD system in liters depending on the combiner’s focal length in millimeters (mm) for different distances (given in mm) to a virtual image. It demonstrates that the volume of an hHUD system grows very quickly as the distance to the virtual image is increased to a reasonable value (>3 m); only those hHUD systems that form an image located below the vehicle’s hood (~2 m) can fit within reasonable volumes, even with a small combiner optical power. The optical power or focal length of traditional systems with a combiner resembling the windshield area may be evaluated by the minimal nearest radius of curvature in this area. For comparison, such a typical minimal focal length equals R/2, where R, the median nearest radius of curvature of the windshield’s slices in the combiner area, is roughly 1000 mm. In order to preserve compact dimensions, the hHUD systems/devices discussed herein have a combiner focal length that is at most half that of conventional HUD systems/devices. This allows at least a twofold volume benefit in comparison to existing HUD systems/devices.
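For reference, the comparison above works out as follows in a short Python snippet; the numbers are those stated in this paragraph, and the final line simply restates the "at most half the focal length" relationship rather than a measured design value.

R_mm = 1000.0                                 # median nearest radius of curvature in the combiner area
f_conventional_mm = R_mm / 2.0                # typical minimal combiner focal length (~500 mm)
f_compact_max_mm = f_conventional_mm / 2.0    # compact hHUD combiner focal length: at most half of that
print(f_conventional_mm, f_compact_max_mm)    # 500.0 250.0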
[0033] Each of the elements/components shown and described herein may be manufactured or formed using any suitable fabrication means, such as those discussed herein. Additionally, each of the elements/components shown and described herein may be coupled to other elements/components and/or coupled to a portion/section of the vehicle by way of any suitable fastening means, such as those discussed herein. Furthermore, the geometry (shape), position, and/or orientation of the elements/components shown and described herein may be different from the depicted shapes, positions, and/or orientations in the example embodiments of Figures 1-6 depending on the shape, size, and/or other features of the vehicle in which the hHUD is disposed.
2. SYSTEM CONFIGURATIONS AND ARRANGEMENTS
[0034] Figure 10 illustrates an example display system 1000 configurable to interface with an onboard unit (OBU) 1020. The display system 1000 may be configured, arranged, or otherwise compatible with a number of different types of vehicles (e.g., makes, models, etc.), which may be associated with different operator positions, including the height of the operator’s eyes or the distance from the operator to display device 1060. The OBU 1020 comprises one or more vehicle processors or onboard computers, memory circuitry (with instructions stored in the memory), and interface circuitry that interfaces with the HUD processor 1010. The display system 1000 connects to the OBU 1020 via an onboard diagnostics (OBD) port of a vehicle. HUD processor 1010 controls or otherwise operates a projection device 1030 that, in turn, generates and/or projects light representative of at least one virtual image onto an imaging matrix 1050. Imaging matrix 1050, in turn, selectively distributes and/or propagates the virtual image received as light from the projection device 1030 and/or optical devices 1040 as one or more wavefronts to a display device 1060.
[0035] The HUD processor 1010 is a computing device that determines virtual graphics to display on display device 1060 according to one or more HUD apps, and provides indications/signals of the virtual graphics to projection device 1030 that, in turn, generates and/or projects light representative of the virtual graphic to the imaging matrix 1050. The projection device 1030 may be the same or similar as the PGU 101 discussed previously and/or the PGU(s) 1130 discussed infra with respect to Figure 11. The HUD apps may cause the generation of virtual graphics based on, for example, predetermined operational parameters including vehicle parameters (e.g., speed, location, travel direction, destination, windshield location, traffic, and the like), road parameters (e.g., location or presence of real world objects, roads, and the like), vehicle observer parameters (e.g., operator location within vehicle, observer eye tracking, eye location, position of system, and the like), and/or a combination thereof. Operational parameters may further include any input received from any of a plurality of sources including vehicle systems or settings including, for example, sensor circuitry 1121, I/O devices 1186, actuators 1122, ECUs 1123, positioning circuitry 1145, or a combination thereof as shown by Figure 11.
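The control flow described above (operational parameters in, a virtual graphic out to the projection device) can be sketched in a few lines of Python. The class and function names below are hypothetical stand-ins for a HUD app and are not identifiers used in this disclosure.

from dataclasses import dataclass

@dataclass
class OperationalParameters:
    speed_kmh: float
    next_maneuver: str
    eye_position_mm: tuple        # from observer/eye-tracking input

def generate_virtual_graphic(params: OperationalParameters) -> dict:
    """Return a simple description of what to render and where to anchor it."""
    return {
        "text": f"{params.next_maneuver} | {params.speed_kmh:.0f} km/h",
        "anchor": "road_surface",                   # align with real-world objects (AR)
        "eye_position_mm": params.eye_position_mm,  # used when placing the image plane
    }

graphic = generate_virtual_graphic(OperationalParameters(
    speed_kmh=72.0, next_maneuver="Turn right in 200 m", eye_position_mm=(0, 1200, -600)))
# In the system of Figure 10, a description like this would drive the projection device 1030.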
[0036] The one or more optical devices 1040 or lenses are configured to correct aberrations, to filter light, and/or to improve light utilization efficiency. Optical devices 1040 may include any type of optical device (e.g., filters) such as those discussed herein. In particular, the optical devices 1040 may be the same or similar to the correction optics 102 (or portions thereof) discussed previously.
[0037] In some examples, display device 1060 comprises a windscreen or windshield of a terrestrial vehicle, watercraft, or aircraft, a holographic film placed adjacent to the windshield/windscreen, or a combination thereof. Additionally or alternatively, the display device 1060 comprises a head-mounted display (HMD) screen (e.g., an augmented reality (AR) or virtual reality (VR) headset), a transparent (or semi-transparent) eyepiece such as those used for optical HMDs, helmet-mounted displays, and/or the like.
[0038] The display system 1000 is configurable or operable to generate one or more virtual graphics on image plane 1070. The image plane 1070 is associated with a focal distance 1075. Although image plane 1070 is illustrated as being located on an opposite side of display device 1060 from imaging matrix 1050, in some implementations, the display device 1060 is configured to reflect light of the wavefront propagated by imaging matrix 1050 so that the resulting image is reflected back to an observer (e.g., viewer 115 of Figure 1). While the image may be reflected back from display device 1060 to the observer, the image plane 1070 may nevertheless appear to the observer to be located on the opposite side of the display device 1060 (e.g., on the same side of the display device 1060 as the real-world objects, outside of the vehicle). The display system 1000 may comprise a translation device or motor 1080 that can vary the focal distance 1075 of image plane 1070, such as by moving the imaging matrix 1050 relative to the display device 1060 in any direction (e.g., vertical or horizontal) and/or vice versa, as well as change the incline angle of the imaging matrix 1050.
[0039] Although not shown by Figure 10, in various embodiments, display system 1000 may include multiple projection devices 1030, optical devices 1040, imaging matrices 1050, display devices 1060, and motors 1080 that may be disposed in a multitude of arrangements.
[0040] Figure 11 illustrates an example computing system 1100, in accordance with various embodiments. The system 1100 may include any combinations of the components as shown, which may be implemented as integrated circuits (ICs) or portions thereof, discrete electronic devices, or other modules, logic, hardware, software, firmware, middleware or a combination thereof adapted in the system 1100, or as components otherwise incorporated within a chassis of a larger system, such as a HUD system 1000 and/or vehicle 1005. Additionally or alternatively, some or all of the components of system 1100 may be combined and implemented as a suitable System-on-Chip (SoC), System-in-Package (SiP), multi-chip package (MCP), or some other like package. The system 1100 is an embedded system or any other type of computer device discussed herein. In another example, the system 1100 may be a separate and dedicated and/or special-purpose computer device designed specifically to carry out the hHUD solutions of the embodiments discussed herein.
[0041] The processor circuitry 1102 comprises one or more processing elements/devices configurable to perform basic arithmetical, logical, and input/output operations by carrying out and/or executing instructions. According to various embodiments, processor circuitry 1102 is configurable to perform some or all of the calculations associated with the preparation and/or generation of virtual graphics and/or other types of information that are to be projected by HUD system 1000 for display, in real time. Additionally, processor circuitry 1102 is configurable to gather information from sensor circuitry 1120 (e.g., process a video feed from a camera system or image capture devices), obtain user input from one or more I/O devices 1186, and obtain vehicle input substantially in real time. Some or all of the inputs may be received and/or transmitted via communication circuitry 1109. In order to perform the aforementioned functions, the processor circuitry 1102 may execute instructions 1180, and/or may be loaded with an appropriate bit stream or logic blocks to generate virtual graphics based, at least in part, on any number of parameters, including, for example, input from sensor circuitry 1120, input from I/O devices 1186, input from actuators 1122, input from ECUs 1123, input from positioning circuitry 1145, and/or the like. Additionally, processor circuitry 1102 may be configurable to receive audio input, or to output audio, over an audio device 1120. For example, processor circuitry 1102 may be configurable to provide signals/commands to an audio output device 1186 to provide audible instructions to accompany the displayed navigational route information or to provide audible alerts.
[0042] The processor circuitry 1102 includes circuitry such as, but not limited to, one or more processor cores and one or more of cache memory, low drop-out voltage regulators (LDOs), interrupt controllers, serial interfaces such as serial peripheral interface (SPI), inter-integrated circuit (I2C) or universal programmable serial interface circuit, real time clock (RTC), timer-counters including interval and watchdog timers, general purpose input-output (I/O), memory card controllers, interconnect (IX) controllers and/or interfaces, universal serial bus (USB) interfaces, mobile industry processor interface (MIPI) interfaces, Joint Test Access Group (JTAG) test access ports, and the like. The processor circuitry 1102 may include on-chip memory circuitry or cache memory circuitry, which may include any suitable volatile and/or non-volatile memory, such as DRAM, SRAM, EPROM, EEPROM, Flash memory, solid-state memory, and/or any other type of memory device technology, such as those discussed herein.
[0043] The processor(s) of processor circuitry 1102 may be, for example, one or more application processors or central processing units (CPUs), one or more graphics processing units (GPUs), one or more reduced instruction set computing (RISC) processors, one or more Acorn RISC Machine (ARM) processors, one or more complex instruction set computing (CISC) processors, one or more DSPs, one or more microprocessors without interlocked pipeline stages (MIPS), one or more programmable logic devices (PLDs) and/or hardware accelerators such as field-programmable gate arrays (FPGAs), structured/programmable Application Specific Integrated Circuits (ASICs), programmable SoCs (PSoCs), etc., one or more microprocessors or controllers, or any suitable combination thereof. In some embodiments, the processor circuitry 1102 may be implemented as a standalone system/device/package or as part of an existing system/device/package (e.g., an ECU/ECM, EEMS, etc.) of the vehicle 1000. In some embodiments, the processor circuitry 1102 may include a special-purpose processor/controller to operate according to the various embodiments herein.
[0044] Individual processors (or individual processor cores) of the processor circuitry 1102 may be coupled with or may include memory/storage and may be configurable to execute instructions stored in the memory/storage to enable various applications or operating systems to run on the system 1100. In these embodiments, one or more processors (or cores) of the processor circuitry 1102 may correspond to the processor 1012 of Figure 10 and is/are configurable to operate application software (e.g., HUD app) to provide specific services to a user of the system 1100. In some embodiments, one or more processors (or cores) of the processor circuitry 1102, such as one or more GPUs or GPU cores, may correspond to the HUD processor 1010 and is/are configurable to generate and render graphics as discussed previously.
[0045] As examples, the processor circuitry 1102 may include an Intel® Architecture Core™ based processor, such as a Quark™, an Atom™, an i3, an i5, an i7, or an MCU-class processor, Pentium® processor(s), Xeon® processor(s), or another such processor available from Intel® Corporation, Santa Clara, California. However, any number of other processors may be used, such as one or more of Advanced Micro Devices (AMD) Zen® Core Architecture, such as Ryzen® or EPYC® processor(s), Accelerated Processing Units (APUs), MxGPUs, Epyc® processor(s), or the like; A5-A12 and/or S1-S4 processor(s) from Apple® Inc., Snapdragon™ or Centriq™ processor(s) from Qualcomm® Technologies, Inc., Texas Instruments, Inc.® Open Multimedia Applications Platform (OMAP)™ processor(s); a MIPS-based design from MIPS Technologies, Inc. such as MIPS Warrior M-class, Warrior I-class, and Warrior P-class processors; an ARM-based design licensed from ARM Holdings, Ltd., such as the ARM Cortex-A, Cortex-R, and Cortex-M family of processors; the ThunderX2® provided by Cavium™, Inc.; or the like. Other examples of the processor circuitry 1102 are mentioned elsewhere in the present disclosure. [0046] In some implementations, the processor circuitry 1102 may include a sensor hub, which acts as a coprocessor by processing data obtained from the sensor circuitry 1120. The sensor hub may include circuitry configurable to integrate data obtained from each of the sensor circuitry 1120 by performing arithmetical, logical, and input/output operations. In embodiments, the sensor hub may be capable of timestamping obtained sensor data, providing sensor data to the processor circuitry 1102 in response to a query for such data, buffering sensor data, continuously streaming sensor data to the processor circuitry 1102 including independent streams for each sensor circuitry 1120, reporting sensor data based upon predefined thresholds or conditions/triggers, and/or other like data processing functions.
[0047] The memory circuitry 1104 comprises any number of memory devices arranged to provide primary storage from which the processor circuitry 1102 continuously reads instructions 1182 stored therein for execution. In some embodiments, the memory circuitry 1104 includes on-die memory or registers associated with the processor circuitry 1102. As examples, the memory circuitry 1104 may include volatile memory such as random access memory (RAM), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), etc. The memory circuitry 1104 may also include non-volatile memory (NVM) such as read-only memory (ROM), high speed electrically erasable memory (commonly referred to as “flash memory”), and non-volatile RAM such as phase change memory, resistive memory such as magnetoresistive random access memory (MRAM), etc.
[0048] In some implementations, the processor circuitry 1102 and memory circuitry 1104 (and/or storage device 1108) may comprise logic blocks or logic fabric, memory cells, input/output (I/O) blocks, and other interconnected resources that may be programmed to perform various functions of the example embodiments discussed herein. The memory cells may be used to store data in lookup-tables (LUTs) that are used by the processor circuitry 1102 to implement various logic functions. The memory cells may include any combination of various levels of memory/storage including, but not limited to, EPROM, EEPROM, flash memory, SRAM, anti-fuses, etc. The memory circuitry 1104 may also comprise persistent storage devices, which may be temporal and/or persistent storage of any type, including, but not limited to, non-volatile memory, optical, magnetic, and/or solid state mass storage, and so forth.
[0049] Storage circuitry 1108 is arranged to provide (with shared or respective controllers) persistent storage of information such as data, applications, operating systems, and so forth. As examples, the storage circuitry 1108 may be implemented as a hard disk drive (HDD), a micro HDD, a solid-state disk drive (SSDD), flash memory, flash memory cards (e.g., SD cards, microSD cards, xD picture cards, and the like), USB flash drives, resistance change memories, phase change memories, holographic memories, or chemical memories, and the like. In an example, the storage circuitry 1108 may be or may include memory devices that use chalcogenide glass, multi-threshold level NAND flash memory, NOR flash memory, single or multi-level Phase Change Memory (PCM), a resistive memory, nanowire memory, ferroelectric transistor random access memory (FeTRAM), anti-ferroelectric memory, magnetoresistive random access memory (MRAM) memory that incorporates memristor technology, phase change RAM (PRAM), resistive memory including the metal oxide base, the oxygen vacancy base and the conductive bridge Random Access Memory (CB-RAM), or spin transfer torque (STT)-MRAM, a spintronic magnetic junction memory based device, a magnetic tunneling junction (MTJ) based device, a Domain Wall (DW) and Spin Orbit Transfer (SOT) based device, a thyristor based memory device, or a combination of any of the above, or other memory. As shown, the storage circuitry 1108 is included in the system 1100; however, in other embodiments, storage circuitry 1108 may be implemented as one or more separate devices that are mounted in vehicle 1000 separate from the other elements of system 1100.
[0050] The storage circuitry 1108 is configurable to store computational logic 1183 (or “modules 1183”) in the form of software, firmware, microcode, or hardware-level instructions to implement the techniques described herein. The computational logic 1183 may be employed to store working copies and/or permanent copies of programming instructions for the operation of various components of system 1100 (e.g., drivers, libraries, application programming interfaces (APIs), etc.), an OS of system 1100, one or more applications, and/or for carrying out the embodiments discussed herein. In some embodiments, the computational logic 1183 may include one or more program code or other sequence of instructions for controlling the various components of the system 1000 as discussed previously. The permanent copy of the programming instructions may be placed into persistent storage devices of storage circuitry 1108 in the factory or in the field through, for example, a distribution medium (not shown), through a communication interface (e.g., from a distribution server (not shown)), or over-the-air (OTA). The computational logic 1183 may be stored or loaded into memory circuitry 1104 as instructions 1182, which are then accessed for execution by the processor circuitry 1102 to carry out the functions described herein. The instructions 1182 direct the processor circuitry 1102 to perform a specific sequence or flow of actions, for example, as described with respect to the flowchart(s) and block diagram(s) of operations and functionality depicted herein. The modules/logic 1183 and/or instructions 1180 may be implemented by assembler instructions supported by processor circuitry 1102 or high-level languages that may be compiled into instructions 1180 to be executed by the processor circuitry 1102.
[0051] The computer program code for carrying out operations of the present disclosure (e.g., computational logic 1183, instructions 1182, 1180, etc.) may be written in any combination of one or more programming languages, including an object oriented programming language such as Python, Ruby, Scala, Smalltalk, Java™, C++, C#, or the like; a procedural programming language, such as the “C” programming language, the Go (or “Golang”) programming language, or the like; a scripting language such as JavaScript, Server-Side JavaScript (SSJS), PHP, Perl, Python, Ruby or Ruby on Rails, Accelerated Mobile Pages Script (AMPscript), VBScript, and/or the like; a markup language such as HTML, XML, wiki markup or Wikitext, Wireless Markup Language (WML), etc.; a data interchange format/definition such as JavaScript Object Notation (JSON), Apache® MessagePack™, etc.; a stylesheet language such as Cascading Stylesheets (CSS), extensible stylesheet language (XSL), or the like; an interface definition language (IDL) such as Apache® Thrift, Abstract Syntax Notation One (ASN.1), Google® Protocol Buffers (protobuf), etc.; or some other suitable programming languages including proprietary programming languages and/or development tools, or any other languages or tools as discussed herein. The computer program code for carrying out operations of the present disclosure may also be written in any combination of the programming languages discussed herein. The program code may execute entirely on the system 1100, partly on the system 1100 as a stand-alone software package, partly on the system 1100 and partly on a remote computer, or entirely on the remote computer. In the latter scenario, the remote computer may be connected to the system 1100 through any type of network (e.g., network 1117).
[0052] The OS of system 1100 manages computer hardware and software resources, and provides common services for various applications (apps) (e.g., HUD apps, mapping applications, turn-by-turn navigation apps, AR apps, gaming apps, on-board diagnostics apps, and/or the like). The OS may include one or more drivers or APIs that operate to control particular devices that are embedded in the system 1100, attached to the system 1100, or otherwise communicatively coupled with the system 1100. The drivers may include individual drivers allowing other components of the system 1100 to interact or control various I/O devices that may be present within, or connected to, the system 1100. For example, the drivers may include a display driver (or HUD system driver) to control and allow access to the HUD system 1000, a touchscreen driver to control and allow access to a touchscreen interface of the system 1100, sensor drivers to obtain sensor readings of sensor circuitry 1120 and control and allow access to sensor circuitry 1120, actuator drivers to obtain actuator positions of the actuators 1122 and/or control and allow access to the actuators 1122, ECU drivers to obtain control system information from one or more of the ECUs 1123, and audio drivers to control and allow access to one or more audio devices. The OSs may also include one or more libraries, drivers, APIs, firmware, middleware, software glue, etc., which provide program code and/or software components for one or more applications to obtain and use the data from other applications operated by the system 1100.
[0053] In some embodiments, the OS may be a general purpose OS, while in other embodiments, the OS is specifically written for and tailored to the system 1100. For example, the OS may be Unix or a Unix-like OS such as Linux (e.g., as provided by Red Hat Enterprise), Windows 10™ provided by Microsoft Corp.®, macOS provided by Apple Inc.®, or the like. In another example, the OS may be a mobile OS, such as Android® provided by Google Inc.®, iOS® provided by Apple Inc.®, Windows 10 Mobile® provided by Microsoft Corp.®, KaiOS provided by KaiOS Technologies Inc., or the like. In another example, the OS may be an embedded OS or a real-time OS (RTOS), such as Windows Embedded Automotive provided by Microsoft Corp.®, Windows 10 For IoT® provided by Microsoft Corp.®, Apache Mynewt provided by the Apache Software Foundation®, Micro-Controller Operating Systems (“MicroC/OS” or “µC/OS”) provided by Micrium®, Inc., FreeRTOS, VxWorks® provided by Wind River Systems, Inc.®, PikeOS provided by Sysgo AG®, Android Things® provided by Google Inc.®, QNX® RTOS provided by BlackBerry Ltd., or any other suitable embedded OS or RTOS, such as those discussed herein. In another example, the OS may be a robotics middleware framework, such as Robot Operating System (ROS), Robotics Technology (RT)-middleware provided by Object Management Group®, Yet Another Robot Platform (YARP), and/or the like.
[0054] In embodiments where the processor circuitry 1102 and memory circuitry 1104 include hardware accelerators in addition to or alternative to processor cores, the hardware accelerators may be pre-configured (e.g., with appropriate bit streams, logic blocks/fabric, etc.) with the logic to perform some functions of the embodiments herein (in lieu of employment of programming instructions to be executed by the processor core(s)). In one example, the processor circuitry 1102, memory circuitry 1104, and/or storage circuitry 1108 may be packaged together in a suitable SoC or the like.
[0055] The components of system 1100 and/or vehicle 10 communicate with one another over an interconnect (IX) 1106. In various embodiments, IX 1106 is a controller area network (CAN) bus system, a Time-Triggered Protocol (TTP) system, or a FlexRay system, which may allow various devices (e.g., ECUs 1123, sensor circuitry 1120, actuators 1122, etc.) to communicate with one another using messages or frames. Additionally or alternatively, the IX 1106 may include any number of other IX technologies, such as a Local Interconnect Network (LIN), industry standard architecture (ISA), extended ISA (EISA), inter-integrated circuit (I2C), a serial peripheral interface (SPI), point-to-point interfaces, power management bus (PMBus), peripheral component interconnect (PCI), PCI express (PCIe), Ultra Path Interface (UPI), Accelerator Link (IAL), Common Application Programming Interface (CAPI), QuickPath Interconnect (QPI), Omni-Path Architecture (OPA) IX, RapidIO™ system interconnects, Ethernet, Cache Coherent Interconnect for Accelerators (CCIA), Gen-Z Consortium IXs, Open Coherent Accelerator Processor Interface (OpenCAPI), and/or any number of other IX technologies. The IX 1106 may be a proprietary bus, for example, used in a SoC based system.
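By way of a non-limiting illustration, a component attached to a CAN-based IX 1106 might exchange frames as in the following sketch. The sketch assumes the third-party python-can library, a Linux SocketCAN channel named "can0", and a placeholder arbitration ID and payload; none of these specifics are asserted by the present disclosure.

    import can  # third-party python-can library (assumed available)

    # Open a raw CAN interface; "can0" is an assumed SocketCAN channel name.
    bus = can.interface.Bus(channel="can0", bustype="socketcan")

    # Build an 8-byte data frame with a hypothetical arbitration ID.
    msg = can.Message(arbitration_id=0x123,
                      data=[0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77, 0x88],
                      is_extended_id=False)
    bus.send(msg)                  # transmit the frame onto the bus

    reply = bus.recv(timeout=1.0)  # wait up to 1 s for any incoming frame
    if reply is not None:
        print(hex(reply.arbitration_id), reply.data.hex())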
[0056] The communication circuitry 1109 is a hardware element, or collection of hardware elements, used to communicate over one or more networks (e.g., network 1117) and/or with other devices. The communication circuitry 1109 includes modem 1110 and transceiver circuitry (“TRx”) 1112. The modem 1110 includes one or more processing devices (e.g., baseband processors) to carry out various protocol and radio control functions. Modem 1110 interfaces with application circuitry of system 1100 (e.g., a combination of processor circuitry 1102 and memory 1104) for generation and processing of baseband signals and for controlling operations of the TRx 1112. The modem 1110 handles various radio control functions that enable communication with one or more radio networks 1117 via the TRx 1112 according to one or more wireless communication protocols, such as those discussed herein. The modem 1110 may include circuitry such as, but not limited to, one or more single-core or multi-core processors (e.g., one or more baseband processors) or control logic to process baseband signals received from a receive signal path of the TRx 1112, and to generate baseband signals to be provided to the TRx 1112 via a transmit signal path. In various embodiments, the modem 1110 may implement a real-time OS (RTOS) to manage resources of the modem 1110, schedule tasks, etc.
[0057] The communication circuitry 1109 also includes TRx 1112 to enable communication with wireless networks 1117 using modulated electromagnetic radiation through a non-solid medium. TRx 1112 includes a receive signal path, which comprises circuitry to convert analog RF signals (e.g., an existing or received modulated waveform) into digital baseband signals to be provided to the modem 1110. The TRx 1112 also includes a transmit signal path, which comprises circuitry configurable to convert digital baseband signals provided by the modem 1110 into analog RF signals (e.g., modulated waveform) that will be amplified and transmitted via an antenna array including one or more antenna elements (not shown). The antenna array is coupled with the TRx 1112 using metal transmission lines or the like. The antenna array may be one or more microstrip antennas or printed antennas that are fabricated on the surface of one or more printed circuit boards; a patch antenna array formed as a patch of metal foil in a variety of shapes; a glass-mounted antenna array or “on-glass” antennas; or some other known antenna or antenna elements. The TRx 1112 may include one or more radios that are compatible with, and/or may operate according to one or more radio access technologies, protocols, and/or standards including those discussed herein.
[0058] Network interface circuitry/controller (NIC) 1116 may be included to provide wired communication to the network 1117 or to other devices using a standard network interface protocol. In most cases, the NIC 1116 may be used to transfer data over a network (e.g., network 1117) via a wired connection while the vehicle is stationary (e.g., in a garage, testing facility, or the like). The standard network interface protocol may include Ethernet, Ethernet over GRE Tunnels, Ethernet over Multiprotocol Label Switching (MPLS), Ethernet over USB, or may be based on other types of network protocols, such as Controller Area Network (CAN), Local Interconnect Network (LIN), DeviceNet, ControlNet, Data Highway+, PROFIBUS, or PROFINET, among many others. Network connectivity may be provided to/from the system 1100 via NIC 1116 using a physical connection, which may be electrical (e.g., a “copper interconnect”) or optical. The physical connection also includes suitable input connectors (e.g., ports, receptacles, sockets, etc.) and output connectors (e.g., plugs, pins, etc.). The NIC 1116 may include one or more dedicated processors and/or FPGAs to communicate using one or more of the aforementioned network interface protocols. In some implementations, the NIC 1116 may include multiple controllers to provide connectivity to other networks using the same or different protocols. For example, the system 1100 may include a first NIC 1116 providing communications to the cloud over Ethernet and a second NIC 1116 providing communications to other devices over another type of network.
[0059] The input/output (I/O) interface 1118 is configurable to connect or couple the system 1100 with external devices or subsystems. The external interface 1118 may include any suitable interface controllers and connectors to couple the system 1100 with the external components/devices, such as an external expansion bus (e.g., Universal Serial Bus (USB), FireWire, PCIe, Thunderbolt, Lightning™, etc.), used to connect system 1100 with external components/devices, such as sensor circuitry 1120, actuators 1122, electronic control units (ECUs) 1123, positioning system 1145, input device(s) 1186, and picture generation units (PGUs) 1130. In some cases, the I/O interface circuitry 1118 may be used to transfer data between the system 1100 and another computer device (e.g., a laptop, a smartphone, or some other user device) via a wired connection. I/O interface circuitry 1118 may include any suitable interface controllers and connectors to interconnect one or more of the processor circuitry 1102, memory circuitry 1104, storage circuitry 1108, communication circuitry 1109, and the other components of system 1100. The interface controllers may include, but are not limited to, memory controllers, storage controllers (e.g., redundant array of independent disk (RAID) controllers, baseboard management controllers (BMCs), input/output controllers, host controllers, etc.). The connectors may include, for example, busses (e.g., IX 1106), ports, slots, jumpers, interconnect modules, receptacles, modular connectors, etc. The I/O interface circuitry 1118 may also include peripheral component interfaces including, but not limited to, non-volatile memory ports, USB ports, audio jacks, power supply interfaces, on-board diagnostic (OBD) ports, etc.
[0060] The sensor circuitry 1120 includes devices, modules, or subsystems whose purpose is to detect events or changes in its environment and send the information (sensor data) about the detected events to some other device, module, subsystem, etc. Examples of such sensors 1120 include, inter alia, inertia measurement units (IMU) comprising accelerometers, gyroscopes, and/or magnetometers; microelectromechanical systems (MEMS) or nanoelectromechanical systems (NEMS) comprising 3-axis accelerometers, 3-axis gyroscopes, and/or magnetometers; level sensors; flow sensors; temperature sensors (e.g., thermistors); pressure sensors; barometric pressure sensors; gravimeters; altimeters; image capture devices (e.g., cameras); light detection and ranging (LiDAR) sensors; proximity sensors (e.g., infrared radiation detectors and the like); depth sensors; ambient light sensors; ultrasonic transceivers; microphones; etc.
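By way of a non-limiting illustration, a static 3-axis accelerometer reading from an IMU of sensor circuitry 1120 could be converted into a roll/pitch estimate using the standard gravity-vector formulas in the sketch below; the function name and the sample readings are placeholders, not part of the original disclosure.

    import math

    def tilt_from_accel(ax, ay, az):
        """Estimate roll and pitch (in degrees) from a static 3-axis
        accelerometer reading expressed in g, using the direction of
        the gravity vector (standard, sensor-agnostic formulas)."""
        roll = math.degrees(math.atan2(ay, az))
        pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
        return roll, pitch

    # Placeholder reading for a nearly level vehicle:
    print(tilt_from_accel(0.02, -0.01, 0.99))  # -> small roll/pitch angles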
[0061] Some of the sensor circuitry 1120 may be sensors used for various vehicle control systems, and may include, inter alia, exhaust sensors including exhaust oxygen sensors to obtain oxygen data and manifold absolute pressure (MAP) sensors to obtain manifold pressure data; mass air flow (MAF) sensors to obtain intake air flow data; intake air temperature (IAT) sensors to obtain IAT data; ambient air temperature (AAT) sensors to obtain AAT data; ambient air pressure (AAP) sensors to obtain AAP data; catalytic converter sensors including catalytic converter temperature (CCT) sensors to obtain CCT data and catalytic converter oxygen (CCO) sensors to obtain CCO data; vehicle speed sensors (VSS) to obtain VSS data; exhaust gas recirculation (EGR) sensors including EGR pressure sensors to obtain EGR pressure data and EGR position sensors to obtain position/orientation data of an EGR valve pintle; Throttle Position Sensor (TPS) to obtain throttle position/orientation/angle data; crank/cam position sensors to obtain crank/cam/piston position/orientation/angle data; coolant temperature sensors; and/or other like sensors embedded in vehicle 10. The sensor circuitry 1120 may include other sensors such as an accelerator pedal position sensor (APP), accelerometers, magnetometers, level sensors, flow/fluid sensors, barometric pressure sensors, and the like.
[0062] The positioning circuitry 1145 includes circuitry to receive and decode signals transmitted/broadcasted by a positioning network of a global navigation satellite system (GNSS). Examples of navigation satellite constellations (or GNSS) include United States’ Global Positioning System (GPS), Russia’s Global Navigation System (GLONASS), the European Union’s Galileo system, China’s BeiDou Navigation Satellite System, a regional navigation system or GNSS augmentation system (e.g., Navigation with Indian Constellation (NAVIC), Japan’s Quasi-Zenith Satellite System (QZSS), France’s Doppler Orbitography and Radio-positioning Integrated by Satellite (DORIS), etc.), or the like. The positioning circuitry 1145 comprises various hardware elements (e.g., including hardware devices such as switches, filters, amplifiers, antenna elements, and the like to facilitate OTA communications) to communicate with components of a positioning network, such as navigation satellite constellation nodes. In some embodiments, the positioning circuitry 1145 may include a Micro-Technology for Positioning, Navigation, and Timing (Micro-PNT) IC that uses a master timing clock to perform position tracking/estimation without GNSS assistance. The positioning circuitry 1145 may also be part of, or interact with, the communication circuitry 1109 to communicate with the nodes and components of the positioning network. The positioning circuitry 1145 may also provide position data and/or time data to the application circuitry, which may use the data to synchronize operations with various infrastructure (e.g., radio base stations), for turn-by-turn navigation, or the like. Additionally or alternatively, the positioning circuitry 1145 may be incorporated in, or work in conjunction with the communication circuitry to determine the position or location of the vehicle 10 by, for example, implementing the LTE Positioning Protocol (LPP), Wi-Fi positioning system (WiPS or WPS) methods, triangulation, signal strength calculations, and/or some other suitable localization technique(s).
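By way of a non-limiting illustration, decoded GNSS position data is often reported as NMEA 0183 sentences; the sketch below converts the latitude/longitude fields of a GGA sentence into decimal degrees. The sample sentence and its coordinates are placeholders, not part of the original disclosure.

    def parse_gga(sentence):
        """Convert the latitude (ddmm.mmmm) and longitude (dddmm.mmmm)
        fields of an NMEA 0183 GGA sentence into decimal degrees."""
        f = sentence.split(",")
        lat = float(f[2][:2]) + float(f[2][2:]) / 60.0
        if f[3] == "S":
            lat = -lat
        lon = float(f[4][:3]) + float(f[4][3:]) / 60.0
        if f[5] == "W":
            lon = -lon
        return lat, lon

    # Placeholder GGA sentence (checksum omitted for brevity):
    print(parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,"))
    # -> approximately (48.1173, 11.5167)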
[0063] Individual ECUs 1123 may be embedded systems or other like computer devices that control a corresponding system of the vehicle 10. In embodiments, individual ECUs 1123 may each have the same or similar components as the system 1100, such as a microcontroller or other like processor device, memory device(s), communications interfaces, and the like. In embodiments, the ECUs 1123 may include, inter alia, a Drivetrain Control Unit (DCU), an Engine Control Unit (ECU), an Engine Control Module (ECM), EEMS, a Powertrain Control Module (PCM), a Transmission Control Module (TCM), a Brake Control Module (BCM) including an anti-lock brake system (ABS) module and/or an electronic stability control (ESC) system, a Central Control Module (CCM), a Central Timing Module (CTM), a General Electronic Module (GEM), a Body Control Module (BCM), a Suspension Control Module (SCM), a Door Control Unit (DCU), a Speed Control Unit (SCU), a Human-Machine Interface (HMI) unit, a Telematic Control Unit (TCU), a Battery Management System (which may be the same or similar as battery monitor 1126), and/or any other entity or node in a vehicle system. In some embodiments, one or more of the ECUs 1123 and/or system 1100 may be part of or included in a Portable Emissions Measurement System (PEMS).
[0064] The actuators 1122 are devices that allow system 1100 to change the state, position, or orientation of, move, and/or control a mechanism or system in the vehicle 10. The actuators 1122 comprise electrical and/or mechanical devices for moving or controlling a mechanism or system, and convert energy (e.g., electric current or moving air and/or liquid) into some kind of motion. The actuators 1122 may include one or more electronic (or electrochemical) devices, such as piezoelectric biomorphs, solid state actuators, solid state relays (SSRs), shape-memory alloy-based actuators, electroactive polymer-based actuators, relay driver integrated circuits (ICs), and/or the like. The actuators 1122 may include one or more electromechanical devices such as pneumatic actuators, hydraulic actuators, electromechanical switches including electromechanical relays (EMRs), motors (e.g., linear motors, DC motors, brushless motors, stepper motors, servomechanisms, ultrasonic piezo motors with optional position feedback, screw-type motors, etc.), mechanical gears, magnetic switches, valve actuators, fuel injectors, ignition coils, wheels, thrusters, propellers, claws, clamps, hooks, an audible sound generator, and/or other like electromechanical components. As examples, the translation device or motor 1080 discussed previously may be among the one or more of the actuators 1122. The system 1100 may be configurable to operate one or more actuators 1122 based on one or more captured events and/or instructions or control signals received from various ECUs 1123 or system 1100. The system 1100 may transmit instructions to various actuators 1122 (or controllers that control one or more actuators 1122) to reconfigure an electrical network as discussed herein.
[0065] Additionally or alternatively, the processor circuitry 1102 and/or the ECUs 1123 are configurable to operate one or more actuators 1122 by transmitting/sending instructions or control signals to one or more actuators 1122 based on detected events. Individual ECUs 1123 may be capable of reading or otherwise obtaining sensor data from the sensor circuitry 1120, processing the sensor data to generate control system data, and providing the control system data to the system 1100 for processing. The control system information may be a type of state information discussed previously. For example, an ECU 1123 may provide engine revolutions per minute (RPM) of an engine of the vehicle 10, fuel injector activation timing data of one or more cylinders and/or one or more injectors of the engine, ignition spark timing data of the one or more cylinders (e.g., an indication of spark events relative to crank angle of the one or more cylinders), transmission gear ratio data and/or transmission state data (which may be supplied to the ECU 1123 by the TCU), real-time calculated engine load values from the ECM, etc.; a TCU may provide transmission gear ratio data, transmission state data, etc.; and the like.
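By way of a non-limiting illustration, control system data such as engine RPM, vehicle speed, and coolant temperature is commonly reported using the standard OBD-II (SAE J1979) scalings shown in the sketch below; the raw byte values are placeholders, not part of the original disclosure.

    def decode_rpm(a, b):
        # OBD-II PID 0x0C: engine speed = (256*A + B) / 4  [rpm]
        return (256 * a + b) / 4.0

    def decode_vehicle_speed(a):
        # OBD-II PID 0x0D: vehicle speed = A  [km/h]
        return a

    def decode_coolant_temp(a):
        # OBD-II PID 0x05: coolant temperature = A - 40  [deg C]
        return a - 40

    # Placeholder response bytes for illustration:
    print(decode_rpm(0x1A, 0xF0))         # -> 1724.0 rpm
    print(decode_vehicle_speed(0x3C))     # -> 60 km/h
    print(decode_coolant_temp(0x5A))      # -> 50 deg C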
[0066] The I/O devices 1186 may be present within, or connected to, the system 1100. The I/O devices 1186 include input devices and output devices including one or more user interfaces designed to enable user interaction with the system 1100 and/or peripheral component interaction with the system 1100 via peripheral component interfaces. The input devices include any physical or virtual means for accepting an input including, inter alia, one or more physical or virtual buttons (e.g., a reset button), a physical keyboard, keypad, mouse, touchpad, touchscreen, microphones, scanner, headset, and/or the like. It should be noted that user input may comprise voice commands, control input (e.g., via buttons, knobs, switches, etc.), an interface with a smartphone, or any combination thereof.
[0067] The output devices are used to show or convey information, such as sensor readings, actuator position(s), or other like information. Data and/or graphics may be displayed on one or more user interface components of the output devices. The output devices may include any number and/or combinations of audio or visual display, including, inter alia, one or more simple visual outputs/indicators (e.g., binary status indicators (e.g., light emitting diodes (LEDs)) and multi-character visual outputs), or more complex outputs such as display devices or touchscreens (e.g., Liquid Crystal Displays (LCD), LED displays, quantum dot displays, projectors, Head-Up Display (HUD) devices, etc.), with the output of characters, graphics, multimedia objects, and the like being generated or produced from the operation of the system 1100. The output devices may also include speakers or other audio emitting devices, printer(s), and/or the like. In some embodiments, the sensor circuitry 1120 may be used as an input device (e.g., an image capture device, motion capture device, or the like) and one or more actuators 1122 may be used as an output device (e.g., an actuator to provide haptic feedback or the like). In another example, near field communication (NFC) circuitry comprising an NFC controller coupled with an antenna element and a processing device may be included as an input device to read electronic tags and/or connect with another NFC-enabled device. Furthermore, the output devices include the HUD system 1000.
[0068] As alluded to previously, the HUD system 1000 is also included in the vehicle 1005. In this example, the HUD system 1000 comprises one or more PGUs 1130 (e.g., PGU 101 of Figure 1) and one or more optical elements (e.g., corrector 102 (or components thereof) and/or combiner 103 of Figure 1), where at least one of the optical elements is a display element (e.g., combiner 103 of Figure 1).
[0069] The PGU 1130 includes a projection unit (or “projector”) and a computer device. The computer device comprises one or more electronic elements that create/generate digital content to be displayed by the projection unit. The computer device may be the processor circuitry 1102, HUD processor 1010, OBU 1020, and/or a similar processing device as discussed herein. The digital content (e.g., text, images, video, etc.) may be any type of content stored by the storage circuitry 1108, streamed from a remote system/service (e.g., one or more vehicles, one or more roadside units (RSUs), app server(s), cloud computing service, edge network/computing service, etc.) via the network 1117 and/or the communication circuitry 1109, and/or based on outputs from various sensors 1120, ECUs 1123, and/or actuators 1122. The content to be displayed may include, for example, safety messages (e.g., collision warnings, emergency warnings, pre-crash warnings, traffic warnings, and the like), Short Message Service (SMS)/Multimedia Messaging Service (MMS) messages, navigation system information (e.g., maps, turn-by-turn indicator arrows), movies, television shows, video game images, and the like.
[0070] The projector may be the same or similar as the projection device 1030 discussed previously with respect to Figure 10, and in some implementations, the projection unit includes the imaging matrix 1050 discussed previously with respect to Figure 10. The projection unit (or “projector”) is a device or system that projects still or moving images onto the surface(s) of a display surface such as display device 1060 and/or virtual image planes 1070, and/or the surface(s) of an HOE 1131. The projected light may be projected onto the surface(s) via one or more reflection surfaces (e.g., mirrors) based on signals received from the computer device. The projection unit includes a light generator (or light source) to generate light based on the digital content, which is focused or (re)directed to one or more HOEs (e.g., display surface(s)). The projection unit also includes various electronic elements (or an electronic system) that convert digital content or signals obtained from the computer device into signals for controlling the light source to generate/output light of different colors and intensities. As examples, the projector is a light emitting diode (LED) projector, a laser diode projector, a liquid crystal display (LCD) projector and/or an LCD with laser illumination, a digital light processing (DLP) projector, a digital micro-mirror device (DMD), a liquid crystal on silicon (LCoS) matrix/projector, micromirror(s), a microelectromechanical (MEMS) and/or microoptoelectromechanical system (MOEMS) scanner, MEMS and/or MOEMS laser scanner, MEMS and/or MOEMS mirror(s), and/or any other suitable projection device, including those discussed elsewhere herein.

[0071] Additionally or alternatively, the PGU 1130 (or PGU 101) may also comprise scanning mirrors that copy the image pixel-by-pixel and then project the image for display. Here, the PGU 1130 (or PGU 101) performs scanning using one dual-axis mirror or two single-axis mirrors (sometimes referred to as “tip-tilt”), and/or by the use of some other matrix-type amplitude or phase modulation. Additionally or alternatively, scanning may be performed using an array of semiconductor lasers. Various drive forces may be used to operate such micromirror and/or MEMS/MOEMS devices, based on driving principles such as electromagnetic, electrostatic, thermo-electric, and/or piezo-electric effects.
[0072] The one or more optical elements of the HUD system 1000 include optical components that manipulate light such as lenses, filters, prisms, mirrors, beam splitters, diffraction gratings, multivariate optical elements (MOEs), and/or the like, including the various optical elements discussed previously with respect to corrector 102 of Figure 1. Additionally or alternatively, the one or more optical elements include a collimator (e.g., one or more lenses, apertures, a curved mirror, etc.) to narrow the beam of projected light from the projector. Here, “narrowing” the beam refers to causing the directions of motion to become more aligned in a specific direction (i.e., making collimated light or parallel rays), or to causing the spatial cross section of the beam to become smaller (e.g., a beam limiting device). Additionally or alternatively, the one or more optical elements include a relay lens assembly and another combiner element (which is different than the combiner used for displaying the projected image). The relay lens assembly comprises one or more relay lenses, which re-image images from the projector into an intermediate image that then reaches the HOE 1131 in or on the display element through a reflector.
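By way of general optics background (a standard relation, not specific to the disclosed assembly), the collimating and relay-imaging behavior described above can be summarized by the Gaussian thin-lens equation, in which a source placed at the focal plane yields collimated output:

\frac{1}{f} \;=\; \frac{1}{s_o} + \frac{1}{s_i}, \qquad m = -\frac{s_i}{s_o}, \qquad s_o \to f \;\Rightarrow\; s_i \to \infty \ \text{(collimated output)}

where f is the focal length, s_o and s_i are the object and image distances, and m is the lateral magnification of a relay stage; sign conventions vary between texts.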
[0073] The one or more optical elements include a holographic optical element (HOE) 1131. The HOE 1131 is an optical element that is used to produce holographic images. The at least one optical element that is the display element is the combiner 103 of Figure 1. The combiner 103 combines the generated/projected light with external (e.g., natural) light and/or combines different light paths into one light path to define a palette of colors. The combiner 103 may be a beam splitter or semi-transparent display surface located directly in front of a viewer (e.g., operator of vehicle 1005) that redirects the projected image from the projector in such a way as to see the FoV and the projected image at the same time. In addition to reflecting the projected light from the projector unit, the combiner 103 also allows other wavelengths of light to pass through the combiner 103. In this way, the combiner 103 (as well as the HOE 1131) mixes the digital images output by the projector with the viewed real-world scene to facilitate augmented reality.
[0074] The combiner (e.g., combiner 103) may have a flat surface or a curved surface (e.g., concave or convex) to aid in focusing the projected image. The HOE 1131 may be a transmissive optical element, where the transmitted beam (reference beam) hits the HOE 1131 and the diffracted beam(s) go through the HOE 1131. Alternatively, the HOE 1131 may be a reflective optical element, where the transmitted beam (reference beam) hits the HOE 1131 and the diffracted beam(s) reflects off of the HOE 1131 (e.g., the reference beam and diffracted beams are on the same side of the HOE 1131).
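By way of general background on diffractive behavior (standard relations, not a statement of the claimed design), the directions of the beams diffracted by a thin periodic HOE and the wavelength selectivity of a volume HOE are commonly summarized as:

d\,(\sin\theta_m - \sin\theta_i) = m\lambda \quad \text{(thin grating)}, \qquad 2\,n\,\Lambda\,\sin\theta_B = \lambda \quad \text{(volume Bragg condition, first order)}

where d (or \Lambda) is the grating period, \theta_i and \theta_m are the incidence and m-th order diffraction angles, \theta_B is the Bragg angle inside the medium, n is the refractive index, and \lambda is the vacuum wavelength; sign conventions differ between texts.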
[0075] The combiner 103 may be formed or made of a suitable material and includes an HOE 1131 that enables the combiner 103 to reflect the projected light while allowing external (natural) light to pass through the combiner 103. As examples, the combiner 103 may be a windshield or windscreen of the vehicle 1005, a separate semi-reflective surface mounted to a dashboard of the vehicle 1005, a switchable projection screen that switches between high contrast mode (e.g., a frosted or matte) and a transparent (e.g., holographic) mode, an HMD, helmet-mounted display, and/or the like.
[0076] In one implementation, the HOE 1131 is disposed on an inner part of the combiner 103. For example, the combiner 103 may be formed or made of a suitable material with a holographic film that enables the combiner 103 to reflect the projected light while allowing external (natural) light to pass through the combiner 103. Here, the HOE 1131 is the holographic film. The holographic film may cover an entirety of the combiner 103 or a selected portion of the combiner 103. The holographic film is a film composite including one or more substrate films, one or more photopolymer films, one or more protective films, and/or other films/substrates and/or combinations thereof. These films may be arranged in any desired arrangement or configuration. In another implementation, the HOE 1131 is disposed inside the combiner 103. For example, the combiner 103 may be formed or made of one or more suitable materials that surround the HOE 1131 using a duplex production process to form an A-B duplex (where A and B are materials) or a triplex production process to form an A-B-A triplex or an A-B-C triplex (where A, B, and C are materials).
[0077] In the aforementioned implementations, the materials or material composites (including materials A, B, and C) of the HOE 1131 are based on polycarbonate (PC), polyethylene terephthalate (PET), polybutylene terephthalate, polyethylene, polypropylene, cellulose acetate, cellulose hydrate, cellulose nitrate, cyclo olefin polymers (also referred to as cyclic olefin polymers), polystyrene, polyepoxides, polysulphone, cellulose triacetate (CTA), polyamide, polymethyl methacrylate, polyvinyl chloride, polyvinyl butyral (PVB), polydicyclopentadiene, thermoplastic polyurethane (TPU), and/or combinations thereof. Additionally or alternatively, composites of the aforementioned materials may be formed as film laminates, coextrudates, and/or transparent films (e.g. with little or no haze as described in ASTM International, “Standard Test Method for Haze and Luminous Transmittance of Transparent Plastics”, ASTM D1003, West Conshohocken, PA (2021) (“ASTM D1003-21”)). Additionally or alternatively, the suitable material of the combiner 103 may be formed from one or more of glass, plastic(s), polymer(s), and/or other similar material including any one or more of the aforementioned HOE 1131 materials.
[0078] The battery 1124a and/or power block 1124b may power the system 1100. In embodiments, the battery 1124a may be a typical lead-acid automotive battery, although in some embodiments, such as when vehicle 1005 is a hybrid vehicle, the battery 1124a may be a lithium ion battery, a metal-air battery, such as a zinc-air battery, an aluminum-air battery, a lithium-air battery, a lithium polymer battery, and the like. The battery monitor 1126 may be included in the system 1100 to track/monitor various parameters of the battery 1124a, such as a state of charge (SoCh), state of health (SoH), and state of function (SoF) of the battery 1124a. The battery monitor 1126 may include a battery monitoring IC, which may communicate battery information to the processor circuitry 1102 over the IX 1106.
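By way of a non-limiting illustration, one common way a battery monitoring IC such as battery monitor 1126 can estimate state of charge is coulomb counting, i.e., integrating the measured pack current relative to the nominal capacity:

\mathrm{SoC}(t) \;=\; \mathrm{SoC}(t_0) \;-\; \frac{1}{Q_{\mathrm{nom}}} \int_{t_0}^{t} I(\tau)\,d\tau

where I(\tau) is the discharge current (positive when discharging) and Q_{\mathrm{nom}} is the nominal capacity in the same charge units; practical monitors also correct for temperature and drift, which this simple relation omits.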
[0079] While not shown, various other devices may be present within, or connected to, the system 1100. For example, I/O devices, such as a display, a touchscreen, or keypad may be connected to the system 1100 via IX 1106 to accept input and display outputs. In another example, GNSS and/or GPS circuitry and associated applications may be included in or connected with system 1100 to determine a geolocation of the vehicle 1005. In another example, the communication circuitry 1109 may include a Universal Integrated Circuit Card (UICC), embedded UICC (eUICC), and/or other elements/components that may be used to communicate over one or more wireless networks 1117.
3. EXAMPLE IMPLEMENTATIONS
[0080] Some non-limiting examples are provided infra. The following examples pertain to further embodiments. Specifics in the examples may be used anywhere in one or more embodiments. All optional features of the apparatus(es) described herein may also be implemented with respect to a method or process.
[0081] Example A01 includes a correction optics subassembly of a holographic head-up display (hHUD) system, comprising: one or more optical elements disposed between a picture generation unit (PGU) of the hHUD system and a display element of the hHUD system, wherein a combination of surfaces of the one or more optical elements includes at least one reflective surface and at least two refractive surfaces.
[0082] Example A02 includes the correction optics subassembly of example A01 and/or some other example(s) herein, wherein a first refractive surface of the at least two refractive surfaces is a surface of a first optical element of the one or more optical elements, a second refractive surface of the at least two refractive surfaces is a surface of a second optical element of the one or more optical elements, and the at least one reflective surface is a surface of a third optical element of the one or more optical elements, wherein the first, second, and third optical elements are different from one another.
[0083] Example A03 includes the correction optics subassembly of example A02 and/or some other example(s) herein, wherein the third optical element is disposed between the first and second optical elements.
[0084] Example A04 includes the correction optics subassembly of examples A02-A03 and/or some other example(s) herein, wherein the first optical element is configured to guide light generated by the PGU to the third optical element, the third optical element is configured to reflect the guided light from the first optical element to the second optical element, and the second optical element is configured to guide the reflected light to the display element.
[0085] Example A05 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a shape substantially having rotational symmetry.
[0086] Example A06 includes the correction optics subassembly of example A05 and/or some other example(s) herein, wherein the hHUD system is an on-axis optical system.

[0087] Example A07 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a substantially aspherical shape.
[0088] Example A08 includes the correction optics subassembly of example A07 and/or some other example(s) herein, wherein the hHUD system is an off-axis optical system.
[0089] Example A09 includes the correction optics subassembly of example A04 and/or some other example(s) herein, wherein a shape of the second optical element is a substantially prismatic shape.
[0090] Example A10 includes the correction optics subassembly of example A09 and/or some other example(s) herein, wherein the hHUD system is an on-axis optical system or an off-axis optical system.
[0091] Example A11 includes the correction optics subassembly of examples A01-A10 and/or some other example(s) herein, wherein two refractive surfaces of the at least two refractive surfaces and one reflective surface of the at least one reflective surface are part of a single optical element of the one or more optical elements.
[0092] Example A12 includes the correction optics subassembly of example A11 and/or some other example(s) herein, wherein a first refractive surface of the two refractive surfaces is configured to guide light generated by the PGU to the one reflective surface, the one reflective surface is configured to reflect the guided light from the first refractive surface to a second refractive surface of the two refractive surfaces, and the second refractive surface is configured to guide the reflected light to the display element.
[0093] Example A13 includes the correction optics subassembly of example A12 and/or some other example(s) herein, wherein a shape of the second optical element is a freeform shape without rotational symmetry.
[0094] Example A14 includes the correction optics subassembly of examples A02-A13 and/or some other example(s) herein, wherein the second refractive surface is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
[0095] Example A15 includes the correction optics subassembly of examples A02-A14 and/or some other example(s) herein, wherein the first refractive surface is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
[0096] Example A16 includes the correction optics subassembly of examples A14-A15 and/or some other example(s) herein, wherein the freeform surface is formed based on a function selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial, hybrid stitched representations based on a combination of two or more functions selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial.
[0097] Example A17 includes the correction optics subassembly of examples A01-A16 and/or some other example(s) herein, wherein the one or more optical elements are arranged with respect to one another and with respect to the PGU and the display element to correct aberrations in images created by projected light from the PGU.
[0098] Example A18 includes the correction optics subassembly of examples A01-A17 and/or some other example(s) herein, wherein each optical element of the one or more optical elements is an optical element selected from a group consisting of a lens, prism, prismatic lens, mirror, and holographic optical element.
[0099] Example A19 includes the correction optics subassembly of examples A01-A18 and/or some other example(s) herein, wherein each optical element of the one or more optical elements is formed into a three-dimensional shape selected from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid, or a combination of any two or more shapes from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid.
[0100] Example A20 includes the correction optics subassembly of examples A01-A19 and/or some other example(s) herein, wherein the one or more optical elements includes a scattering surface on which light representative of a virtual image is projected by the PGU.
[0101] Example A21 includes the correction optics subassembly of example A20 and/or some other example(s) herein, wherein the scattering surface comprises a diffusion screen, a diffuser plate, or an array of microlenses.
[0102] Example A22 includes the correction optics subassembly of examples A01-A21 and/or some other example(s) herein, wherein the display element comprises a holographic optical element disposed on or inside a windshield of a vehicle.
[0103] Example A23 includes a holographic head-up display (hHUD) system, comprising: a picture generation unit (PGU); a display element; and the correction optics assembly of any one of examples A01-A22 and/or some other example(s) herein disposed between of the PGU and the display element.
[0104] Example B01 includes a compact holographic head-up display (hHUD) device, comprising: a picture generation unit (PGU); a combiner comprising a holographic optical element (HOE) with an optical power between 1.1 and 6.6 diopters; and a correction optics assembly, disposed between the PGU and the combiner, the correction optics assembly comprising at least one optical element with at least two refractive surfaces.
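For orientation only (a standard relation, not part of the example), optical power \Phi in diopters is the reciprocal of the focal length f in meters, so the stated range corresponds approximately to focal lengths between about 0.15 m and 0.91 m:

\Phi = \frac{1}{f}, \qquad f\big|_{\Phi = 1.1\,\mathrm{D}} \approx 0.91\ \mathrm{m}, \qquad f\big|_{\Phi = 6.6\,\mathrm{D}} \approx 0.15\ \mathrm{m}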
[0105] Example B02 includes the compact hHUD device of example B01 and/or some other example(s) herein, wherein the at least one optical element further comprises at least one reflective surface disposed between the at least two refractive surfaces.
[0106] Example B03a includes the compact hHUD device of example B02 and/or some other example(s) herein, further comprising: a refractive surface of the at least two refractive surfaces is positioned and/or oriented between the HOE and the at least one reflective surface; and another refractive surface of the at least two refractive surfaces is positioned and/or oriented between the PGU and the at least one reflective surface.
[0107] Example B03b includes the compact hHUD device of examples B02-B03a and/or some other example(s) herein, further comprising: at least one refractive optical element disposed between the HOE and the at least one reflective surface; and at least one refractive optical element disposed between the PGU and the at least one reflective surface.
[0108] Example B04 includes the compact hHUD device of examples B03a-B03b and/or some other example(s) herein, wherein a form of the at least one optical element is a form substantially having rotational symmetry.
[0109] Example B05 includes the compact hHUD device of examples B03a-B04 and/or some other example(s) herein, wherein the compact hHUD device is an on-axis optical system.
[0110] Example B06 includes the compact hHUD device of examples B03a-B03b and/or some other example(s) herein, wherein a shape of a surface of the at least one optical element is a substantially aspherical shape.
[0111] Example B07 includes the compact hHUD device of examples B03a-B03b, B06, and/or some other example(s) herein, wherein the compact hHUD device is an off-axis optical system.

[0112] Example B08 includes the compact hHUD device of example B03 and/or some other example(s) herein, wherein a form of the at least one optical element is a substantially prismatic form.
[0113] Example B09 includes the compact hHUD device of examples B03, B08, and/or some other example(s) herein, wherein the compact hHUD device is an on-axis optical system or an off-axis optical system.
[0114] Example B10 includes the compact hHUD device of example B02 and/or some other example(s) herein, wherein two refractive surfaces of the at least two refractive surfaces and one reflective surface of the at least one reflective surface are part of a single optical element of the at least one optical element.
[0115] Example B11 includes the compact hHUD device of example B02 and/or some other example(s) herein, wherein a shape of a surface of the at least one optical element is a freeform shape without rotational symmetry.
[0116] Example B12 includes the compact hHUD device of examples B02-B11 and/or some other example(s) herein, wherein at least one refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
[0117] Example B13 includes the compact hHUD device of example B12 and/or some other example(s) herein, wherein at least one other refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.

[0118] Example B14 includes the compact hHUD device of examples B12-B13 and/or some other example(s) herein, wherein the freeform surface is formed based on a function selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial, hybrid stitched representations based on a combination of two or more functions selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial.
[0119] Example B15 includes the compact hHUD device of examples B02-B14 and/or some other example(s) herein, wherein the correction optics assembly comprises a plurality of optical elements, the plurality of optical elements including the at least one optical element.
[0120] Example B16 includes the compact hHUD device of example B15 and/or some other example(s) herein, wherein the plurality of optical elements are arranged with respect to one another and with respect to the PGU and the combiner to correct aberrations in images created by projected light from the PGU.
[0121] Example B17 includes the compact hHUD device of examples B02-B16 and/or some other example(s) herein, wherein the at least one optical element comprises one or more of a lens, prism, prismatic lens, mirror, and a holographic optical element.
[0122] Example B18 includes the compact hHUD device of examples B02-B17 and/or some other example(s) herein, wherein the at least one optical element is formed into a three-dimensional shape selected from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid, or a combination of any two or more shapes from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsis, cone, cylinder, toroid.
[0123] Example B19 includes the compact hHUD device of examples B02-B18 and/or some other example(s) herein, wherein the correction optics assembly further comprises a scattering surface onto which light representative of a virtual image is projected by the PGU.
[0124] Example B20 includes the compact hHUD device of example B19 and/or some other example(s) herein, wherein the scattering surface comprises a diffusion screen, a diffuser plate, or an array of microlenses.
[0125] Example C01 includes a vehicle comprising: one or more control components for controlling operation of the vehicle; a windshield; and the holographic head-up display (hHUD) system of example A23 and/or the hHUD device of any one or more of examples B01-B20.
4. TERMINOLOGY
[0126] For the purposes of the present document, the phrase “A or B” means (A), (B), or (A and B). The phrases “A/B” and “A or B” mean (A), (B), or (A and B), similar to the phrase “A and/or B.” For the purposes of the present disclosure, the phrase “at least one of A and B” means (A), (B), or (A and B). The description may use the terms “embodiment” or “embodiments,” which may each refer to one or more of the same or different embodiments. Furthermore, the terms “comprising,” “including,” “having,” and the like, as used with respect to one or more embodiments, are synonymous, are generally intended as "open" terms (e.g., the term "including" should be interpreted as "including but not limited to," the term "having" should be interpreted as "having at least," the term "includes" should be interpreted as "includes but is not limited to," etc.), and specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The same applies to the terms “comprises,” “comprising,” “includes,” and/or “including.” The phrases “in various embodiments,” “in some embodiments,” and the like are used repeatedly; such phrases generally do not refer to the same embodiments, although they may. The present disclosure may use the phrases “in an embodiment,” “in embodiments,” “in some embodiments,” and/or “in various embodiments,” which may each refer to one or more of the same or different embodiments.
[0127] The terms “coupled,” “communicatively coupled,” along with derivatives thereof are used herein. The term “coupled” may mean two or more elements are in direct physical or electrical contact with one another, may mean that two or more elements indirectly contact each other but still cooperate or interact with each other, and/or may mean that one or more other elements are coupled or connected between the elements that are said to be coupled with each other. The term “directly coupled” may mean that two or more elements are in direct contact with one another. The term “communicatively coupled” may mean that two or more elements may be in contact with one another by a means of communication including through a wire or other interconnect connection, through a wireless communication channel or link, and/or the like.
[0128] The term “fabrication” refers to the creation of a metal structure using fabrication means. The term “fabrication means” as used herein refers to any suitable tool or machine that is used during a fabrication process and may involve tools or machines for cutting (e.g., using manual or powered saws, shears, chisels, routers, torches including handheld torches such as oxy-fuel torches or plasma torches, and/or computer numerical control (CNC) cutters including lasers, mill bits, torches, water jets, routers, etc.), bending (e.g., manual, powered, or CNC hammers, pan brakes, press brakes, tube benders, roll benders, specialized machine presses, etc.), assembling (e.g., by welding, soldering, brazing, crimping, coupling with adhesives, riveting, using fasteners, etc.), molding or casting (e.g., die casting, centrifugal casting, injection molding, extrusion molding, matrix molding, three-dimensional (3D) printing techniques including fused deposition modeling, selective laser melting, selective laser sintering, composite filament fabrication, fused filament fabrication, stereolithography, directed energy deposition, electron beam freeform fabrication, etc.), and PCB and/or semiconductor manufacturing techniques (e.g., silk-screen printing, photolithography, photoengraving, PCB milling, laser resist ablation, laser etching, plasma exposure, atomic layer deposition (ALD), molecular layer deposition (MLD), chemical vapor deposition (CVD), rapid thermal processing (RTP), and/or the like).
[0129] The term “fastener”, “fastening means”, or the like refers to a device that mechanically joins or affixes two or more objects together, and may include threaded fasteners (e.g., bolts, screws, nuts, threaded rods, etc.), pins, linchpins, r-clips, clips, pegs, clamps, dowels, cam locks, latches, catches, ties, hooks, magnets, molded or assembled joineries, and/or the like.
[0130] The term “lateral” refers to directions or positions relative to an object spanning the width of a body of the object, relating to the sides of the object, and/or moving in a sideways direction with respect to the object. The term “longitudinal” refers to directions or positions relative to an object spanning the length of a body of the object; relating to the top or bottom of the object, and/or moving in an upwards and/or downwards direction with respect to the object. The term “linear” refers to directions or positions relative to an object following a straight line with respect to the object, and/or refers to a movement or force that occurs in a straight line rather than in a curve. The term “lineal” refers to directions or positions relative to an object following along a given path with respect to the object, wherein the shape of the path is straight or not straight.

[0131] The terms “flexible,” “flexibility,” and/or “pliability” refer to the ability of an object or material to bend or deform in response to an applied force; the term “flexible” is complementary to “stiffness.” The term “stiffness” and/or “rigidity” refers to the ability of an object to resist deformation in response to an applied force. The term “elasticity” refers to the ability of an object or material to resist a distorting influence or stress and to return to its original size and shape when the stress is removed. Elastic modulus (a measure of elasticity) is a property of a material, whereas flexibility or stiffness is a property of a structure or component of a structure and is dependent upon various physical dimensions that describe that structure or component.
[0132] The term “wear” refers to the phenomenon of the gradual removal, damaging, and/or displacement of material at solid surfaces due to mechanical processes (e.g., erosion) and/or chemical processes (e.g., corrosion). Wear causes functional surfaces to degrade, eventually leading to material failure or loss of functionality. The term “wear” as used herein may also include other processes such as fatigue (e.g., the weakening of a material caused by cyclic loading that results in progressive and localized structural damage and the growth of cracks) and creep (e.g., the tendency of a solid material to move slowly or deform permanently under the influence of persistent mechanical stresses). Mechanical wear may occur as a result of relative motion occurring between two contact surfaces. Wear that occurs in machinery components has the potential to cause degradation of the functional surface and ultimately loss of functionality. Various factors, such as the type of loading, type of motion, temperature, lubrication, and the like may affect the rate of wear.
[0133] The term “circuitry” refers to a circuit or system of multiple circuits configurable to perform a particular function in an electronic device. The circuit or system of circuits may be part of, or include one or more hardware components, such as a logic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group), an Application Specific Integrated Circuit (ASIC), a field-programmable gate array (FPGA), programmable logic device (PLD), System-on-Chip (SoC), System-in-Package (SiP), Multi-Chip Package (MCP), digital signal processor (DSP), etc., that are configurable to provide the described functionality. In addition, the term “circuitry” may also refer to a combination of one or more hardware elements with the program code used to carry out the functionality of that program code. Some types of circuitry may execute one or more software or firmware programs to provide at least some of the described functionality. Such a combination of hardware elements and program code may be referred to as a particular type of circuitry.
[0134] As used herein, the term “element” may refer to a unit that is indivisible at a given level of abstraction and has a clearly defined boundary, wherein an element may be any type of entity. The term “entity” may refer to (1) a distinct component of an architecture or device, or (2) information transferred as a payload. As used herein, the term “device” may refer to a physical entity embedded inside, or attached to, another physical entity in its vicinity, with capabilities to convey digital information from or to that physical entity. The term “controller” may refer to an element or entity that has the capability to affect a physical entity, such as by changing its state or causing the physical entity to move.
[0135] The term “computer device” may describe any physical hardware device capable of sequentially and automatically carrying out a sequence of arithmetic or logical operations, equipped to record/store data on a machine readable medium, and transmit and receive data from one or more other devices in a communications network. A computer device may be considered synonymous to, and may hereafter be occasionally referred to, as a computer, computing platform, computing device, etc. The term “computer system” may include any type of interconnected electronic devices, computer devices, or components thereof. Additionally, the term “computer system” and/or “system” may refer to various components of a computer that are communicatively coupled with one another. Furthermore, the term “computer system” and/or “system” may refer to multiple computer devices and/or multiple computing systems that are communicatively coupled with one another and configurable to share computing and/or networking resources. Examples of “computer devices,” “computer systems,” etc. may include cellular phones or smart phones, feature phones, tablet personal computers, wearable computing devices, autonomous sensors, laptop computers, desktop personal computers, video game consoles, digital media players, handheld messaging devices, personal data assistants, electronic book readers, augmented reality devices, server computer devices (e.g., stand-alone, rack-mounted, blade, etc.), cloud computing services/systems, network elements, in-vehicle infotainment (IVI), in-car entertainment (ICE) devices, an Instrument Cluster (IC), head-up display (HUD) devices, onboard diagnostic (OBD) devices, dashtop mobile equipment (DME), mobile data terminals (MDTs), Electronic Engine Management System (EEMS), electronic/engine control units (ECUs), electronic/engine control modules (ECMs), embedded systems, microcontrollers, control modules, engine management systems (EMS), networked or “smart” appliances, machine-type communications (MTC) devices, machine-to-machine (M2M) devices, Internet of Things (IoT) devices, and/or any other like electronic devices. Moreover, the term “vehicle-embedded computer device” may refer to any computer device and/or computer system physically mounted on, built in, or otherwise embedded in a vehicle.
[0136] As used herein, the term “network element” refers to a physical computing device of a wired or wireless communication network that may be configurable to host a virtual machine. Furthermore, the term “network element” may describe equipment that provides radio baseband functions for data and/or voice connectivity between a network and one or more users. The term “network element” may be considered synonymous to and/or referred to as a networked computer, networking hardware, network equipment, network appliance, router, switch, hub, bridge, radio network controller, firewall, radio access network (RAN) node, base station, gateway, server, and/or any other like device.
[0137] The term “network element” may be considered synonymous to and/or referred to as a “base station.” As used herein, the term “base station” may be considered synonymous to and/or referred to as a node B, an enhanced or evolved node B (eNB), next generation nodeB (gNB), base transceiver station (BTS), access point (AP), roadside unit (RSU), etc., and may describe equipment that provides the radio baseband functions for data and/or voice connectivity between a network and one or more users. As used herein, the terms “vehicle-to-vehicle” and “V2V” may refer to any communication involving a vehicle as a source or destination of a message. Additionally, the terms “vehicle-to-vehicle” and “V2V” as used herein may also encompass or be equivalent to vehicle-to-infrastructure (V2I) communications, vehicle-to-network (V2N) communications, vehicle-to-pedestrian (V2P) communications, or V2X communications.
[0138] As used herein, the term “channel” refers to any transmission medium, either tangible or intangible, which is used to communicate data or a data stream. The term “channel” may be synonymous with and/or equivalent to “communications channel,” “data communications channel,” “transmission channel,” “data transmission channel,” “access channel,” “data access channel,” “link,” “data link,” “carrier,” “radiofrequency carrier,” and/or any other like term denoting a pathway or medium through which data is communicated. Additionally, the term “link” may refer to a connection between two devices through a Radio Access Technology (RAT) for the purpose of transmitting and receiving information.
[0139] As used herein, the term “optical element” refers to any component, object, substance, and/or material used for, or otherwise related to the genesis and propagation of light, the changes that light undergoes and produces, and/or other phenomena associated with the principles that govern the image-forming properties of various devices that make use of light and/or the nature and properties of light itself.
[0140] As used herein, the term “lens” refers to a transparent substance or material (usually glass) that is used to form an image of an object by focusing rays of light from the object. A lens is usually circular in shape, with two polished surfaces, either or both of which is/are curved and may be either convex (bulging) or concave (depressed). The curves are almost always spherical; i.e., the radius of curvature is constant.
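By way of background only (a standard textbook relation, not a limitation on any claimed element), the focal length of a thin lens in air is commonly approximated from its refractive index and the radii of curvature of its two surfaces by the lensmaker's equation:

$$\frac{1}{f} \approx (n-1)\left(\frac{1}{R_1} - \frac{1}{R_2}\right)$$

where $f$ is the focal length, $n$ is the refractive index of the lens material, and $R_1$, $R_2$ are the radii of curvature of the two surfaces (signed so that a surface convex toward the incoming light has a positive radius).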
[0141] As used herein, the term “mirror” refers to a surface of a material or substance that diverts a ray of light according to the law of reflection.
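As a minimal sketch of the law of reflection (a generic geometric relation; the vectors below are hypothetical example values and do not describe any particular mirror of the disclosed device), the reflected ray direction can be computed from the incident direction and the surface normal:

```python
import numpy as np

def reflect(incident: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect an incident ray direction about a surface normal (law of reflection).
    The normal need not be unit length; it is normalized here."""
    n = normal / np.linalg.norm(normal)
    return incident - 2.0 * np.dot(incident, n) * n

# Hypothetical example: a ray travelling along -z meets a 45-degree fold mirror.
ray = np.array([0.0, 0.0, -1.0])
mirror_normal = np.array([0.0, 1.0, 1.0])
print(reflect(ray, mirror_normal))  # -> approximately [0., 1., 0.]
```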
[0142] As used herein, the term “prism” refers to a transparent optical element with flat, polished surface(s) that refract light. Additionally or alternatively, the term “prism” refers to a polyhedron comprising an n-sided polygon base, a second base that is a translated copy (rigidly moved without rotation) of the first base, and n other faces joining corresponding sides of the two bases.
[0143] As used herein, the term “holographic optical element” or “HOE” refers to an optical component (e.g., mirrors, lenses, filters, beam splitters, directional diffusers, diffraction gratings, etc.) that produces holographic images using holographic imaging processes or principles, such as the principles of diffraction. The shape and structure of an HOE is dependent on the piece of hardware it is needed for, and the coupled wave theory is a common tool used to calculate the diffraction efficiency or grating volume that helps with the design of an HOE.
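As a hedged illustration of how coupled wave theory may be applied in HOE design, the sketch below evaluates Kogelnik's closed-form expression for the diffraction efficiency of a lossless, unslanted transmission volume grating at Bragg incidence; the material and geometry values are illustrative assumptions and are not taken from the disclosure:

```python
import math

def transmission_grating_efficiency(delta_n, thickness_m, wavelength_m, bragg_angle_rad):
    """Diffraction efficiency of a lossless, unslanted transmission volume grating
    at Bragg incidence, per Kogelnik's coupled wave theory: eta = sin^2(nu)."""
    nu = math.pi * delta_n * thickness_m / (wavelength_m * math.cos(bragg_angle_rad))
    return math.sin(nu) ** 2

# Illustrative values only: a 15 um recording layer with index modulation 0.02,
# replayed at 532 nm with a 30-degree Bragg angle inside the medium.
print(transmission_grating_efficiency(0.02, 15e-6, 532e-9, math.radians(30)))
```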
[0144] As used herein, the term “focus” refers to a point where light rays originating from a point on the object converge. Sometimes, the term “focus” may be referred to as an “image point”.

[0145] As used herein, the term “optical power” refers to the degree to which an optical element or optical system converges or diverges light. The optical power of an optical element is equal to the reciprocal of the focal length of that element. High optical power corresponds to short focal length. The SI unit for optical power is the inverse meter (m⁻¹), which is commonly referred to as a Diopter (or “Dioptre”). The term “optical power” is sometimes referred to as dioptric power, refractive power, focusing power, or convergence power.
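As a purely numerical illustration of this reciprocal relation (assuming propagation in air, and using the diopter range recited in claim 1 below):

$$P = \frac{1}{f} \;\Longleftrightarrow\; f = \frac{1}{P}, \qquad P = 1.1\ \mathrm{D} \;\Rightarrow\; f \approx 0.91\ \mathrm{m}, \qquad P = 6.6\ \mathrm{D} \;\Rightarrow\; f \approx 0.15\ \mathrm{m}.$$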
[0146] As used herein, the term “vergence” refers to the angle formed by rays of light that are not perfectly parallel to one another. Additionally or alternatively, the term “vergence” refers to the curvature of optical wavefronts. The terms “convergence”, “convergent”, and “converging” refer to light rays that move closer to the optical axis as they propagate. Additionally or alternatively, the terms “convergence”, “convergent”, and “converging” refer to wavefronts propagating toward a single point and/or wavefronts that yield a positive vergence. The terms “divergence”, “divergent”, and “diverging” refer to light rays that move away from the optical axis as they propagate. Additionally or alternatively, the terms “divergence”, “divergent”, and “diverging” refer to wavefronts propagating away from a single source point and/or wavefronts that yield a negative vergence. Typically, convex lenses and concave mirrors cause parallel rays to converge, and concave lenses and convex mirrors cause parallel rays to diverge.
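As a general optics relation (not a limitation of the embodiments), the vergence of a wavefront in a medium of refractive index $n$ at a distance $r$ from its point of convergence or divergence is commonly quantified as

$$V = \frac{n}{r},$$

so that, in air, a wavefront 0.5 m before its focus has a vergence of approximately +2 D, while a wavefront 0.5 m past its source point has a vergence of approximately −2 D.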
[0147] As used herein, the term “wavefront” refers to a set (locus) of all points where a wave has the same phase and/or a surface or medium over which an optical wave has a constant phase.

[0148] As used herein, the term “coincidence” in the context of optics refers to an instance of rays of light striking a surface at the same point and at the same time.
[0149] As used herein, the term “normal” refers to a line, ray, or vector that is perpendicular to a given object. As used herein, the term “normal ray” is the outward-pointing light ray perpendicular to the surface of an optical medium and/or optical element at a given point.
[0150] As used herein, the term “optical axis” refers to a line along which there is some degree of rotational symmetry in an optical system. Additionally or alternatively, the term “optical axis” refers to a straight line passing through the geometrical center of an optical element. The path of light ray(s) along the optical axis is perpendicular to the surface(s) of the optical element. The term “optical axis” may also be referred to as a “principal axis”. All other ray paths passing through the optical element and its optical center (the geometrical center of the optical element) may be referred to as “secondary axes”. The optical axis of a lens is a straight line passing through the geometrical center of the lens and joining the two centers of curvature of its surfaces. The optical axis of a curved mirror passes through the geometric center of the mirror and its center of curvature.
[0151] As used herein, the term “off-axis optical system” refers to an optical system in which the optical axis of the aperture is not coincident with the mechanical center of the aperture.

[0152] As used herein, the term “mechanical axis” refers to an axis that passes through the physical center of an optical element and/or is perpendicular to the outside edges of the optical element.
[0153] As used herein, the term “aperture” refers to a hole or an opening through which light travels. Additionally or alternatively, the “aperture” and focal length of an optical system determine the cone angle of a bundle of rays that come to a focus in the image plane.
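A minimal sketch of this relation follows, with purely illustrative aperture and focal-length values (not taken from the disclosure), showing how the aperture diameter and focal length set the half-angle of the converging ray cone:

```python
import math

def focus_cone_half_angle_deg(aperture_diameter_m, focal_length_m):
    """Half-angle of the ray bundle converging to focus, set by aperture and focal length."""
    return math.degrees(math.atan(aperture_diameter_m / (2.0 * focal_length_m)))

# Hypothetical numbers: a 25 mm clear aperture with a 100 mm focal length.
print(focus_cone_half_angle_deg(0.025, 0.100))  # ~7.1 degrees
```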
[0154] As used herein, the term “curvature” refers to a rate of change of direction of a curve with respect to distance along the curve.
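Expressed symbolically (as a general geometric relation), if $\mathbf{T}(s)$ is the unit tangent of a curve parameterized by arc length $s$, then

$$\kappa = \left\lVert \frac{d\mathbf{T}}{ds} \right\rVert, \qquad \kappa = \frac{1}{R} \ \text{for a circle of radius } R,$$

so a surface with a smaller radius of curvature bends light rays more strongly.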
[0155] As used herein, the term “spherical” refers to an object having a shape that is or is substantially similar to a sphere. A “sphere” is a set of all points in three-dimensional space lying at the same distance (the radius) from a given point (the center), or the result of rotating a circle about one of its diameters.
[0156] As used herein, the term “toroidal” refers to an object having a shape that is or is substantially similar to a torus. A “torus” is a surface of revolution generated by revolving a circle in three-dimensional space about an axis that is coplanar with the circle.
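One standard parameterization of such a torus, with major radius $R$ (distance from the axis of revolution to the center of the revolved circle) and minor radius $r$ (with $R > r$), is

$$x(u,v) = (R + r\cos v)\cos u,\qquad y(u,v) = (R + r\cos v)\sin u,\qquad z(u,v) = r\sin v,\qquad u,v \in [0, 2\pi).$$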
[0157] As used herein, the term “anamorphic surface” refers to a non-symmetric surface with bi-axial symmetry. As used herein, the terms “anamorphic element” and/or “anamorphic optical element” refer to an optical element with at least one anamorphic surface and/or an optical element with a combination of spherical, aspherical, and toroidal surfaces.
[0158] As used herein, the term “freeform surface” refers to a geometric element that does not have rigid radial dimensions. Additionally or alternatively, the term “freeform surface” refers to a surface with no axis of rotational invariance. Additionally or alternatively, the term “freeform surface” refers to a non-symmetric surface whose asymmetry goes beyond bi-axial symmetry, spheres, rotationally symmetric aspheres, off-axis conics, and toroids. Additionally or alternatively, the term “freeform surface” refers to a surface that may be identified by a comatic-shape component or higher-order rotationally variant terms of the orthogonal polynomial pyramids (or equivalents thereof). Additionally or alternatively, the term “freeform surface” refers to a specially shaped surface that refracts an incident light beam in a predetermined way. Freeform surfaces have more degrees of freedom in comparison with rotationally symmetric surfaces.

[0159] As used herein, the term “freeform optical element” and/or “FOE” refers to an optical element with at least one freeform surface. Additionally or alternatively, the term “freeform optical element” and/or “FOE” refers to an optical element that has no translational or rotational symmetry about axes normal to the mean plane. Additionally or alternatively, the term “freeform optical element” and/or “FOE” refers to an optical element with specially shaped surface(s) that refract an incident light beam in a predetermined way. In contrast to diffractive optical elements (DOEs), the FOE surface structure is smooth, without abrupt height jumps or high-frequency modulations. Similar to classical lenses, FOEs affect a light beam by refraction at their curved surface structures. FOE refraction behavior is determined by geometrical optics (e.g., ray tracing), in contrast to DOEs, which are described by a wave optical model. Various aspects of freeform optics are discussed in Rolland et al., "Freeform optics for imaging," Optica, vol. 8, pp. 161-176 (2021), which is hereby incorporated by reference in its entirety.
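As a hedged illustration only (the surface model, function name, and coefficient values below are hypothetical and do not represent the disclosed surfaces), a freeform sag might be represented as a base conic plus a bi-variate XY polynomial and evaluated as follows; any of the other representations mentioned above (orthogonal polynomials, NURBS, radial basis functions, etc.) could be substituted for the polynomial term:

```python
import numpy as np

def freeform_sag(x, y, c, k, coeffs):
    """Sag of an illustrative freeform surface: a base conic plus an XY polynomial.
    c      -- base curvature (1 / base radius of curvature)
    k      -- conic constant
    coeffs -- dict mapping (i, j) exponents to polynomial coefficients A_ij
    """
    r2 = x ** 2 + y ** 2
    base = c * r2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c ** 2 * r2))
    poly = sum(a * x ** i * y ** j for (i, j), a in coeffs.items())
    return base + poly

# Hypothetical coefficients, chosen only to show bi-variate (rotationally non-symmetric)
# terms that a sphere or rotationally symmetric asphere cannot represent.
coeffs = {(2, 0): 1.2e-4, (0, 2): -0.8e-4, (2, 1): 3.0e-6, (0, 3): -1.5e-6}
print(freeform_sag(5.0, -3.0, c=1.0 / 250.0, k=0.0, coeffs=coeffs))
```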
[0160] As used herein, the terms “rotational symmetry” and “radial symmetry” refer to a property of a shape or surface that looks the same after some rotation by a partial turn. An object's degree of rotational symmetry is the number of distinct orientations in which it looks exactly the same for each rotation.
[0161] As used herein, the terms “biaxial symmetry” or “bi-axial symmetry” refer to a property of a shape or surface that contains symmetrical designs on both horizontal and vertical axes.

[0162] As used herein, the term “optical aberration” and/or “aberration” refers to a property of optical systems and/or optical elements that causes light to be spread out over some region of space rather than focused to a point. An aberration can be defined as a departure of the performance of an optical system from a predicted level of performance (or the predictions of paraxial optics).
[0163] As used herein, the term “laser” refers to light amplification by stimulated emission of radiation. Additionally or alternatively, the term “laser” refers to a device that emits light through a process of optical amplification based on stimulated emission of electromagnetic radiation. The term “laser” as used herein may refer to the device that emits laser light, the light produced by such a device, or both.
[0164] As used herein, the terms “speckle noise”, “speckle pattern”, or “speckle” refer to a granular pattern of bright and dark regions of intensity that occurs when laser light is scattered (or reflected) from a rough surface.
[0165] As used herein, the term “diffuser” refers to any device or material that diffuses or scatters light in some manner. A “diffuser” may include materials that reflect light, translucent materials (e.g., glass, ground glass, reflon/reflow, opal glass, greyed glass, etc.), and/or other materials. The term “diffractive diffuser” refers to a diffuser or diffractive optical element (DOE) that exploits the principles of diffraction and refraction. As used herein, the term “speckle diffuser devices” (also referred to as “speckle diffusers”) refers to devices used in optics to destroy spatial coherence (or coherence interference) of laser light prior to reflection from a surface.
[0166] The above detailed description refers to the accompanying drawings, which show, by way of illustration, embodiments that may be practiced. The same reference numbers may be used in different drawings to identify the same or similar elements. Various operations may be described as multiple discrete operations in turn, in a manner that may be helpful in understanding embodiments; however, the order of description should not be construed to imply that these operations are order-dependent. The present disclosure may use perspective-based descriptions such as up/down, back/front, top/bottom, and the like. Such descriptions are merely used to facilitate the understanding and are not intended to restrict the application to the disclosed embodiments.
[0167] The foregoing description of one or more implementations provides illustration and description of various example embodiments, but is not intended to be exhaustive or to limit the scope of embodiments to the precise form disclosed. Modifications and variations are possible in light of the above teachings or may be acquired from practice of various embodiments. Where specific details are set forth in order to describe aspects of the disclosure, it should be apparent to one skilled in the art that the disclosure can be practiced without, or with variation of, these specific details. The description is thus to be regarded as illustrative instead of limiting.

Claims

1. A compact holographic head-up display (hHUD) device, comprising: a picture generation unit (PGU); a combiner comprising a holographic optical element (HOE) with an optical power between 1.1 and 6.6 diopters; and a correction optics assembly, disposed between the PGU and the combiner, the correction optics assembly comprising at least one optical element with at least two refractive surfaces.
2. The compact hHUD device of claim 1, wherein the at least one optical element further comprises at least one reflective surface disposed between the at least two refractive surfaces.
3. The compact hHUD device of claim 2, further comprising: at least one refractive optical element disposed between the HOE and the at least one reflective surface; and at least one refractive optical element disposed between the PGU and the at least one reflective surface.
4. The compact hHUD device of claim 3, wherein a form of the at least one optical element is a form substantially having rotational symmetry.
5. The compact hHUD device of claim 3 or 4, wherein the compact hHUD device is an on-axis optical system.
6. The compact hHUD device of claim 3, wherein a shape of a surface of the at least one optical element is a substantially aspherical shape.
7. The compact hHUD device of claim 3 or 6, wherein the compact hHUD device is an off-axis optical system.
8. The compact hHUD device of claim 3 or 7, wherein a form of the at least one optical element is a substantially prismatic form.
9. The compact hHUD device of claim 3, wherein the compact hHUD device is an on-axis optical system or an off-axis optical system.
10. The compact hHUD device of claim 2, wherein two refractive surfaces of the at least two refractive surfaces and one reflective surface of the at least one reflective surface are part of a single optical element of the at least one optical element.
11. The compact hHUD device of claim 2, wherein a shape of a surface of the at least one optical element is a freeform shape without rotational symmetry.
12. The compact hHUD device of any one of claims 2-11, wherein at least one refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
13. The compact hHUD device of claim 12, wherein at least one other refractive surface of the at least two refractive surfaces is a spherical surface, an aspherical surface, an anamorphic surface, or a freeform surface.
14. The compact hHUD device of claim 12 or 13, wherein the freeform surface is formed based on a function selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial, hybrid stitched representations based on a combination of two or more functions selected from a group consisting of radial basis function, basis spline, wavelet, non-uniform rational basis spline, orthogonal polynomial, non-orthogonal polynomial.
15. The compact hHUD device of any one of claims 2-14, wherein the correction optics assembly comprises a plurality of optical elements, wherein the plurality of optical elements includes the at least one optical element.
16. The compact hHUD device of claim 15, wherein the plurality of optical elements are arranged with respect to one another and with respect to the PGU and the combiner to correct aberrations in images created by projected light from the PGU.
17. The compact hHUD device of any one of claims 2-16, wherein the at least one optical element comprises one or more of a lens, prism, prismatic lens, mirror, and a holographic optical element.
18. The compact hHUD device of any one of claims 2-17, wherein the at least one optical element is formed into a three-dimensional shape selected from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsoid, cone, cylinder, toroid, or a combination of any two or more shapes from a group consisting of planar, sphere, asphere, prism, pyramid, ellipsoid, cone, cylinder, toroid.
19. The compact hHUD device of any one of claims 2-18, wherein the correction optics assembly further comprises a scattering surface onto which light representative of a virtual image is projected by the PGU.
20. The compact hHUD device of claim 19, wherein the scattering surface comprises a diffusion screen, a diffuser plate, or an array of microlenses.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2021/056977 WO2023007230A1 (en) 2021-07-30 2021-07-30 Compact holographic head-up display device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2021/056977 WO2023007230A1 (en) 2021-07-30 2021-07-30 Compact holographic head-up display device

Publications (1)

Publication Number Publication Date
WO2023007230A1 true WO2023007230A1 (en) 2023-02-02

Family

ID=77265128

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/056977 WO2023007230A1 (en) 2021-07-30 2021-07-30 Compact holographic head-up display device

Country Status (1)

Country Link
WO (1) WO2023007230A1 (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4218111A (en) * 1978-07-10 1980-08-19 Hughes Aircraft Company Holographic head-up displays
JPH02179604A (en) * 1988-12-29 1990-07-12 Shimadzu Corp Head up display device
US20040108971A1 (en) * 1998-04-09 2004-06-10 Digilens, Inc. Method of and apparatus for viewing an image
WO2014045340A1 (en) * 2012-09-18 2014-03-27 パイオニア株式会社 Optical element, light source unit, and headup display
US20160216517A1 (en) * 2014-01-21 2016-07-28 Osterhout Group, Inc. Compact optical system with improved illumination
US20180299672A1 (en) * 2015-10-09 2018-10-18 Maxell, Ltd. Projection optical system and head-up display device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CONI PHILIPPE ET AL: "The Future of Holographic Head-Up Display", IEEE CONSUMER ELECTRONICS MAGAZINE, IEEE, PISCATAWAY, NJ, USA, vol. 8, no. 5, 1 September 2019 (2019-09-01), pages 68 - 73, XP011743266, ISSN: 2162-2248, [retrieved on 20190830], DOI: 10.1109/MCE.2019.2923935 *
ROLLAND ET AL.: "Freeform optics for imaging", OPTICA, vol. 8, 2021, pages 161 - 176

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21752211

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE