WO2023192656A1 - Smart glasses with enhanced optical structures for augmented reality applications - Google Patents

Smart glasses with enhanced optical structures for augmented reality applications

Info

Publication number
WO2023192656A1
WO2023192656A1 (PCT/US2023/017227)
Authority
WO
WIPO (PCT)
Prior art keywords
user
eyepiece
forming
planar waveguide
pattern
Prior art date
Application number
PCT/US2023/017227
Other languages
English (en)
Inventor
Barry David Silverstein
Kieran Connor Kelly
Original Assignee
Meta Platforms Technologies, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/064,772 (external priority: US20230316672A1)
Application filed by Meta Platforms Technologies, LLC
Publication of WO2023192656A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Definitions

  • the present application is directed to smart glasses having enhanced optical structures that, in addition to having a selected optical functionality, provide an aesthetically appealing view to an onlooker. More specifically, embodiments as disclosed herein include smart glasses having visually appealing optical structures to provide augmented reality (AR) applications.
  • AR augmented reality
  • a device comprising: an image generation engine to generate multiple light rays forming an image; an eyepiece including a display to project the image to a user for an augmented reality application, the eyepiece comprising a planar waveguide configured to transmit the light rays from the image generation engine; at least one optical element configured to couple the light rays into, and to provide the light rays from, the planar waveguide and through an eyebox limiting a volume that includes a pupil of the user; and an onlooker-functional portion of the eyepiece surrounding the at least one optical element, and shaped according to a pre-selected format, optionally to make the eyepiece aesthetically appealing to an onlooker, when the device is worn by a user.
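For readers untangling the dense claim language above, the following minimal Python sketch maps out how the claimed components relate: an image generation engine feeds light into a planar waveguide inside the eyepiece, at least one optical element couples that light in and back out through the eyebox, and an onlooker-functional portion surrounds the optical element. This is purely an illustrative model; every class and field name here is invented and nothing in it comes from the application itself.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Eyebox:
    """Volume in front of the eyepiece that contains the user's pupil."""
    width_mm: float
    height_mm: float
    eye_relief_mm: float

@dataclass
class OpticalElement:
    """Couples light into and/or out of the planar waveguide (grating, reflector, PIC, metasurface, ...)."""
    kind: str          # e.g., "diffractive", "partially_reflective", "photonic_integrated_circuit"
    couples_in: bool   # redirects engine light into the waveguide
    couples_out: bool  # directs guided light out through the eyebox

@dataclass
class Eyepiece:
    waveguide_index: float                  # refractive index of the planar waveguide
    optical_elements: List[OpticalElement]  # user-functional optics
    onlooker_portion_format: str            # pre-selected decorative format surrounding the optics

@dataclass
class SmartGlassDevice:
    image_engine: str   # e.g., "spatial light modulator", "LED array", "point scanner"
    eyepiece: Eyepiece
    eyebox: Eyebox
```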
  • the image generation engine comprises any one of a spatial light modulator, a direct arrayed light source, a light emitting diode array, and a point scanner.
  • the image generation engine includes a structure embedded within the optical element.
  • the at least one optical element includes a photonic integrated circuit.
  • the at least one optical element includes a diffractive, refractive, or a meta material structure.
  • the image generation engine is either internal or external.
  • the at least one optical element includes a lens stack having a diffractive or holographic reflecting surface to provide the light rays through the eyebox.
  • the at least one optical element includes a diffractive element to couple light into the planar waveguide.
  • the at least one optical element includes an at least partially reflective element to couple light into the planar waveguide.
  • the at least one optical element includes a diffractive element to couple light out of the planar waveguide and through the eyebox.
  • a method of fabricating an eyepiece for an immersive reality device comprising: forming, on a planar waveguide, an optical pattern configured to transmit light propagating in the planar waveguide through an eyebox delimiting a volume that includes a pupil of a user of the immersive reality device; forming a functional pattern on the eyepiece, the functional pattern shaped according to a preselected format, that is optionally aesthetically appealing to an onlooker, when the immersive reality device is worn by a user; and forming a transition pattern between the optical pattern and the functional pattern.
  • forming the functional pattern on the eyepiece comprises adding a cutout layer on a surface of the eyepiece.
  • said surface of the eyepiece is configured to face an onlooker.
  • forming the functional pattern on the eyepiece comprises edging the functional pattern on the planar waveguide around the optical pattern.
  • forming the functional pattern comprises including an aesthetically pleasing feature in the pre-selected format.
  • forming the functional pattern comprises including a feature in the preselected format that is transparent to a light configured to propagate through the planar waveguide.
  • forming the functional pattern comprises including a feature in the preselected format that reflects a light, optionally to an onlooker, when a user is wearing the immersive reality device.
  • forming the transition pattern comprises including, in the transition pattern, a feature having a dimension that is larger than a wavelength of light configured to propagate through the planar waveguide.
  • forming the transition pattern comprises including a partially deformed feature from the optical pattern and a partially deformed feature from the pre-selected format.
  • the method may further comprise placing a display on a side of the eyepiece facing the user of the immersive reality device, to project an image to the user of the immersive reality device, for an immersive reality application.
  • the method may further comprise optically coupling a display in the eyepiece with an image generation engine to generate multiple light rays forming an image.
  • a device comprising: an image generation engine to generate multiple light rays forming an image; an eyepiece including a display to project the image to a user for an augmented reality application, the eyepiece comprising a planar waveguide configured to transmit the light rays from the image generation engine; at least one optical element configured to couple the light rays into, and to provide the light rays from, the planar waveguide and through an eyebox limiting a volume that includes a pupil of the user; and an onlooker-functional portion of the eyepiece surrounding the at least one optical element, and shaped according to a pre-selected format when the device is worn by a user.
  • a method of fabricating an eyepiece for an immersive reality device comprising: forming, on a planar waveguide, an optical pattern configured to transmit light propagating in the planar waveguide through an eyebox delimiting a volume that includes a pupil of a user of the immersive reality device; forming a functional pattern on the eyepiece, the functional pattern shaped according to a pre-selected format when the immersive reality device is worn by a user; and forming a transition pattern between the optical pattern and the functional pattern.
  • FIG. 1 illustrates an architecture for the use of a smart glass with enhanced structures for augmented reality/virtual reality (AR/VR), according to some embodiments.
  • AR/VR augmented reality/virtual reality
  • FIG. 2 illustrates a smart glass configured for AR/VR applications, according to some embodiments.
  • FIG. 3 illustrates a smart glass for AR/VR applications with a selectively diffractive layer that is cut out to form a user-functional portion having an aesthetically pleasing shape for an onlooker, according to some embodiments.
  • FIG. 4 illustrates a smart glass for AR/VR applications with a selectively reflective layer that is cut out to form a user-functional portion having an aesthetically pleasing shape for an onlooker, according to some embodiments.
  • FIG. 5 illustrates a smart glass for AR/VR applications with a coating that is cut out to an aesthetically pleasing shape for an onlooker to clear a user-functional portion, according to some embodiments.
  • FIG. 6 illustrates a user-functional portion and an onlooker-functional portion in an eyepiece of a smart glass for AR applications, according to some embodiments.
  • FIG. 7 is a flowchart illustrating steps in a method of fabricating an eyepiece for an immersive reality device, according to some embodiments.
  • Smart glasses for use in AR applications typically include optical elements and structures such as gratings, nano-structures, and beam-splitters embedded in the eyepieces.
  • these optical elements leak light from the display.
  • the optical structures reflect, refract, or diffract light from external sources.
  • the optical elements are visually pronounced and appear aesthetically unusual and unappealing.
  • some embodiments embrace the visibility of the optical element and either cover the entire eyepiece surface with it, with both user-functional portions and onlooker-functional portions, so that the surface looks uniform or gradient, or otherwise make it decorative and aesthetically appealing or acceptable to an onlooker.
  • a user-functional portion is morphed seamlessly into a decorative and aesthetically appealing or acceptable shape or pattern in an onlooker-functional portion of the eyepiece. In general, this means removing or softening any sharp lines of discontinuity and creating smooth, gradual changes in the functional shapes. This can relax the requirement from “perfectly” transparent glasses to an aesthetically pleasing compromise.
  • some embodiments include a gradient sunglass coating from top to bottom. This is considered acceptable and fashionable but is a non-uniform functional area of an overall eyepiece.
  • a user-functional portion may include a display component that provides the optical operation to deliver the AR optical display light to the eye.
  • the optical operation of the user-functional portion may create an undesirable optical artifact to an onlooker of the user wearing a smart glass as disclosed herein.
  • a user-functional portion may include odd, unusual, or unappealing shapes that are distracting and visible to the onlooker. This can make the user look unattractive and cause some discomfort or reluctance to wear the smart glass.
  • embodiments as disclosed herein include onlooker-functional portions that provide a matching “onlooker” appearance to the user-functional portion but are not optically coupled to the functional aspect of the display (e.g., providing a virtual image to the user).
  • an onlooker-functional portion provides a fully matched surface, or a segment of the lens or eyepiece that is attractively patterned, or a transitionary structure to match the aesthetic of the smart glass frame. While an onlooker-functional portion is not optically coupled to the frame or display and may not have a functionality from the user perspective, it is optically operational to match an outside perspective of the display component, and therefore is functional (e.g., optically functional) from the onlooker’s perspective.
  • a user-functional portion in the display or eyepiece of a smart glass may unintentionally release, scatter, or leak light to the onlooker. Accordingly, an onlooker-functional portion can be driven by either lost light from the display or by a separate simple lighting element to match the unintentional leakage from the user-functional portion in the display. In some embodiments, the onlooker-functional portion may provide either full lens matching leakage, or partial lens appealing pattern leakage as seen by the onlooker.
  • FIG. 1 illustrates an architecture 10 for the use of a smart glass 100 with enhanced structures 105-1 and 105-2 (hereinafter, collectively referred to as “enhanced structures 105”) for augmented reality/virtual reality (AR/VR), according to some embodiments.
  • Smart glass 100 includes a user-functional portion 105-1 that is formed in an aesthetically pleasing shape for an onlooker 102 (e.g., a flower profile, a Mickey Mouse profile, and the like).
  • User-functional portion 105-1 may include an optically active component 105-2 that directs an AR image to a user 101.
  • the AR image may be a computer-generated image provided in a dataset 103-1 to smart glass 100 by a mobile device 110 or a remote server 130 via wireless communication and a computer network 150 (e.g., dataset 103-2).
  • Remote server 130 may also retrieve or store data (e.g., dataset 103-3) from a database 152 that is communicatively coupled to the computer network.
  • Datasets 103-1, 103-2, and 103-3 will be collectively referred to, hereinafter, as “datasets 103.”
  • Communications module 118 may be configured to interface with network 150 to send and receive information, such as datasets 103, requests, responses, and commands to other devices on network 150.
  • communications module 118 can include, for example, modems or Ethernet cards.
  • Client devices 110 may in turn be communicatively coupled with a remote server 130 and a database 152, through network 150, and transmit/share information, files, and the like with one another (e.g., dataset 103-2 and dataset 103-3).
  • Network 150 may include, for example, any one or more of a local area network (LAN), a wide area network (WAN), the Internet, and the like.
  • the network can include, but is not limited to, any one or more of the following network topologies, including a bus network, a star network, a ring network, a mesh network, a star-bus network, tree or hierarchical network, and the like.
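The architecture above only states that datasets 103 travel between the smart glass, a mobile device, and a remote server over an ordinary network; it does not fix any transport or payload format. As a hedged illustration of that data flow, the sketch below fetches a hypothetical dataset over HTTP; the endpoint path, server name, and JSON payload are all assumptions made here for illustration.

```python
import json
import urllib.request

def fetch_dataset(server_url: str, dataset_id: str) -> dict:
    """Retrieve an AR image dataset (e.g., 'dataset 103-1') from a remote server.

    Hypothetical endpoint layout: the application only states that datasets are
    exchanged over a LAN, WAN, or the Internet, not how they are encoded.
    """
    with urllib.request.urlopen(f"{server_url}/datasets/{dataset_id}") as response:
        return json.load(response)

# Example (hypothetical server): a paired mobile device or the smart glass itself
# could call fetch_dataset("http://ar-server.example", "103-1") to obtain image data.
```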
  • the aesthetically pleasing shape of the user-functional portion 105-1 promotes a positive reaction towards user 101 from onlookers 102 and other people in the surroundings of user 101, or at least avoids a negative reaction produced by extraneous optical features reflected off of smart glass 100.
  • FIG. 2 illustrates a smart glass 200 configured for AR/VR applications, according to some embodiments.
  • smart glass 200 includes image generation engines 210-1 and 210-2 (hereinafter, collectively referred to as “image generation engines 210”) to generate multiple light rays forming an image, and eyepieces 207 having a display 235 to project the image to a user for an augmented reality application, all mounted on a frame 209.
  • eyepieces 207 include a planar waveguide 225 configured to transmit the image from image generation engines 210 (e.g., including an array source).
  • Eyepieces 207 include at least one optical element in a user-functional portion 205, the optical element configured to couple the light rays into, and to provide the light rays from, planar waveguide 225 and through eyeboxes 251-1 and 251-2 (hereinafter, collectively referred to as “eyeboxes 251”), limiting a volume that includes a pupil of the user.
  • user-functional portion 205 is shaped according to a pre-selected pattern or format, offering an aesthetically pleasing view to an onlooker.
  • the optical element includes a diffractive element built into the lens or eyepiece 207 to direct light to the eye through an output coupler to create eyeboxes 251.
  • the functional diffractive element will have a shape to support this.
  • a transition pattern into an aesthetically pleasing pattern that matches the functional element (e.g., the diffractive element) from an onlooker perspective can be added around user-functional portion 205 to provide a fashionably pleasing shape or uniformize the entire lens.
  • the functional element may include image optics to couple light from image generation engines 210 into planar waveguide 225.
  • a functional element may include a beamsplitter that reflects or redirects light rather than waveguiding and outcoupling it. The same treatment as above can be applied to form a transition portion around user-functional portion 205.
  • eyepieces 207 include a transparent substrate layer that is at least partially transmissive to ambient light.
  • the combination of the ambient light and a computer-generated image provided by image generation engines 210 creates an immersive, augmented reality view for the user.
  • image generation engines 210 are external and coupled into planar waveguide 225 via an input coupler like a prism or grating.
  • image generation engines 210 are structures embedded within the optical element (e.g., a lens).
  • the structure embedded in the optical element may include photonic integrated circuits, and diffractive, refractive, or meta material structures. Accordingly, while there may be no need for an input coupler, some embodiments may still include an output coupler.
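For orientation only (standard waveguide-combiner optics, not language from the application): a grating coupler with period Λ redirects the engine's light according to the grating equation, and the planar waveguide keeps that light confined only while it undergoes total internal reflection,

```latex
n_{\mathrm{out}}\sin\theta_m \;=\; n_{\mathrm{in}}\sin\theta_i \;+\; m\,\frac{\lambda}{\Lambda},
\qquad\qquad
\theta_{\mathrm{internal}} \;>\; \theta_c \;=\; \arcsin\!\left(\frac{1}{n_{\mathrm{wg}}}\right),
```

where λ is the free-space wavelength, m the diffraction order, and n_wg the waveguide index (air cladding assumed). An input coupler, whether a prism or a grating, must steer light beyond the critical angle θ_c; the output coupler reverses the process to send the guided light out through the eyebox.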
  • image generation engines 210 are either internal or external, but instead of being output-coupled, the light is diffractively or holographically reflected off a surface in a lens stack forming the optical elements.
  • FIG. 3 illustrates a smart glass 300 for AR/VR applications with a selectively diffractive layer 315 that includes a cut out 320 to form a user-functional portion 305 having an aesthetically pleasing shape for an onlooker, according to some embodiments.
  • selectively diffractive layer 315 includes a surface relief grating, a volume-holographic grating, a polarization grating, and the like, configured to direct the light rays out of a planar waveguide in the eyepieces 307 and into the eyebox of smart glass 300. Eyepieces 307 are mounted on a frame 309.
  • FIG. 4 illustrates a smart glass 400 with a frame 409 for AR/VR applications with a selectively reflective layer 415 that is cut out to form a user-functional portion 405 having an aesthetically pleasing shape for an onlooker, according to some embodiments.
  • selectively reflective layer 415 includes micro- or nano-structured reflection layers configured to reflect image light in a predetermined narrow wavelength band, a particular polarization state, or both, in a direction through an eyebox of smart glass 400 (e.g., eyeboxes 251).
  • the micro- or nano-structured reflection layer 415 may include a stack of multiple dielectric layers, or a two-dimensional array of plasmonic resonant structures, or a three-dimensional photonic crystal structure, a Bragg reflecting structure, or any combination thereof.
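As textbook background on how such a layer can be both reflective for the display and largely transparent to the world (this relation is not recited in the application): a periodic dielectric stack or volume Bragg structure with period Λ and effective index n_eff reflects most strongly near the Bragg wavelength,

```latex
\lambda_B \;\approx\; 2\, n_{\mathrm{eff}}\, \Lambda \cos\theta,
```

so display light in a narrow band around λ_B (and, if the structure is designed for it, in a particular polarization state) is redirected toward the eyebox, while ambient light at other wavelengths passes through mostly undisturbed.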
  • FIG. 5 illustrates a smart glass 500 with a frame 509 for AR/VR applications with a coating 515 that is cut out to an aesthetically pleasing shape 520 for an onlooker to clear a user-functional portion, according to some embodiments.
  • coating 515 may include a stray light reducing structured substrate to prevent undesirable stray light from passing through an eyebox (e.g., eyeboxes 251).
  • smart glass 500 may include a functional portion 505.
  • Functional portion 505 may include a uniform or gradient colored surface that is aesthetically pleasing to an onlooker. Accordingly, functional portion 505 may be cut out in an aesthetically pleasing shape or pattern disposed around an optical element in eyepieces 507.
  • FIG. 6 illustrates a user-functional portion 605-1 and an onlooker-functional portion 605-2 in an eyepiece 607 of a smart glass 600 for AR applications, according to some embodiments.
  • an optical element 606 includes an optical pattern 615 that is continuously deformed from an optically functional shape overlapping userfunctional portion 605-1 into an aesthetically pleasing shape overlapping onlooker-functional portion 605-2 via a transition pattern 617.
  • optical pattern 615 may include relief grating grooves or micro/nano reflectors.
  • transition pattern 617 may include similar elements but having a reduced refractive power (e.g., shallower grating grooves, lower index contrast, reduced reflectivity, and the like). Sharp ends and curvatures are gradually smoothened in transition pattern 617 to match, in the outer edges, a decorative or aesthetically pleasing shape that has little to no optical functionality but is visible to an onlooker.
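A minimal numeric sketch of the kind of gradual taper described for transition pattern 617 (the smoothstep blend, the region boundaries, and the groove depths below are assumptions chosen for illustration, not values from the application): the grating-groove depth, and with it the local diffraction strength, ramps smoothly from its user-functional value down to a shallow, essentially decorative value in the onlooker-functional region, avoiding any sharp line of discontinuity.

```python
import numpy as np

def transition_depth_profile(x_mm: np.ndarray,
                             taper_start_mm: float = 10.0,
                             taper_width_mm: float = 3.0,
                             depth_optical_nm: float = 300.0,
                             depth_decorative_nm: float = 30.0) -> np.ndarray:
    """Grating depth versus distance from the eyepiece center: full depth inside the
    user-functional region, a smoothstep taper across the transition pattern, and a
    shallow (weakly diffracting) depth in the onlooker-functional region."""
    t = np.clip((x_mm - taper_start_mm) / taper_width_mm, 0.0, 1.0)  # 0 = optical region, 1 = decorative region
    s = 3.0 * t**2 - 2.0 * t**3                                      # smoothstep: continuous value and slope
    return depth_optical_nm + (depth_decorative_nm - depth_optical_nm) * s

# Example: sample the depth every 0.5 mm from the eyepiece center outward.
x = np.arange(0.0, 20.0, 0.5)
print(np.round(transition_depth_profile(x), 1))
```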
  • FIG. 7 is a flowchart illustrating steps in a method 700 of fabricating an eyepiece for an immersive reality device, according to some embodiments.
  • Methods consistent with the present disclosure may include one or more steps in method 700 performed in a different order, simultaneously, quasi-simultaneously, or overlapping in time.
  • Step 702 includes forming, on a planar waveguide, an optical pattern configured to transmit light propagating in the planar waveguide through an eyebox delimiting a volume that includes a pupil of a user of the immersive reality device.
  • Step 704 includes forming a functional pattern on the eyepiece, the functional pattern shaped according to a pre-selected format that is aesthetically appealing to an onlooker when the immersive reality device is worn by a user.
  • step 704 includes adding a cutout layer on a surface of the eyepiece configured to face an onlooker.
  • step 704 includes edging the functional pattern on the planar waveguide around the optical pattern.
  • step 704 includes forming an aesthetically pleasing feature in the pre-selected format.
  • step 704 includes forming a feature in the pre-selected format that is transparent to a light configured to propagate through the planar waveguide.
  • step 704 includes having a feature in the pre-selected format that reflects a light to an onlooker when a user is wearing the immersive reality device.
  • Step 706 includes forming a transition pattern between the optical pattern and the functional pattern. In some embodiments, step 706 includes forming, in the transition pattern, a feature having a dimension that is larger than a wavelength of light configured to propagate through the planar waveguide. In some embodiments, step 706 includes forming a partially deformed feature from the optical pattern and a partially deformed feature from the pre-selected format.
  • step 706 includes placing a display on a side of the eyepiece facing the user of the immersive reality device, to project an image to the user of the immersive reality device, for an immersive reality application.
  • step 706 includes optically coupling a display in the eyepiece with an image generation engine to generate multiple light rays forming an image.
  • the phrase “at least one of” preceding a series of items, with the terms “and” or “or” to separate any of the items, modifies the list as a whole, rather than each member of the list (e.g., each item).
  • the phrase “at least one of” does not require selection of at least one item; rather, the phrase allows a meaning that includes at least one of any one of the items, and/or at least one of any combination of the items, and/or at least one of each of the items.
  • phrases “at least one of A, B, and C” or “at least one of A, B, or C” each refer to only A, only B, or only C; any combination of A, B, and C; and/or at least one of each of A, B, and C.
  • the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments. Phrases such as an aspect, the aspect, another aspect, some aspects, one or more aspects, an implementation, the implementation, another implementation, some implementations, one or more implementations, an embodiment, the embodiment, another embodiment, some embodiments, one or more embodiments, a configuration, the configuration, another configuration, some configurations, one or more configurations, the subject technology, the disclosure, the present disclosure, and other variations thereof and the like are used for convenience and do not imply that a disclosure relating to such phrase(s) is essential to the subject technology or that such disclosure applies to all configurations of the subject technology.
  • a disclosure relating to such phrase(s) may apply to all configurations, or one or more configurations.
  • a disclosure relating to such phrase(s) may provide one or more examples.
  • a phrase such as an aspect or some aspects may refer to one or more aspects and vice versa, and this applies similarly to other foregoing phrases.
  • a reference to an element in the singular is not intended to mean “one and only one” unless specifically stated, but rather “one or more.”
  • Pronouns in the masculine include the feminine and neuter gender (e.g., her and its) and vice versa.
  • the term “some” refers to one or more.
  • Underlined and/or italicized headings and subheadings are used for convenience only, do not limit the subject technology, and are not referred to in connection with the interpretation of the description of the subject technology. Relational terms such as first and second and the like may be used to distinguish one entity or action from another without necessarily requiring or implying any actual such relationship or order between such entities or actions.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

A wearable device for augmented reality applications includes an image generation engine to generate multiple light rays forming an image, and an eyepiece including a display to project the image to a user for an augmented reality application. The eyepiece includes a planar waveguide configured to transmit the light rays from the image generation engine. The wearable device also includes at least one optical element configured to couple the light rays into, and to provide the light rays from, the planar waveguide and through an eyebox limiting a volume that includes a pupil of the user, and a user-functional portion of the eyepiece including the at least one optical element, the user-functional portion being shaped according to a pre-selected format that is aesthetically appealing to an onlooker when the immersive reality device is worn by a user.
PCT/US2023/017227 2022-04-01 2023-04-01 Smart glasses with enhanced optical structures for augmented reality applications WO2023192656A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202263326566P 2022-04-01 2022-04-01
US63/326,566 2022-04-01
US18/064,772 US20230316672A1 (en) 2022-04-01 2022-12-12 Smart glasses with enhanced optical structures for augmented reality applications
US18/064,772 2022-12-12

Publications (1)

Publication Number Publication Date
WO2023192656A1 (fr)

Family

ID=86286195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/017227 WO2023192656A1 (fr) Smart glasses with enhanced optical structures for augmented reality applications

Country Status (1)

Country Link
WO (1) WO2023192656A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH062369U (ja) * 1992-06-12 1994-01-14 Central Glass Co., Ltd. Holographic ornament
DE4234567A1 (de) * 1992-10-09 1994-04-14 Telefilter Tft Gmbh Reflective surface with image pattern
US20160054565A1 (en) * 2013-03-29 2016-02-25 Sony Corporation Information processing device, presentation state control method, and program
US20190086674A1 (en) * 2017-09-21 2019-03-21 Magic Leap, Inc. Augmented reality display with waveguide configured to capture images of eye and/or environment
US20190139290A9 (en) * 1990-12-07 2019-05-09 Dennis J. Solomon Integrated 3d-d2 visual effects display
US10545714B2 (en) * 2015-09-04 2020-01-28 Samsung Electronics Co., Ltd. Dual screen head mounted display
US20200088999A1 (en) * 2018-09-17 2020-03-19 Apple Inc. Electronic Device With Inner Display and Externally Accessible Input-Output Device
US10613330B2 (en) * 2013-03-29 2020-04-07 Sony Corporation Information processing device, notification state control method, and program

Similar Documents

Publication Publication Date Title
US11921292B2 (en) Systems, devices, and methods for waveguide-based eyebox expansion in wearable heads-up displays
US10379358B2 (en) Optical see-through display element and device utilizing such element
AU2010238336B2 (en) Optical waveguide and display device
JP6720315B2 (ja) Imaging light guide with reflective turning array
US20170102544A1 (en) Reducing stray light transmission in near eye display using resonant grating filter
EP3443403A1 Waveguide exit pupil expander with improved intensity distribution
US11099404B2 (en) Systems, devices, and methods for embedding a holographic optical element in an eyeglass lens
CN104956253A (zh) 具有眼睛处方的透视近眼式显示器
US11314092B2 (en) Systems, devices, and methods for light guide based wearable heads-up displays
WO2021098374A1 (fr) Guide d'ondes à réseau de diffraction pour réalité augmentée
US20210356909A1 (en) Systems, devices, and methods for side lobe control in holograms
CN110036235A (zh) 具有用于再循环光的外围侧面几何形状的波导
WO2022093391A1 (fr) Réflecteur à large bande pour ensemble guide d'ondes dans un visiocasque
US20230316672A1 (en) Smart glasses with enhanced optical structures for augmented reality applications
WO2023192656A1 (fr) Smart glasses with enhanced optical structures for augmented reality applications
FI128552B (en) Wavelength display element with reflector surface
de la Perrière Understanding waveguide-based architecture and ways to robust monolithic optical combiner for smart glasses
CN113189704A (zh) 一种光波导及近眼显示系统
CN113189779A (zh) 一种阵列光波导模组及增强现实显示设备
US20240012261A1 (en) Compact beam expander for vr/ar headsets
US20230359050A1 (en) Low residual layer thickness waveguide with high-index coating
US11892640B1 (en) Waveguide combiner with stacked plates
US10890770B2 (en) Waveguide having partial reflector
CN218788114U (zh) 一种近眼显示设备
Ryu et al. Development of a light guide with a stair micromirror structure for augmented reality glasses

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23721082

Country of ref document: EP

Kind code of ref document: A1