WO2023009657A1 - Blue light mitigation and color correction in augmented reality (ar), virtual reality (vr), or extended reality (xr) systems


Info

Publication number
WO2023009657A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
lens
reality
display system
optical material
Prior art date
Application number
PCT/US2022/038569
Other languages
French (fr)
Inventor
Davin Saderholm
Original Assignee
Eyesafe Inc.
Priority date
Filing date
Publication date
Application filed by Eyesafe Inc. filed Critical Eyesafe Inc.
Publication of WO2023009657A1 publication Critical patent/WO2023009657A1/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00Optical elements other than lenses
    • G02B5/20Filters
    • G02B5/22Absorbing filters
    • G02B5/223Absorbing filters containing organic substances, e.g. dyes, inks or pigments
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility

Definitions

  • the present disclosure relates to blue light filtering and color correction for display systems used for augmented reality (AR), virtual reality (VR), or extended reality (XR) systems.
  • AR augmented reality
  • VR virtual reality
  • XR extended reality
  • Head-borne augmented reality (AR) systems are poised to be the new "eye gate” for devices in order to increase the hands-free capability, to allow for multi-screen capability, and to allow transmission of information to an augmented reality space. These qualities are currently not possible with present stationary or “geographically located” screens.
  • Tasks that require interaction with electronic devices and displays can be separated from the actual electronics, allowing for truly mobile access to data and interaction with computer interfaces. These tasks can include providing on-site access to assembly or training animations such as, for example, how to assemble an engine on the fly, or allowing medical personnel to access test data, such as MRI data, in situ on a patient without the need for display screens or keyboards. Additionally, in vehicles, safety information can be displayed dependent upon the position of the driver's head, and the information can be provided remotely for AR and virtual reality (VR), which is also one of the ultimate goals of extended reality (XR). In some embodiments using these technologies, unsafe spaces can be "entered" in simulated but real settings, and real work with real things can be evaluated with a remote outcome or action.
  • assembly or training animations such as, for example, how to assemble an engine on the fly or allowing medical personnel to access test data, such as MRI data in situ on a patient without the need for display screens or keyboards.
  • safety information can be displayed dependent upon the position of the driver's head
  • the present disclosure relates to articles and methods directed toward the application of display and display covers for display devices and reality altering devices.
  • Various embodiments of a display system for use with reality displays comprise an electronic reality display device.
  • the electronic reality display device may have a backlight unit, wherein the backlight unit comprises a light-emitting array included in a projector, a reflector adjacent to the light-emitting array, a first lens for guiding wavelengths based on at least the light emitting array, and an optical material in the backlight unit comprising at least one light conversion material or at least one light absorbing material.
  • the at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm
  • the electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof.
  • the electronic reality display devices are part of a network for accessing software and other display reality platforms.
  • the display system comprises a liquid crystal panel or a diode array illumination assembly. At least one projector and the first lens have the at least one light conversion material or at least one light absorbing material, and the at least one light conversion material or at least one light absorbing material includes dyes for at least one of light absorption and color correction.
  • the system may further have a second lens for combining wavelengths for output, wherein the second lens has the optical material for light absorption and color correction.
  • the electronic reality display device further has a combiner and a dye is applied to the optical material for absorbing blue light at the combiner.
  • the light conversion materials or light absorbing materials are solubly or insolubly dispersed throughout the optical material
  • the projector is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panels of glass and the light-emitting array.
  • the optical material is on a relay lens.
  • the optical material comprises index matched light conversion materials or light absorbing materials.
  • the optical material may be an optical film or an optical coating that is on the first lens or incorporated onto the first lens.
  • the optical film may have inorganic nanoparticles that are index matched to the optical material and coupled to an organic adhesive applied to the optical material.
  • the electronic reality display device absorbs light from at least one of outside of the electronic reality device, an output display, or a combination thereof.
  • the electronic reality display device may have a holographic waveguide wherein the projector projects light to the first lens, which reflects the light along a waveguide to the holographic waveguide.
  • the first lens may be a prism for at least one of guiding the light waves, combining the light waves, or any combination thereof.
  • a dye may be applied to the optical material for absorbing blue light at the first lens.
  • a shutter containing the optical material is located between the projector and a waveguide lens.
  • the at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm.
  • the electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof.
  • the method further includes providing a dye to the optical material for absorbing blue light at the first lens.
  • the method may further include providing the optical material on the projector, which is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panel of glass and the light-emitting array.
  • the method may further include having at least one brightness enhancing layer, and providing a polarizing filter adjacent to the at least one brightness enhancing layer.
  • FIG. 1 is an illustration of an embodiment of the use of blue light mitigation and color correction in a real-world display.
  • FIG. 2 is an illustration of embodiments of block diagrams of AR in projector/lens/combiner systems.
  • FIGs. 3a and 3b are illustrations of embodiments of visual examples of opportunities of the disclosed technologies in typical AR systems.
  • FIG. 4 is an illustration comparing virtual reality (VR) technology with augmented reality/extended reality (AR/XR).
  • VR virtual reality
  • AR/XR augmented reality/extended reality
  • FIG. 5 is an illustration of a basic waveguide concept showing embodiments of the disclosed system that include use of blue light mitigation and color correction that can be applied to a projection system, light reflection waveguides/lenses, and/or waveguide combiner lens systems.
  • FIG. 6 is an illustration of embodiments of diagrams of VR/XR in projector/lens/active shutter/combiner systems.
  • FIG. 7 is a graph of light absorption at identified wavelengths.
  • FIG. 8 is an illustration of embodiments of projector/freeform prism combiner systems.
  • the present disclosure relates to articles and methods directed toward the mitigation of hazardous blue light in AR, VR, and/or XR display devices.
  • Various embodiments of the disclosed systems and methods are described in detail with reference to the drawings, wherein like reference numerals may represent like parts and assemblies throughout the several views. Reference to various embodiments does not limit the scope of the systems disclosed herein. Additionally, any examples set forth in this specification are not intended to be limiting and merely set forth some of the many possible embodiments of the disclosed systems. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but these are intended to cover applications or embodiments without departing from the spirit or scope of the disclosure. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting.
  • the current disclosure provides a blue light/color management filter that can be inserted into the image projection chain of an AR, VR, or XR system — likely a head-borne apparatus or headset designed to interact with one or both of the wearer’s eyes.
  • the blue light filter and color correction technology can come in various forms including, but not limited to dyes, films, deposited layers, coated layers, or tints included in the actual lenses or waveguides of the AR, VR, XR apparatus.
  • the blue light filter can act on the key high energy visible light (HEV) portion of the spectrum, which includes the blue light portion from 415 nm to 455 nm, which is known to be toxic to the retina, and the blue-turquoise portion from 460 nm to 500 nm, which is linked to non-visual physiological responses, to provide both eye health protection and minimize the circadian rhythm impact often caused by blue light on users of portable electronics with visible screens.
  • HEV high energy visible light
  • the blue light filter can affect the visible appearance of the visible screen, the color correction technology acts to re-adjust the color spectrum so that the user experiences a near restored color spectrum in the display and/or displayed images.
  • the blue light filtration and color filtration methodology presented here uses a non-software-based approach with the dyes, coatings, deposited layers, etc.
  • the blue light/color management filter may include some elements like photochromic or electrochromic systems to enhance functionality over a broader set of use conditions including bright light or dim light conditions and user-specified settings/adjustments.
  • the current disclosure provides a combination of blue light mitigation with the addition of color correction to the displayed/projected image per the current Eyesafe technology applied to embedded display filters and display aftermarket filters.
  • a virtual image is diffused by a system comprising a light source (screen, display or microdisplay), a lens (vary-focus lens or ocular lens or a relay optical lens module or optical prism), a light combiner or freeform combiner, and also, optionally, a tunable lens, a top active shutter and a front active shutter.
  • At least one light conversion material or at least one light absorbing material can be structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm at the level of the light source, optical lens or relay optical lens module, tunable lens, optical prism, and combiners.
  • the passive methods may use dyes or photochromic versions of the dyes incorporated into prisms, optical combiners, or lenses that are situated between the light emission source and the wearer's eye.
  • this disclosure addresses the challenges of the future by applying a filter to smart eyewear, such as smart glasses, halo glasses/lenses, and VR goggle sets for an immersive experience.
  • smart eyewear such as smart glasses, halo glasses/lenses, and VR goggle sets for an immersive experience.
  • These smart optical devices may be coupled to other computing devices.
  • Some optical solutions may be extensions of glasses, computers, and gaming consoles.
  • Head-borne "augmented" reality systems are poised to be the new "eye gate" for devices to increase hands-free capability, allow for multi-screen capability, and to transmit information to an augmented reality space for purposes not currently possible with stationary or "geographically located" screens.
  • the disclosure relates to more than smart glasses and applies to all optics systems, including helmet-mounted systems, goggles, glasses, and halos. When applied to this changed-reality viewing, the resulting view is that the projected or emitted image remains visible in combination with the 360° view outside of the device.
  • Different fields and applications can use the light filtering, such as gaming or aviation simulation. The technology can be used not just in simulators but in actual travel and training, and even in medical applications where a patient can walk through an internal health issue or surgical procedure, viewing the entire plan through special optical sources. Something similar is possible for vehicle drivers: eliminating blind spots will allow more autonomous driving systems to be implemented under more reasonable human supervision.
  • BL blue light
  • color correction may be defined as follows: "A" the projection system near the light source display, "B" the light reflection waveguides/lenses, and "C" the waveguide/combiner lens.
  • a projector may be combined with a lens in a combiner system.
  • a projection source "lens" may include a BL and color correction opportunity: dyes, coatings, or depositions to manage UV transmission and color. These materials may be on any of the lenses, combiners, or shutters.
  • the transmission may then travel through an image focusing lens or lenses (similarly with the dyes, coatings, depositions to manage UV transmission and color).
  • the image may then travel and reflect off an image viewing waveguide.
  • the image viewing waveguide may combine with the viewed image in space. This is then the viewable image received by the user's eye.
  • VR application where the image is confined in the headset or near eye displays (“NED”) to create a virtual reality.
  • the VR NED filters and corrects the light; the display then transmits through a lens to the human eye.
  • filtering and correcting light at the display or lens is possible.
  • in an Augmented Reality NED, the image is viewed through the lens to augment the reality of the actual space in front of the viewer.
  • the display once again transmits the projected light through a lens, and this is then combined with the outside (reality) optics, so that the real-world scene is combined with virtual information.
  • the projected light is emitted from LEDs or other light sources.
  • a freeform prism to combine light with projected light.
  • Light or image is projected through a projection source “lens” to a freeform prism combiner.
  • the combiner combines the projection with the viewed image in space through a corrector lens.
  • the optical prism(s) or waveguide may result in the image viewable by the user.
  • the blue light and color correction opportunity uses dyes, coatings, or depositions to manage UV transmission and color.
  • a source display may project at a point through a lens to a flat lens.
  • the projection may reflect off a holographic reflective optic (around 2 mm) down along the flat lens.
  • the projection may deflect off a second holographic reflective optics and be received or visible to the user.
  • Areas of blue light and color correction on a real-world diagram use a projection system, light reflection waveguides/lenses, and waveguide combiner lens.
  • the waveguide embodiment may be applied to goggles or lenses in a functional layout.
  • the image may project, such as through an image system. This may implement a POD.
  • the light may reflect off, for example, light reflection waveguides/lenses. There may be more than one light reflection waveguide/lens in the coupled-in optics.
  • the waveguide combiner lens may then couple out the optics viewable by the user.
  • the equipment may include, but is not limited to, the designs for multi-layered AR, AR glasses, VR goggles, AR (add-on) equipment, XR headsets, NEDs, simulators, attachments thereto, and halos.
  • Computing devices may be placed on the equipment or in communication with a system for control and operations.
  • AR systems may use the BL filtering.
  • a NIR sensor may be used.
  • a freeform prism may be utilized for reflecting light.
  • the light may exit or be received by the eye (e.g. the pupil), and the eye may see the resulting virtual image.
  • the display screen may generate an image that is transmitted through a vary-focus lens.
  • a combiner adds the reality into the display or allows the user to see reality (at least a portion). The combination of these then produces a virtual image.
  • in the second situation, multiple image sources project through an ocular lens.
  • the combiner takes the image through the ocular lens with reality, and the result is that the user sees the virtual image through the combiner.
  • in the third situation, a display source projects through a relay optical lens module, which then transmits through an active shutter to the light combiner. The user looks through a tunable lens, seeing see-through real scenes after they are transmitted through a front active shutter and the light combiner along with the display source result.
  • the system may include a projector, lens, active shutter and combiner.
  • the projection source "lens" may first receive the light or projection, which then travels to a relay optical lens system. The image or light is then transmitted to a top active shutter. Afterward, the light is reflected off a light combiner to a tunable lens. The light combiner passes some light or control from and to a front active shutter. Both the shutter-controlled light and the reflected (or deflected) light together hit the tunable lens, which then results in the image as seen or received by the user's eye.
  • Each of the lenses and shutters has a BL and color correction opportunity. To achieve this, dyes, coatings, or depositions to manage UV transmission and color are used.
  • FIG. 1 is an illustration of an embodiment of the use of blue light mitigation and color correction in a real-world or wearable display.
  • a user may wear display glasses for many virtual reality (“VR”), augmented reality (“AR”), and extended reality (“XR”) displays.
  • display glasses 100 can be worn by a user. Applying optical materials or dyes to elements of display glasses 100 may help to absorb blue light generated by an image source 102. Added dyes, materials, or coatings may use compounds or elements that are microparticles or nanoparticles, so that they are not visible and do not interfere with the image. The emitted light may then be reflected off of at least one lens 104.
  • Lens 104 may be a waveguide lens that reflects the light toward lenses of glasses 106 so that it is viewable by the user. Light may be coupled into the system either from the environment outside of the glasses or by adding additional light sources. In the embodiment of Figure 1, lenses of the glasses 106 may be transparent, allowing light to enter. In other embodiments, lenses of glasses 106 may be opaque, so that the user cannot see the environment outside of the display glasses 100. Lenses of glasses 106 may couple the images together by receiving the projected display image from image source 102 or reflector 104 and allowing the user to see the environment outside, which is the combined or coupled image viewable by the user.
  • Optical material or dye may be applied to image source 102, filtering light at the source.
  • the dye, film, coating, or deposition material may target blue light or other harmful light, absorbing in a range for instance of 400-500 nm.
  • Another point of light absorption may be at the light reflection lenses (waveguides/reflectors) 104, so that the light moving away from waveguide(s)/reflector(s) 104 may contain less harmful blue light.
  • Another possible point of light absorption may be lenses of glasses 106.
  • FIG. 2 is an illustration of embodiments of block diagrams of AR in projector, lens, and combiner systems.
  • points of light absorption are identified by the components of the AR light circuit.
  • Light source 202 shines through projection source “lens” 204 and projected image or light 216 continues through an image focusing lens or lenses 206.
  • the light transmission is then reflected by a reflector or combiner 212.
  • the user 214 can see the image light 218, but also external or other images 210. This is another source of "external" light that travels through the combiner 212.
  • External images or light 210 and projected light 216 are received by the user's eye as the combined image or augmented image is viewed.
  • adding absorption dye, material, or coatings to manage UV transmission and certain colors may help to protect the user’s eyes.
  • Points of adding such material may include at the projection source lens 204 and image focusing lens(es) 206.
  • Adding absorption dye, material, or particles at image viewing combiner or waveguide 212 may further filter projected image light as well as external or environmental light such that the light that reaches the user’s eye has less harmful light in the target wavelength range, such as 400-500 nm or other ranges for UV light.
  • FIG. 3a illustrates a first possible arrangement that may have a display screen 302 emitting light.
  • the display screen can be a light- emitting array, light crystals, projection display, or any light emitting device.
  • a lens 304, such as a vary-focus lens.
  • the lens may be one of a plurality of lenses and may be a first of the plurality.
  • the light may shine through lens 304.
  • lens 304 may be a focus lens where the light waves are directed in a direction at an angle from the direction that the respective wavelength enters lens 304 and towards combiner 310.
  • the user of the AR device may look through an opening or at a point near combiner 310.
  • a user 306 may see the image at the exit pupil 308; the sight from the user's pupil extends in a cone shape with its tip at the pupil. In this way, user 306 sees the virtual image through combiner 310, which augments or superimposes the virtual projected image originating at display screen 302, creating a virtual image 312 that may be one layer, more than one layer, or a plurality of layers.
  • the resulting virtual image 312 is then a combination of the view without the AR device and the projected image.
  • User 306 is exposed to harmful light emitted from the device by viewing virtual image 312, and possibly from the outside environment beyond AR system 301a.
  • an optical material in AR system 301a then filters and/or absorbs the light waves, and in some embodiments converts them to different wavelengths, which may reduce exposure to harmful light.
  • Such an optical material can be applied to display screen 302 (or projector).
  • the optical material may also be applied to lens 304 to absorb light transmitted through lens 304.
  • optical material may also or alternatively be applied at combiner 310 to filter light received by user 306.
  • the optical material may include dye material that absorbs, converts, etc., light waves of a particular wavelength, such as within the range of 400-500 nm.
  • The optical material may be applied to the various elements; in other embodiments it may be part of an element as a layer or within its material. In other examples, it may be a coating applied to the respective element.
  • a second embodiment, AR system 301b, illustrates an arrangement that may have a display screen 320 emitting light.
  • Display screen 320 may be comprised of more than one image source that emits light. In this way, there may be parts of the AR image that are output at a display, or it may be the same or similar image 328.
  • the display screen or image source, which can be one or more image sources 320, can be a light-emitting array, light crystals, projection display, or any light emitting device. Once emitted, light may travel to a lens 322 such as a vary-focus lens.
  • the lens may be one of a plurality of lenses and may be a first of the plurality.
  • the light may shine through lens 322.
  • lens 322 may be a focus lens where the light waves are directed in a direction at an angle from the direction that the respective wavelength enters lens 322 and towards combiner 326.
  • User 324 of the AR device may look through an opening or at a point near combiner 326. User 324 may see the image at the exit pupil 308; the sight from the user's pupil extends in a cone shape with its tip at the pupil. In this way, user 324 sees the virtual image 328 through combiner 326, which augments or superimposes the virtual projected image originating at display screen(s) 320, creating a virtual image 328 that may be one layer, more than one layer, or a plurality of layers. The resulting virtual image 328 is then a combination of the view without the AR device and projected image(s) 320.
  • an optical material in AR system 301b filters and/or absorbs the light waves, and in some embodiments converts them to different wavelengths, which may reduce exposure to harmful light.
  • Such an optical material can be applied to display screen 320, or all of the multiple image sources 320 (or projector).
  • the optical material may also be applied to lens 322 to absorb light transmitted through lens 322.
  • optical material may also or alternatively be applied at the combiner 326 to filter light received by the user.
  • the optical material may include dye material that absorbs, converts, etc., light waves of a particular wavelength, such as within the range of 400-500 nm.
  • a display source 334 may project an image, or a light emission, and may include optical material 332. The light from display source 334 is emitted to an optical relay lens module 336, which helps with light emission and waveguide transmission, and in some instances with direction, amplification, and conversion.
  • Optical relay lens 336 may also have optical material 332 on it in some embodiments to further assist with absorbing blue light. After the optical relay lens 336, light goes through a top active shutter 340. Top active shutter 340 may control the light transmission amount through this area for a period of time.
  • Light waves 344 that make up the virtual image emitted from display source 334 may then travel through the top active shutter 340, which may be controlled by software to allow certain light levels for periods of time, to tunable lens 350.
  • User 342 can see the image through tunable lens 350. In AR systems, user 342 may also see outside of the device and the environment 346 around user 342.
  • the light and view from environment 346 may enter through a front active shutter 348, which controls the light and timing of the view of user 342.
  • the incoming environment light 352 may join with the display source 334 (projector) image or light 344, and both enter through tunable lens 350 to a display of the combination viewable by user 342.
  • Tunable lens 350 may be movable for optimizing the resulting output image viewed by user 342.
  • AR system 301c outputs the combination of external environment 352 and display image 344. Because there is more than one source of light, light absorption material may be placed at points to absorb harmful light. In the embodiment of 301c, absorption material, such as optimizing material or light absorbing dyes, may be placed at display source 334. Relay optical lens module 336 may also have optimizing material to further absorb light from display source 334. The display light may be further absorbed at the top active shutter. Additionally, in some embodiments, front active shutter 348 may further absorb harmful light waves. Both external/environment light 352 and display light 344 may then further be optimized by applying optimizing material at tunable lens 350. Application of optical material, dye, encapsulant, or other light absorbing material at any of these points in AR system 301c can absorb harmful light in wavelengths 400-500 nm and optimize the output light received or viewed by user 342.
  • FIG. 3b illustrates using a prism in an AR system for guiding light.
  • Fig. 3b illustrates user 368.
  • a freeform prism 372 can be used, and the shape of the prism is determined by the direction and guiding needs of the AR system. Prisms reflect and deflect light.
  • a freeform corrector 366 may be used to transmit certain wavelengths into the prism to help the prismatic light movement and guide the wavelengths.
  • a display, such as a microdisplay 362 may add light into prism 372 at prism side 374 in certain embodiments.
  • An NIR LED 364 may also be used to add light entering or leaving prism 372.
  • "NIR" is near-infrared sensing radiation and can sense a variety of things, from moisture and light to other elements.
  • Blue light absorbing dye 370 may be applied in certain embodiments to one or more points to optimize light and reduce harmful light.
  • Dye or optical material may be applied on prism 372. It may also be applied to NIR sensor 360, display/microdisplay 362, and NIR LED 364. Microdisplays may be on a millimeter scale or smaller.
  • the absorption material or dye absorbs blue light, or targeted light, so that the viewable light that reaches the user’s eye 368 has reduced amounts of blue light.
  • the image can be enhanced by adding other light wavelengths to correct the color and improve the image quality, such as with freeform corrector 366, or by adding an external light source, such as NIR LED 364.
  • FIG 4 is an illustration comparing virtual reality (VR) technology with augmented reality/extended reality (AR/XR).
  • VR 402 is where the image is confined in the headset to create a virtual reality.
  • the display 404 projects an image, and the light 410 travels through a lens 406 to be viewed or received by a user's eye 408.
  • display 420 projects light 430 through lens 422, received by a user's eye 426.
  • External light enters the headset and combines with projected light 430 in an optical combiner 424.
  • NED near eye displays
  • adding absorption dye at displays 404 and 420 is important, and possibly at lenses 406 and 422.
  • Absorption material may also be added to optical combiner 424. With absorption dye added to one or more locations, harmful light is removed from the light received by eye 408 and eye 426 in each respective system.
  • Figure 5 is an illustration of a basic waveguide concept 500 showing embodiments of the disclosed system that include use of blue light mitigation and color correction that can be applied to a projection system, light reflection waveguides/lenses, and/or waveguide combiner lens systems.
  • light is emitted by a source display 518 of the projection system and shines through a lens 520 to a light reflection waveguide/lens 522.
  • a reflector or waveguide 516 may be adhered above the surface of the (first) waveguide/lens 522.
  • the emitted image or light shines down the waveguide/lens 522 to a second area containing holographic reflective optics or lens 510, which may be added at a different end of the light reflection waveguide(s)/lens(es) 522.
  • An enlarged view 512 shows a portion of the holographic reflective optics 504.
  • a 2 mm width 514 may be one embodiment of a possible sizing.
  • the reflected image or light 508 may reflect off of the second lens 510, or the waveguide or combiner lens and be viewed by the user 506.
  • color absorbing material may be added at the projection system, including the source display 518 and first lens 516.
  • Other possible absorption location(s) may include at light reflection waveguide/lens 522.
  • Holographic reflective optics may also have added absorption material. With the addition at one or more of these locations, reflected light 508 will have less harmful light after absorption.
  • FIG. 6 is an illustration of embodiments of diagrams of VR/XR in projector/lens/active shutter/combiner systems 600 and is similar to Figure 2, but with the addition of shutters.
  • FIG. 6 is an illustration of embodiments of block diagrams of VR/XR in projector, lens, and combiner systems.
  • points of light absorption are identified by the components of the AR light circuit.
  • Light source 602 shines through projection source “lens” 604 and the projected image or light 612 continues through a relay optical lens or lenses 608.
  • Light 612 then travels through a first or top active shutter 610, which may control the amount and timing of light that is transmitted.
  • Light combiner 614 also allows light from external source 616 to join projected light 612. External light may first enter through a front active shutter 618, which also controls the amount and timing of light transmission, and is then joined at light combiner 614. External images or light 616 and projected light 612 may then transmit through a tunable lens 618 just before reaching user 620 and are received by the user's eye as the combined image or augmented image is viewed.
  • Points for adding such material may include the projection source lens 604 and relay optical lens system 608. Other possible locations of absorption are the top active shutter 610, front active shutter 616, light combiner 614, and tunable lens 618. Adding absorption dye, material, or particles at these locations may further filter projected image light as well as external or environmental light such that the light that reaches the user's eye has less harmful light in the target wavelength range, such as 400-500 nm.
  • Figure 7 is a graph of light absorption in the range of 400-500 nm. Figure 7 shows the absorption for various added dyes, including dyes absorbing blue, green, and red light.
  • the ranges vary for each of the added dyes and the resulting light emission at wavelengths listed on the X-Axis.
  • Figure 7 is a general illustration and one result of testing, and the range or absorption/emission curve may change based on the amount and type of dye used for wavelength absorption; a simple numerical sketch of such an absorption curve follows this list.
  • Figure 8 is an illustration of embodiments of projector/freeform prism combiner systems 800.
  • light source 806 may emit light 814 via a display source 808 and transmit it to a prism 810, which reflects the light within the prism. Reflected prism light 816 may then be viewed by the user 804.
  • a corrector lens 812 may be included, and light transmitted through corrector lens 812 may be directed through prism 810 as an optical prism waveguide 818.
  • absorption material such as dyes, coatings, or depositions to manage UV transmission and color may be added at location points 802, such as source display 808 and prism 810, so that reflected light 816 may be optimized before being seen by user 804.
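The absorption behavior summarized in FIG. 7 can be approximated numerically. The following Python sketch (an illustration, not part of the disclosure) models a single dye as one Gaussian absorption band in the 400-500 nm region and converts it to transmittance in a Beer-Lambert fashion; the band center, width, and peak absorbance are assumed values, not figures taken from the graph.

    # Toy model of a dye absorption curve like the one graphed in FIG. 7.
    # All numeric parameters below are illustrative assumptions.
    import math

    def dye_transmittance(wl_nm, center=450.0, width=30.0, peak_absorbance=0.5):
        """Transmittance of a dye with one Gaussian absorption band."""
        absorbance = peak_absorbance * math.exp(-((wl_nm - center) / width) ** 2)
        return 10 ** (-absorbance)   # Beer-Lambert style conversion

    for wl in range(400, 701, 50):
        print(f"{wl} nm: transmittance = {dye_transmittance(wl):.2f}")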

Abstract

A display system and method are disclosed that include an electronic reality display device and a backlight comprising a light-emitting array. An electronic reality display device may have a backlight unit, which has a light-emitting array included in a projector, a reflector adjacent to the light-emitting array, and a first lens for guiding wavelengths based on at least the light-emitting array. The system may have an optical material in the backlight unit with at least one light conversion material or at least one light absorbing material, configured to reduce hazardous blue light emissions between 400 nm and 500 nm. Electronic reality display devices may be at least one of a virtual reality device, an augmented reality device, and an extended reality device, and may have one or more optical materials that incorporate one or more light conversion or light absorbing materials positioned between the layers of the disclosed display device.

Description

BLUE LIGHT MITIGATION AND COLOR CORRECTION IN AUGMENTED REALITY
(AR), VIRTUAL REALITY (VR), OR EXTENDED REALITY (XR) SYSTEMS
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to U.S. Provisional Patent Application No. 63/226,368, filed on July 28, 2021, the entire disclosure of which is hereby incorporated by reference.
FIELD OF THE DISCLOSURE
The present disclosure relates to blue light filtering and color correction for display systems used for augmented reality (AR), virtual reality (VR), or extended reality (XR) systems.
BACKGROUND
Head-borne augmented reality (AR) systems are poised to be the new "eye gate" for devices in order to increase the hands-free capability, to allow for multi-screen capability, and to allow transmission of information to an augmented reality space. These qualities are currently not possible with present stationary or "geographically located" screens.
Tasks that require interaction with electronic devices and displays can be separated from the actual electronics, allowing for truly mobile access to data and interaction with computer interfaces. These tasks can include providing on-site access to assembly or training animations such as, for example, how to assemble an engine on the fly, or allowing medical personnel to access test data, such as MRI data, in situ on a patient without the need for display screens or keyboards. Additionally, in vehicles, safety information can be displayed dependent upon the position of the driver's head, and the information can be provided remotely for AR and virtual reality (VR), which is also one of the ultimate goals of extended reality (XR). In some embodiments using these technologies, unsafe spaces can be "entered" in simulated but real settings, and real work with real things can be evaluated with a remote outcome or action.
SUMMARY
The present disclosure relates to articles and methods directed toward the application of display and display covers for display devices and reality altering devices. Various embodiments of a display system for use with reality displays comprise an electronic reality display device.
The electronic reality display device may have a backlight unit, wherein the backlight unit comprises a light-emitting array included in a projector, a reflector adjacent to the light-emitting array, a first lens for guiding wavelengths based on at least the light emitting array, and an optical material in the backlight unit comprising at least one light conversion material or at least one light absorbing material. In some embodiments, the at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm, and the electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof.
The electronic reality display devices are part of a network for accessing software and other display reality platforms. The display system comprises a liquid crystal panel or a diode array illumination assembly. At least one projector and the first lens have the at least one light conversion material or at least one light absorbing material, and the at least one light conversion material or at least one light absorbing material includes dyes for at least one of light absorption and color correction. The system may further have a second lens for combining wavelengths for output, wherein the second lens has the optical material for light absorption and color correction. The electronic reality display device further has a combiner and a dye is applied to the optical material for absorbing blue light at the combiner.
In some embodiments, the light conversion materials or light absorbing materials are solubly or insolubly dispersed throughout the optical material. The projector is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panel of glass and the light-emitting array. The optical material is on a relay lens. The optical material comprises index-matched light conversion materials or light absorbing materials.
The optical material may be an optical film or an optical coating that is on the first lens or incorporated onto the first lens. The optical film may have inorganic nanoparticles that are index matched to the optical material and coupled to an organic adhesive applied to the optical material. The electronic reality display device absorbs light from at least one of outside of the electronic reality device, an output display, or a combination thereof.
In some embodiments, the electronic reality display device may have a holographic waveguide wherein the projector projects light to the first lens, which reflects the light along a waveguide to the holographic waveguide. The first lens may be a prism for at least one of guiding the light waves, combining the light waves, or any combination thereof. A dye may be applied to the optical material for absorbing blue light at the first lens. Also, a shutter containing the optical material may be located between the projector and a waveguide lens.
A method of enhancing blue light absorption (400 nm - 500 nm) in an electronic reality display device is also disclosed, the electronic reality display device comprising a backlight unit, wherein the backlight unit comprises a light-emitting array included in a projector; a reflector adjacent to the light-emitting array; a first lens for guiding wavelengths based on at least the light emitting array; and an optical material in the backlight unit comprising at least one light conversion material or at least one light absorbing material. The at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm. The electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof. The method further includes providing a dye to the optical material for absorbing blue light at the first lens. The method may further include providing the optical material on the projector, which is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panel of glass and the light-emitting array. The method may further include having at least one brightness enhancing layer, and providing a polarizing filter adjacent to the at least one brightness enhancing layer.
It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but these are intended to cover applications or embodiments without departing from the spirit or scope of the disclosure. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting.
The above summary is not intended to describe each disclosed embodiment of every implementation of the present disclosure. The brief description of the drawings and the detailed description which follow more particularly exemplify illustrative embodiments.
BRIEF DESCRIPTION OF THE DRAWINGS
The drawings are schematic illustrations and are not intended to limit the scope of the invention in any way. The drawings are not necessarily to scale. FIG. 1 is an illustration of an embodiment of the use of blue light mitigation and color correction in a real-world display.
FIG. 2 is an illustration of embodiments of block diagrams of AR in projector/lens/combiner systems.
FIGs. 3a and 3b are illustrations of embodiments of visual examples of opportunities of the disclosed technologies in typical AR systems.
FIG. 4 is an illustration comparing virtual reality (VR) technology with augmented reality/extended reality (AR/XR).
FIG. 5 is an illustration of a basic waveguide concept showing embodiments of the disclosed system that include use of blue light mitigation and color correction that can be applied to a projection system, light reflection waveguides/lenses, and/or waveguide combiner lens systems.
FIG. 6 is an illustration of embodiments of diagrams of VR/XR in projector/lens/active shutter/combiner systems.
FIG. 7 is a graph of light absorption at identified wavelengths.
FIG. 8 is an illustration of embodiments of projector/freeform prism combiner systems.
DETAILED DESCRIPTION
The present disclosure relates to articles and methods directed toward the mitigation of hazardous blue light in AR, VR, and/or XR display devices. Various embodiments of the disclosed systems and methods are described in detail with reference to the drawings, wherein like reference numerals may represent like parts and assemblies throughout the several views. Reference to various embodiments does not limit the scope of the systems disclosed herein. Additionally, any examples set forth in this specification are not intended to be limiting and merely set forth some of the many possible embodiments of the disclosed systems. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient, but these are intended to cover applications or embodiments without departing from the spirit or scope of the disclosure. Also, it is to be understood that the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting.
The current disclosure provides a blue light/color management filter that can be inserted into the image projection chain of an AR, VR, or XR system, likely a head-borne apparatus or headset designed to interact with one or both of the wearer's eyes. The blue light filter and color correction technology can come in various forms including, but not limited to, dyes, films, deposited layers, coated layers, or tints included in the actual lenses or waveguides of the AR, VR, or XR apparatus.
The blue light filter can act on the key high energy visible light (HEV) portion of the spectrum, which includes the blue light portion from 415 nm to 455 nm, which is known to be toxic to the retina, and the blue-turquoise portion from 460 nm to 500 nm, which is linked to non-visual physiological responses. The filter thereby provides eye health protection and minimizes the circadian rhythm impact often caused by blue light on users of portable electronics with visible screens. Because the blue light filter can affect the visible appearance of the screen, the color correction technology acts to re-adjust the color spectrum so that the user experiences a nearly restored color spectrum in the display and/or displayed images. The blue light filtration and color filtration methodology presented here uses a non-software-based approach with the dyes, coatings, deposited layers, etc., which provides a high level of performance with minimal impact on the projection system or image processing system. In some embodiments, the blue light/color management filter may include elements like photochromic or electrochromic systems to enhance functionality over a broader set of use conditions, including bright light or dim light conditions and user-specified settings/adjustments.
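To make the two HEV sub-bands concrete, the following Python sketch (an illustration, not part of the disclosure) sums a display spectrum over the 415-455 nm and 460-500 nm bands and reports how much of each band a hypothetical passive filter removes. The flat spectrum and the 60% in-band transmittance are assumed values chosen only for the example.

    # Illustrative only: estimate how a passive filter reduces the two HEV sub-bands.
    BLUE_BAND = (415, 455)            # portion described as harmful to the retina
    BLUE_TURQUOISE_BAND = (460, 500)  # portion linked to non-visual (circadian) responses

    def band_energy(spectrum, band):
        """Sum relative power over a wavelength band; spectrum maps nm -> power."""
        lo, hi = band
        return sum(p for wl, p in spectrum.items() if lo <= wl <= hi)

    # Assumed flat display spectrum and a hypothetical dye passing 60% of 400-500 nm light.
    display = {wl: 1.0 for wl in range(400, 701)}
    filtered = {wl: p * (0.6 if 400 <= wl <= 500 else 1.0) for wl, p in display.items()}

    for name, band in (("blue", BLUE_BAND), ("blue-turquoise", BLUE_TURQUOISE_BAND)):
        reduction = 1 - band_energy(filtered, band) / band_energy(display, band)
        print(f"{name} band: {reduction:.0%} reduction")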
The current disclosure provides a combination of blue light mitigation with the addition of color correction to the displayed/projected image per the current Eyesafe technology applied to embedded display filters and display aftermarket filters. In a typical augmented reality (AR) system, a virtual image is diffused by a system comprising a light source (screen, display, or microdisplay), a lens (vary-focus lens, ocular lens, relay optical lens module, or optical prism), a light combiner or freeform combiner, and also, optionally, a tunable lens, a top active shutter, and a front active shutter. In this disclosure, at least one light conversion material or at least one light absorbing material can be structured and configured to reduce hazardous blue light emissions from about 400 nm to about 500 nm at the level of the light source, optical lens or relay optical lens module, tunable lens, optical prism, and combiners.
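Since the absorbing material can sit at any of the elements just listed, the residual blue-band light reaching the eye is roughly the product of the per-element transmittances. The short Python sketch below is an assumption-laden illustration, not the disclosed design: it represents the projection chain as a list of stages with hypothetical 400-500 nm transmittance values and multiplies them out.

    # Illustrative only: cumulative 400-500 nm transmittance along the projection chain.
    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str
        blue_transmittance: float  # fraction of 400-500 nm light passed (1.0 = no filter)

    # Hypothetical stages and values; the disclosure does not specify these numbers.
    chain = [
        Stage("light source / microdisplay", 0.85),
        Stage("relay optical lens module", 0.90),
        Stage("tunable lens", 1.00),            # no absorbing material at this stage
        Stage("optical prism / combiner", 0.80),
    ]

    residual = 1.0
    for stage in chain:
        residual *= stage.blue_transmittance
    print(f"residual blue-band light reaching the eye: {residual:.0%}")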
• The passive methods may use dyes or photochromic versions of the dyes incorporated into prisms, optical combiners, or lenses that are situated between the light emission source and the wearer's eye.
• There may be an implementation of the invention that involves a photochromic or electrochromic (or otherwise active) system as part of the final blue light/color management system.
This disclosure addresses the challenges of the future by applying a filter to smart eyewear, such as smart glasses, halo glasses/lenses, and VR goggle sets for an immersive experience. These smart optical devices may be coupled to other computing devices. Some optical solutions may be extensions of glasses, computers, and gaming consoles. Head-borne "augmented" reality systems are poised to be the new "eye gate" for devices to increase hands-free capability, allow for multi-screen capability, and to transmit information to an augmented reality space for purposes not currently possible with stationary or "geographically located" screens.
Augmented reality: a simple visual example of the value. Tasks requiring interaction with electronic devices and displays can be separated from the actual electronics, allowing for truly mobile access to data and interaction with computer interfaces. Imagine:
• On site access to assembly or training animations — assemble an engine on the fly
• Medical personnel able to access MRI data in-situ “on the patient”
• No need for screens or keyboards in industrial settings
• In-vehicle safety information displayed dependent upon position of the driver’s head
• AR, like VR, can be done remotely, which is the vision for XR; unsafe spaces can be “entered” in simulated, but real settings to perform real work with real things with a remote outcome or action
The disclosure relates to more than smart glasses and applies to all optics systems, including helmet-mounted systems, goggles, glasses, and halos. When applied to this changed-reality viewing, the resulting view is that the projected or emitted image remains visible in combination with the 360° view outside of the device. Different fields and applications can use the light filtering, such as gaming or aviation simulation. The technology can be used not just in simulators but in actual travel and training, and even in medical applications where a patient can walk through an internal health issue or surgical procedure, viewing the entire plan through special optical sources. Something similar is possible for vehicle drivers: eliminating blind spots will allow more autonomous driving systems to be implemented under more reasonable human supervision. In an augmented reality (AR) system where the light is diffused through a waveguide, three areas of blue light ("BL") and color correction may be defined as follows: "A" the projection system near the light source display, "B" the light reflection waveguides/lenses, and "C" the waveguide/combiner lens. In an augmented reality light circuit, a projector may be combined with a lens in a combiner system. In this embodiment, a projection source "lens" may include a BL and color correction opportunity: dyes, coatings, or depositions to manage UV transmission and color. These materials may be on any of the lenses, combiners, or shutters. The transmission may then travel through an image focusing lens or lenses (similarly with dyes, coatings, or depositions to manage UV transmission and color). The image may then travel and reflect off an image viewing waveguide. In some aspects, the image viewing waveguide may combine with the viewed image in space. This is then the viewable image received by the user's eye.
In a VR application, the image is confined in the headset or near eye displays ("NED") to create a virtual reality. The VR NED filters and corrects the light; the display then transmits through a lens to the human eye. Thus, filtering and correcting light at the display or at the lens is possible. For an Augmented Reality NED, the image is viewed through the lens to augment the reality of the actual space in front of the viewer. The display once again transmits the projected light through a lens, and this is then combined with the outside (reality) optics, so that the real-world scene is combined with virtual information. The projected light is emitted from LEDs or other light sources.
Another embodiment uses a freeform prism to combine light with projected light. Light or an image is projected through a projection source "lens" to a freeform prism combiner. The combiner combines the projection with the viewed image in space through a corrector lens. The optical prism(s) or waveguide may result in the image viewable by the user. The blue light and color correction opportunity uses dyes, coatings, or depositions to manage UV transmission and color. In another waveguide system, a source display may project at a point through a lens to a flat lens. The projection may reflect off a holographic reflective optic (around 2 mm) down along the flat lens. The projection may deflect off a second holographic reflective optic and be received or visible to the user. Areas of blue light and color correction on a real-world diagram use a projection system, light reflection waveguides/lenses, and a waveguide combiner lens. The waveguide embodiment may be applied to goggles or lenses in a functional layout. In this real-world layout concept, the image may project, such as through an image system. This may implement a POD. The light may reflect off, for example, light reflection waveguides/lenses. There may be more than one light reflection waveguide/lens in the coupled-in optics. The waveguide combiner lens may then couple out the optics viewable by the user.
The equipment may include, but is not limited to, the designs for multi-layered AR, AR glasses, VR goggles, AR (add-on) equipment, XR headsets, NEDs, simulators, attachments thereto, and halos. Computing devices may be placed on the equipment or in communication with a system for control and operations.
The disclosure has different embodiments possible for the blue light ("BL") application. In one embodiment, AR systems may use the BL filtering. In one embodiment, there are opportunities for blue light dye usage in typical AR system arrangements. In this embodiment, a NIR sensor may be used. Also, a freeform prism may be utilized for reflecting light. There may also be a freeform corrector. The light may exit or be received by the eye (e.g., the pupil), and the eye may see the resulting virtual image. In another embodiment, there are opportunities for blue light dye usage in typical AR system arrangements. The display screen may generate an image that is transmitted through a vary-focus lens. A combiner adds the reality into the display or allows the user to see reality (at least a portion). The combination of these then produces a virtual image. In the second situation, multiple image sources project through an ocular lens. The combiner takes the image through the ocular lens with reality, and the result is that the user sees the virtual image through the combiner. In the third situation, a display source projects through a relay optical lens module, which then transmits through an active shutter to the light combiner. The user looks through a tunable lens, seeing see-through real scenes after they are transmitted through a front active shutter and the light combiner along with the display source result.
In another embodiment for VR and XR, the system may include a projector, lens, active shutter, and combiner. The projection source "lens" may first receive the light or projection, which then travels to a relay optical lens system. The image or light is then transmitted to a top active shutter. The light is then reflected off a light combiner to a tunable lens. The light combiner passes some light or control from and to a front active shutter. Both the shutter-controlled light and the reflected (or deflected) light together reach the tunable lens, which then results in the image as seen or received by the user's eye. Each of the lenses and shutters presents a BL and color correction opportunity. To achieve this, dyes, coatings, or depositions are used to manage UV transmission and color.
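For illustration only, the compounding effect of placing absorbing material at several of these elements can be sketched as a chain of per-element transmittances in the 400-500 nm band; the element names and transmittance values below are hypothetical assumptions, not figures taken from this disclosure.

```python
# Illustrative sketch only: each optical element in the projector/relay lens/
# shutter/combiner/tunable lens chain is modeled as a single in-band
# (400-500 nm) transmittance. All names and values are hypothetical.

def residual_blue_fraction(stage_transmittances):
    """Multiply per-stage transmittances to estimate the fraction of
    400-500 nm light that survives the whole optical path."""
    fraction = 1.0
    for t in stage_transmittances:
        fraction *= t
    return fraction

stages = {
    "projection_source_lens": 0.80,
    "relay_optical_lens": 0.85,
    "top_active_shutter": 0.90,
    "light_combiner": 0.88,
    "tunable_lens": 0.85,
}

remaining = residual_blue_fraction(stages.values())
print(f"Approximately {remaining:.0%} of 400-500 nm light remains after all stages")
```

Under these assumed values, more than half of the in-band light would be removed, which is why lightly treating several elements can substitute for heavily treating a single one.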
Figure 1 is an illustration of an embodiment of the use of blue light mitigation and color correction in a real-world or wearable display. A user may wear display glasses for many virtual reality ("VR"), augmented reality ("AR"), and extended reality ("XR") displays. In the embodiment of Figure 1, display glasses 100 can be worn by a user. Applying optical materials or dyes to elements of display glasses 100 may help to absorb blue light generated by an image source 102. Added dyes, materials, or coatings may use compounds or elements that are microparticles or nanoparticles, so that they are not visible and do not interfere with the image. The emitted light may then be reflected off of at least one lens 104. Lens 104 may be a waveguide lens that reflects the light toward the lenses of glasses 106 so it is viewable by the user. Light may be coupled into the system either from the environment outside of the glasses or by adding additional light sources. In the embodiment of Figure 1, the lenses of glasses 106 may be transparent, allowing light to enter. In other embodiments, the lenses of glasses 106 may be opaque, so that the user cannot see the environment outside of display glasses 100. The lenses of glasses 106 may couple the images together by receiving the projected display image from image source 102 or reflector 104 and allowing the user to see the outside environment, which together form the combined or coupled image viewable by the user.
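As a rough aside on why micro- or nanoscale dye particles do not interfere with the image, scattering in the Rayleigh regime falls off steeply with particle size; the particle diameters in the sketch below are hypothetical, and the comparison deliberately ignores material refractive-index factors, which cancel in the ratio.

```python
# Illustrative sketch only: relative Rayleigh-regime scattering of two particle
# sizes at a representative blue wavelength. Diameters are hypothetical and
# refractive-index factors are omitted because they cancel in the ratio.
def relative_scattering(diameter_nm: float, wavelength_nm: float) -> float:
    """Rayleigh scattering scales roughly as d^6 / lambda^4 for particles much
    smaller than the wavelength."""
    return diameter_nm ** 6 / wavelength_nm ** 4

wavelength = 450.0  # nm, representative blue light
nano = relative_scattering(50.0, wavelength)
larger = relative_scattering(400.0, wavelength)  # near the regime's limit; rough only
print(f"A 400 nm particle scatters roughly {larger / nano:,.0f}x more than a 50 nm one")
```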
When the user is viewing a projected image, harmful light may reach the eye. Thus, color-absorbing material may be applied at specific points in the system. Optical material or dye may be applied to image source 102, filtering light at the source. The dye, film, coating, or deposition material may target blue light or other harmful light, absorbing in a range of, for instance, 400-500 nm. Another point of light absorption may be at the light reflection lenses (waveguides/reflectors) 104, so that the light moving away from waveguide(s)/reflector(s) 104 contains less harmful blue light. Another possible point of light absorption may be the lenses of glasses 106. Not only does the projected image reflect off the inner side of the lenses, making absorption possible at this point of light interaction, but external light may also enter through the lenses of glasses 106. Thus, applying absorption material to the lenses of glasses 106 would further filter the projected image light as well as external or environmental light, such that the light that reaches the user's eye has less harmful light in the target wavelength range, such as 400-500 nm.
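One common way to reason about how strongly a dye-loaded coating attenuates the 400-500 nm band is the Beer–Lambert relation; the symbols below are standard optics quantities (dye absorptivity, concentration, and path length), not parameters specified in this disclosure:

T(\lambda) = \frac{I_{\mathrm{out}}(\lambda)}{I_{\mathrm{in}}(\lambda)} = 10^{-\varepsilon(\lambda)\, c\, \ell}, \qquad 400\ \mathrm{nm} \le \lambda \le 500\ \mathrm{nm}

Because transmittance falls exponentially with dye concentration c and coating thickness ℓ, a thin layer at each of image source 102, waveguide/reflector 104, and the lenses of glasses 106 can, in principle, achieve an in-band reduction comparable to a much heavier treatment at a single element.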
Figure 2 is an illustration of embodiments of block diagrams of AR projector, lens, and combiner systems. In this embodiment, points of light absorption are identified by the components of the AR light circuit. Light source 202 shines through projection source "lens" 204, and the projected image or light 216 continues through an image focusing lens or lenses 206. The transmitted light is then reflected by a reflector or combiner 212. In the embodiment of combiner 212, the user 214 can see the image light 218, but also external or other images 210. This is another source of "external" light that travels through combiner 212. External images or light 210 and projected light 216 are received by the user's eye as the combined or augmented image is viewed.
Thus, adding absorption dye, material, or coatings to manage UV transmission and certain colors may help to protect the user's eyes. Points for adding such material may include the projection source lens 204 and image focusing lens(es) 206. Adding absorption dye, material, or particles at the image viewing combiner or waveguide 212 may further filter the projected image light as well as external or environmental light, such that the light that reaches the user's eye has less harmful light in the target wavelength range, such as 400-500 nm, or other ranges for UV light.
Figures 3a and 3b show non-limiting examples of augmented reality ("AR") system arrangements that may use blue light dye for light absorption in certain wavelengths. The illustrations are visual examples of opportunities for the disclosed technologies in typical AR systems. In Figure 3a, the AR systems 300 illustrate locations for adding dye for blue light absorption 330. First AR system 301a illustrates a first possible arrangement that may have a display screen 302 emitting light. The display screen can be a light-emitting array, liquid crystals, a projection display, or any light emitting device. Once emitted, light may travel to a lens 304 such as a vary-focus lens. In some embodiments, the lens may be one of a plurality of lenses and may be a first of the plurality. In this embodiment, the light may shine through lens 304. In some embodiments, lens 304 may be a focus lens where the light waves are directed in a direction at an angle from the direction in which the respective wavelength enters lens 304 and toward combiner 310. The user of the AR device may look through an opening or at a point near combiner 310. A user 306 may see the image at the exit pupil 308; the sight line from the user's pupil 306 extends in a cone shape with its tip at the user's pupil 306. In this way, user 306 sees the virtual image through combiner 310, which augments or superimposes the virtual projected image originating at display screen 302, creating a virtual image 312 that may be one layer, more than one layer, or a plurality of layers. The resulting virtual image 312 is then a combination of the view without the AR device and the projected image. User 306 is exposed to harmful light emitted from the device by viewing virtual image 312, and possibly from the outside environment beyond AR system 301a.
Thus, applying an optical material in AR system 301a that filters and/or absorbs the light waves, and in some embodiments converts them to different wavelengths, may reduce exposure to harmful light. Such an optical material can be applied to display screen 302 (or projector). The optical material may also be applied to lens 304 to absorb light transmitted through lens 304. In some embodiments, optical material may also or alternatively be applied at combiner 310 to filter light received by user 306. The optical material may include dye material that absorbs, converts, etc., light waves of a particular wavelength, such as within the range of 400-500 nm. The optical material may be applied to the various elements; in other embodiments it may be part of the element as a layer or within the material. In other examples, it may be a coating applied to the respective element.
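For illustration only, the distinction between absorbing and converting light can be sketched as simple energy bookkeeping; the absorption fraction, quantum yield, and re-emission wavelength below are hypothetical assumptions, not properties of any dye described in this disclosure.

```python
# Illustrative sketch only: models a light-conversion material that absorbs a
# fraction of 400-500 nm light and re-emits part of it at a longer wavelength.
# The absorption fraction, quantum yield, and emission peak are hypothetical.
def convert_blue_band(incident_blue: float,
                      absorbed_fraction: float = 0.5,
                      quantum_yield: float = 0.8,
                      emission_peak_nm: float = 550.0):
    """Return (remaining blue light, converted light, emission wavelength)."""
    absorbed = incident_blue * absorbed_fraction
    remaining_blue = incident_blue - absorbed
    converted = absorbed * quantum_yield  # the remainder is lost as heat
    return remaining_blue, converted, emission_peak_nm

blue_in = 100.0  # arbitrary units of 400-500 nm light from the display
remaining, converted, peak = convert_blue_band(blue_in)
print(f"{remaining:.0f} units of blue remain; {converted:.0f} units re-emitted near {peak:.0f} nm")
```

A purely absorbing dye simply removes the in-band energy, whereas a converting material returns some of it at a longer, less hazardous wavelength, which can also help with color correction.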
In another embodiment with multiple image sources, a second AR system 301b illustrates a possible arrangement that may have a display screen 320 emitting light. Display screen 320 may be comprised of more than one image source that emits light. In this way, parts of the AR image may be output at each display, or each may output the same or a similar image 328. The display screen or image source, which can be one or more image sources 320, can be a light-emitting array, liquid crystals, a projection display, or any light emitting device. Once emitted, light may travel to a lens 322 such as a vary-focus lens. In some embodiments, the lens may be one of a plurality of lenses and may be a first of the plurality. In this embodiment, the light may shine through lens 322. In some embodiments, lens 322 may be a focus lens where the light waves are directed in a direction at an angle from the direction in which the respective wavelength enters lens 322 and toward combiner 326. User 324 of the AR device may look through an opening or at a point near combiner 326. A user 324 may see the image at the exit pupil 308; the sight line from the user's pupil 324 extends in a cone shape with its tip at the user's pupil 324. In this way, user 324 sees the virtual image 328 through combiner 326, which augments or superimposes the virtual projected image originating at display screen(s) 320, creating a virtual image 328 that may be one layer, more than one layer, or a plurality of layers. The resulting virtual image 328 is then a combination of the view without the AR device and the projected image(s) from display screen(s) 320.
Similar to first AR system 301a, applying an optical material in AR system 301b that filters and/or absorbs the light waves, and in some embodiments converts them to different wavelengths, may reduce exposure to harmful light. Such an optical material can be applied to display screen 320, or to all of the multiple image sources 320 (or projectors). The optical material may also be applied to lens 322 to absorb light transmitted through lens 322. In some embodiments, optical material may also or alternatively be applied at combiner 326 to filter light received by user 324. The optical material may include dye material that absorbs, converts, etc., light waves of a particular wavelength, such as within the range of 400-500 nm.
In another embodiment 301c, a display source 334 may project an image, or a light emission, and may include optical material 332; the light from display source 334 is emitted to an optical relay lens module 336, which helps with light emission and waveguide transmission, and in some instances direction, amplification, and conversion. Optical relay lens 336 may also have optical material 332 on it in some embodiments to further assist with absorbing blue light. After the optical relay lens 336, light goes through a top active shutter 340. Top active shutter 340 may control the amount of light transmitted through this area for a period of time. Light waves 344 that make up the virtual image emitted from display source 334 may then travel through top active shutter 340, which may be controlled by software to allow certain light levels for a period of time, to tunable lens 350. User 342 can see the image through tunable lens 350. In AR systems, user 342 may also see outside of the device and the environment 346 around user 342. The light and view from environment 346 may enter through a front active shutter 348, which controls the light and timing of the view of user 342. The incoming environment light 352 may join with the display source 334 (projector) image or light 344, and both enter through tunable lens 350 to display the combination viewable by user 342. Tunable lens 350 may be moveable for optimizing the resulting output image viewed by user 342. AR system 301c outputs the combination of external environment light 352 and display image 344. Because there is more than one source of light, light absorption material may be placed at points to absorb harmful light. In the embodiment of 301c, absorption material, such as optimizing material or light-absorbing dyes, may be placed at display source 334. Relay optical lens module 336 may also have optimizing material to further absorb light from display source 334. The display light may be further absorbed at top active shutter 340. Additionally, in some embodiments, front active shutter 348 may further absorb harmful light waves. Both external/environment light 352 and display light 344 may then be further optimized by applying optimizing material at tunable lens 350. Application of optical material, dye, encapsulant, or other light-absorbing material at any of these points in AR system 301c can absorb harmful light in wavelengths of 400-500 nm and optimize the output light received or viewed by user 342.
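Since top active shutter 340 and front active shutter 348 are described as software-controlled elements that allow certain light levels for a period of time, one way to picture that control is a simple duty-cycle model; the class, method, and parameter names below are hypothetical illustrations, not the disclosure's control software.

```python
# Illustrative sketch only: a duty-cycle model of an active shutter that is
# opened for a fraction of each control period to limit transmitted light.
# Class, method, and parameter names are hypothetical.
from dataclasses import dataclass

@dataclass
class ActiveShutter:
    period_ms: float = 10.0   # length of one control period
    duty_cycle: float = 0.6   # fraction of the period the shutter is open

    def transmitted_light(self, incident_lumens: float) -> float:
        """Average light passed over one period, assuming full transmission
        while open and full blocking while closed."""
        return incident_lumens * self.duty_cycle

    def set_level(self, target_fraction: float) -> None:
        """Clamp and apply a new target transmission fraction."""
        self.duty_cycle = max(0.0, min(1.0, target_fraction))

shutter = ActiveShutter()
shutter.set_level(0.4)                   # software lowers the light level for a period
print(shutter.transmitted_light(100.0))  # -> 40.0 lumens on average
```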
Figure 3b illustrates using a prism in an AR system for guiding light. Figure 3b shows a user 368. A freeform prism 372 can be used, and the shape of the prism is determined by the direction and guiding needs of the AR system. Prisms reflect and deflect light. A freeform corrector 366 may be used to transmit certain wavelengths into the prism to help guide the prismatic light movement and wavelengths. A display, such as a microdisplay 362, may add light into prism 372 at prism side 374 in certain embodiments. An NIR LED 364 may also be used to add light entering or leaving prism 372. "NIR" refers to near-infrared sensing, which can sense a variety of things including moisture, light, and other elements. An NIR sensor 360 may be included for sensing light waves entering prism 372. Blue light absorbing dye 370 may be applied in certain embodiments at one or more points to optimize light and reduce harmful light. Dye, or optical material, may be applied on prism 372. It may also be applied to NIR sensor 360, display/microdisplay 362, and NIR LED 364. Microdisplays may be on a millimeter scale or smaller. The absorption material or dye absorbs blue light, or targeted light, so that the viewable light that reaches the user's eye 368 has reduced amounts of blue light. The image can be enhanced by adding other light wavelengths to correct the color and improve the image quality, such as via freeform corrector 366 or an added external light source, such as NIR LED 364.
Figure 4 is an illustration comparing virtual reality (VR) technology with augmented reality/extended reality (AR/XR). In VR 402, the image is confined in the headset to create a virtual reality. In a first embodiment, the display 404 projects an image, and the light 410 travels through a lens 406 to be viewed or received by a user's eye 408. In this example, there is no environmental or external light from outside the headset. In another embodiment of AR 400, display 420 projects light 430 through lens 422, which is received by a user's eye 426. External light 424 enters the headset and combines with projected light 430 in an optical combiner 424. For these near eye displays ("NED"), it is important to protect the eye from harmful light. Thus, applying absorption dye at displays 404 and 420 is important, and possibly at lenses 406 and 422. Absorption material may also be added to optical combiner 424. With absorption dye added at one or more locations, harmful light is removed from the light received by eyes 408 and 426 in each respective system.
Figure 5 is an illustration of a basic waveguide concept 500 showing embodiments of the disclosed system that include use of blue light mitigation and color correction that can be applied to a projection system, light reflection waveguides/lenses, and/or waveguide combiner lens systems. In this embodiment, light is emitted by a source display 518 of the projection system and shines through a lens 520 to a light reflection waveguide/lens 522. A reflector or waveguide 516 may be adhered above the surface of the (first) waveguide/lens 522. The emitted image or light shines down the waveguide/lens 522 to a second area containing holographic reflective optics or lens 510, which may be added at a different end of the light reflection waveguide(s)/lens(es) 522. An enlarged view of holographic reflective optics 504 shows a portion within enlarged view 512; within enlarged view 512, a 2 mm width 514 may be one embodiment of a possible sizing. The reflected image or light 508 may reflect off of the second lens 510, or the waveguide or combiner lens, and be viewed by the user 506.
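For illustration only, the number of internal reflections the projected light makes while traveling down a flat waveguide/lens such as 522 can be estimated from simple geometry; the guide length, thickness, and propagation angle below are hypothetical values, not dimensions from Figure 5 (the 2 mm figure in the text refers to the holographic optic).

```python
# Illustrative sketch only: estimates how many times light bounces inside a
# flat waveguide lens between the in-coupling and out-coupling optics.
# Dimensions and angle are hypothetical, not taken from this disclosure.
import math

def bounce_count(guide_length_mm: float, guide_thickness_mm: float,
                 propagation_angle_deg: float) -> int:
    """Each bounce advances the ray by thickness * tan(angle) along the guide."""
    advance_per_bounce = guide_thickness_mm * math.tan(math.radians(propagation_angle_deg))
    return math.ceil(guide_length_mm / advance_per_bounce)

# Example: 40 mm long guide, 2 mm thick, light propagating at 60 degrees
print(bounce_count(40.0, 2.0, 60.0))  # roughly 12 bounces
```

Each pass through a dye-loaded waveguide gives the absorbing material another opportunity to attenuate light in the 400-500 nm band.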
To protect the user's eye from harmful light and add color correction, color-absorbing material may be added at the projection system, including the source display 518 and first lens 516. Other possible absorption locations may include the light reflection waveguide/lens 522. The holographic reflective optics may also have added absorption material. With the addition at one or more of these locations, reflected light 508 will have less harmful light after absorption.
Figure 6 is an illustration of embodiments of block diagrams of VR/XR projector/lens/active shutter/combiner systems 600, and is similar to Figure 2 but with the addition of shutters. In this embodiment, points of light absorption are identified by the components of the light circuit. Light source 602 shines through projection source "lens" 604, and the projected image or light 612 continues through a relay optical lens or lenses 608. Light 612 then travels through a first or top active shutter 610, which may control the amount of light transmitted and the timing of the transmission. The light 612 is then directed or reflected off of light combiner 614, which directs or guides light waves toward the user 620. Light combiner 614 also allows light from external source 616 to join projected light 612. External light may first enter through a front active shutter 618, which also controls the amount and timing of light transmission, and is then joined at light combiner 614. External images or light 616 and projected light 612 may then transmit through a tunable lens 618 just before reaching user 620 and are received by the user's eye as the combined or augmented image is viewed.
Thus, adding absorption dye, material, or coatings to manage UV transmission and certain colors may help to protect the user's eyes. Points for adding such material may include the projection source lens 604 and relay optical lens system 608. Other possible locations of absorption are at top active shutter 610, front active shutter 616, light combiner 614, and tunable lens 618. Adding absorption dye, material, or particles at these locations may further filter projected image light as well as external or environmental light such that the light that reaches the user's eye has less harmful light in the target wavelength range, such as 400-500 nm.
Figure 7 is a graph of light absorption in the range of 400-500 nm. Figure 7 shows the absorption for various added dyes, including dyes absorbing blue, green, and red light. The ranges vary for each of the added dyes, and the resulting light emission is shown at the wavelengths listed on the X-axis. Figure 7 is a general illustration and one result of testing; the range or absorption/emission curve may change based on the amount and type of dye used for wavelength absorption.
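For discussion purposes, the kind of curves shown in Figure 7 can be approximated by band-shaped (Gaussian) absorption profiles per dye; the center wavelengths, bandwidths, and peak absorptions below are hypothetical stand-ins, not measured values from Figure 7.

```python
# Illustrative sketch only: models each dye's absorption as a Gaussian band and
# combines them into a net transmission curve across the visible spectrum.
# Center wavelengths, widths, and peak absorptions are hypothetical.
import math

def gaussian_absorption(wavelength_nm, center_nm, width_nm, peak):
    return peak * math.exp(-((wavelength_nm - center_nm) ** 2) / (2 * width_nm ** 2))

# Hypothetical dyes targeting blue, green, and red bands.
dyes = [
    {"center": 435.0, "width": 20.0, "peak": 0.60},  # blue-absorbing dye
    {"center": 530.0, "width": 25.0, "peak": 0.15},  # green-correcting dye
    {"center": 620.0, "width": 25.0, "peak": 0.10},  # red-correcting dye
]

for wavelength in range(400, 701, 50):
    absorbed = sum(gaussian_absorption(wavelength, d["center"], d["width"], d["peak"])
                   for d in dyes)
    transmission = max(0.0, 1.0 - absorbed)
    print(f"{wavelength} nm: ~{transmission:.0%} transmitted")
```

Varying the assumed peak absorptions and bandwidths shifts the resulting curve, consistent with the statement that the absorption/emission curve changes with the amount and type of dye used.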
Figure 8 is an illustration of embodiments of projector/freeform prism combiner systems 800. In this embodiment, a light source 806 may emit light 814 via a display source 808 and transmit it to a prism 810, which reflects the light within the prism. Reflected prism light 816 may then be viewed by the user 804. A corrector lens 812 may be included, and light transmitted through corrector lens 812 may be directed through prism 810 as an optical prism waveguide 818.
Absorption material, such as dyes, coatings, or depositions to manage UV transmission and color, may be added at location points 802, such as source display 808 and prism 810, so that reflected light 816 may be optimized before being seen by user 804.
Various modifications and alterations to this disclosure will become apparent to those skilled in the art without departing from the scope and spirit of this disclosure. It should be understood that this disclosure is not intended to be unduly limited by the illustrative embodiments set forth herein and that such embodiments are presented by way of example only with the scope of the disclosure intended to be limited only by the claims set forth herein as follows. All references cited in this disclosure are herein incorporated by reference in their entirety.

Claims

1. A display system for use with reality displays comprising: an electronic reality display device comprising: a backlight unit, wherein the backlight unit comprises a light-emitting array included in a projector; a reflector adjacent to the light-emitting array; a first lens for guiding wavelengths based on at least the light emitting array; and an optical material in the backlight unit comprising at least one light conversion material or at least one light absorbing material, wherein the at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions between about 400 nm to about 500 nm; and wherein the electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof.
2. A display system according to claim 1, wherein the electronic reality display device is part of a network for accessing software and other display reality platforms.
3. A display system according to claim 1, wherein the display system comprises a liquid crystal panel or a diode array illumination assembly.
4. A display system according to claim 1, wherein at least one of the projectors and the first lens comprises the at least one light conversion material or at least one light absorbing material, and the at least one light conversion material or at least one light absorbing material includes dyes for at least one of light absorption and color correction.
5. A display system according to claim 1, further comprising a second lens for combining wavelengths for output, wherein the second lens has the optical material for light absorption and color correction.
6. A display system according to claim 1, wherein the electronic reality display device further comprises a combiner and a dye is applied to the optical material for absorbing blue light at the combiner.
7. A display system according to claim 1, wherein the light conversion materials or light absorbing materials are solubly or insolubly dispersed throughout the optical material.
8. A display system according to claim 1, wherein the projector is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panels of glass and the light-emitting array.
9. A display system according to claim 1, wherein the optical material is on a relay lens.
10. A display system according to claim 1, wherein the optical material comprises index matched light conversion materials or light absorbing materials.
11. A display system according to claim 1, wherein the optical material is an optical film or an optical coating that is on the first lens or incorporated onto the first lens.
12. A display system according to claim 1, comprising inorganic nanoparticles that are index matched to the optical material and coupled to an organic adhesive applied to the optical material.
13. A display system according to claim 1, wherein the electronic reality display device absorbs light from at least one of outside of the electronic reality device, an output display, or a combination thereof.
14. A display system according to claim 1, wherein the electronic reality display device comprises a holographic waveguide, wherein the projector projects light to the first lens, which reflects the light along a waveguide to the holographic waveguide.
15. A display system according to claim 1, wherein the first lens may be a prism for at least one of guiding the light waves, combining the light waves, or any combination thereof.
16. A display system according to claim 1, wherein a shutter containing the optical material is located between the projector and a waveguide lens.
17. A display system according to claim 1, wherein a dye is applied to the optical material for absorbing blue light at the first lens.
18. A method of enhancing blue light absorption (400 nm - 500 nm) in an electronic reality display device comprising: the electronic reality display device comprising: a backlight unit, wherein the backlight unit comprises a light-emitting array included in a projector; a reflector adjacent to the light-emitting array; a first lens for guiding wavelengths based on at least the light emitting array; and an optical material in the backlight unit comprising at least one light conversion material or at least one light absorbing material, wherein the at least one light conversion material or at least one light absorbing material is structured and configured to reduce hazardous blue light emissions between about 400 nm to about 500 nm; and wherein the electronic reality display device is at least one of a virtual reality device, an augmented reality device, an extended reality device, and any combination thereof.
19. A method of enhancing blue light absorption (400 nm - 500 nm) in the electronic reality display device according to claim 18, wherein the method further includes providing a dye to the optical material for absorbing blue light at the first lens.
20. A method of enhancing blue light absorption (400 nm - 500 nm) in the electronic reality display device according to claim 18, wherein the method further comprises: providing the optical material on the projector which is further comprised of a panel of glass and a light-emitting array having encapsulants, and the projector contains the optical material on at least one of the panels of glass and the light-emitting array.
21. A method of enhancing blue light absorption (400 nm - 500 nm) in an electronic reality display device according to claim 18, wherein the method further comprises: having at least one brightness enhancing layer; and providing a polarizing filter adjacent to the at least one brightness enhancing layer.
PCT/US2022/038569 2021-07-28 2022-07-27 Blue light mitigation and color correction in augmented reality (ar), virtual reality (vr), or extended reality (xr) systems WO2023009657A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163226368P 2021-07-28 2021-07-28
US63/226,368 2021-07-28

Publications (1)

Publication Number Publication Date
WO2023009657A1

Family

ID=85088022

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/038569 WO2023009657A1 (en) 2021-07-28 2022-07-27 Blue light mitigation and color correction in augmented reality (ar), virtual reality (vr), or extended reality (xr) systems

Country Status (2)

Country Link
TW (1) TW202321774A (en)
WO (1) WO2023009657A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117471658A (en) * 2023-12-27 2024-01-30 荣耀终端有限公司 Optical lens, camera module and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150002528A1 (en) * 2013-06-28 2015-01-01 David D. Bohn Display efficiency optimization by color filtering
US20160216517A1 (en) * 2014-01-21 2016-07-28 Osterhout Group, Inc. Compact optical system with improved illumination
US20200142298A1 (en) * 2018-06-05 2020-05-07 Facebook Technologies, Llc Light Source with Plurality of Waveguides

Also Published As

Publication number Publication date
TW202321774A (en) 2023-06-01

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22850275

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE