WO2021142486A1 - A head mounted system with color specific modulation - Google Patents
- Publication number
- WO2021142486A1 (PCT/US2021/070008)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- green
- head mounted
- color
- resolution
- Prior art date
Classifications (CPC, all under G02B27/00 — Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00)
- G02B27/0081 — with means for altering, e.g. enlarging, the entrance or exit pupil
- G02B27/0101 — Head-up displays characterised by optical features
- G02B27/0172 — Head mounted, characterised by optical features
- G02B2027/0112 — Head-up displays comprising a device for generating colour display
- G02B2027/013 — Head-up displays comprising a combiner of particular shape, e.g. curvature
- G02B2027/0147 — Head-up displays comprising a device modifying the resolution of the displayed image
- G02B2027/0174 — Head mounted, holographic
Definitions
- the present application relates to head mounted displays, and more particularly to color specific modulation in head mounted displays.
- the core challenge is creating a high resolution, full color, large field of view (FOV), low power, high heat dissipation display that can be comfortably worn on the head.
- PPD: pixels per degree
- displays with large numbers of pixels are generally required. For instance, 60 pixels per degree is at the limit of the angular resolution of the typical human eye.
- a display panel with this resolution is typically very large because individual pixels have a minimum size. This requires compromises in the industrial design of the head mounted display.
- the display panel also requires a lot of power to drive the pixels and perform the computation for each pixel value at the frame rates for head mounted displays. The tradeoffs get worse as the field of view gets larger.
- the field of view of a typical human eye is approximately 180° H by 135° V, but the eye cannot resolve 60 pixels per degree across this field of view.
- the field of view where the eye can resolve maximum acuity is typically 30° H by 30° V and maximally 70° H by 55° V.
- the maximal case would require a display panel with a resolution of 4,200 x 3,300, or ~14 Megapixels just to cover the high resolution area of the FOV of the eye. To cover the peripheral space beyond that would require even more pixels, and thus more space, computation, and power. With current technology, the display size and power requirements make comfortable, attractive form factors impossible.
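The pixel-budget arithmetic above can be checked directly. A minimal sketch, assuming a uniform 60 PPD across the maximal 70° H by 55° V high-acuity field:

```python
# Pixel budget for covering the eye's high-acuity FOV at a uniform density.
# The FOV figures (70 x 55 degrees) and 60 PPD come from the text above.
def panel_resolution(fov_h_deg, fov_v_deg, ppd):
    """Return (width, height) in pixels for a given FOV and pixel density."""
    return fov_h_deg * ppd, fov_v_deg * ppd

w, h = panel_resolution(70, 55, 60)
megapixels = w * h / 1e6
print(w, h, megapixels)  # 4200 3300 13.86
```

This matches the 4,200 x 3,300 (~14 Megapixel) figure in the text; covering the full peripheral field at the same density would grow the count further still.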
- Figure 1 illustrates one embodiment of spectral responses to various colors.
- Figure 2 is a block diagram of one embodiment of the system.
- FIG. 3 illustrates one embodiment of a virtual reality (VR) HMD system, in which one or more colors are displayed using a VR display in combination with a separate display engine for one or more higher frequency colors.
- Figure 4 illustrates one embodiment of a system in which a single optical propagator is used with separate display engines.
- Figure 5 illustrates one embodiment of an augmented reality (AR) HMD, in which one or more colors use a first optical propagator, while one or more higher frequency colors use a second optical propagator.
- Figure 6A illustrates one embodiment of a three-propagator configuration, in which each color has a separate optical propagator.
- Figure 6B illustrates one embodiment of a three-propagator configuration, in which each color has a separate display engine and optical propagator.
- Figure 6C illustrates one embodiment of a two-propagator configuration, in which each optical propagator has an associated display engine.
- Figure 7 illustrates one embodiment of a multi-focal waveguide in which a green-only waveguide provides a second focal distance.
- Figure 8 illustrates one embodiment of a multi-focal waveguide in which a red-green waveguide and a blue-green waveguide are used.
- Figure 9 illustrates another embodiment of a multi-focal waveguide in which a red-green waveguide and a blue-green waveguide are used.
- Figure 10 illustrates one embodiment of a multi-focal waveguide in which separate inputs that are not in-line are used.
- Figure 11 illustrates one embodiment of a multi-FOV waveguide.
- Figure 12 illustrates another embodiment of the multi-FOV display.
- Figure 13 is a block diagram of one embodiment of a computer system that may be used with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
- the system applies color specific modulation based on visual perception of wavelength, such that visual information is treated differently based on its color/wavelength.
- the system applies settings to one color channel to alter its format.
- the settings applied to a subset of colors alter its resolution, focal distance, field of view, and/or foveation.
- this change is applied to the green color channel.
- the change is applied to another subset of colors.
- the visual information comprises an alteration of focal distance, field of view, and/or pixel density by color. Other changes to one or two of the three colors in a display may be applied.
- the wavelength based modulation takes advantage of the color perception of the human eye to create a display that has an improved quality, reduced cost, reduced power consumption, and/or reduced weight.
- This improved HMD structure and design utilizes optical elements and color encoding in a new way, which reduces the size, power (battery), and processing requirements, and the heat around the user’s head while retaining the perceived pixel density (PePD) or visual acuity of the images.
- This improved design can be used with either a virtual reality (VR) system, an augmented reality (AR) system, or any other mixed reality or “XR” system in which virtual objects are generated and displayed.
- Dynamic foveated displays take advantage of the fact that the eye can only sense at its highest resolution within the foveal region, which is only a few degrees wide near the center of the field of view. The resolving power of the eye drops off very quickly, to 1/2 resolution at ~2.5° away from the center, all the way to ~1/15 at the edge of the field of view. Dynamic foveated displays place the high resolution image portion where the eye is looking. In this way, they are able to reduce the total number of pixels needed by many orders of magnitude to cover the full field of view of an eye.
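The magnitude of the saving can be illustrated with a toy comparison. The specific numbers here (a 100° x 100° field, a 10° x 10° high-resolution inset, and a 15 PPD background) are our own illustrative assumptions, not values from the patent:

```python
# Illustrative comparison: uniform high-resolution field versus a foveated
# design with a small 60 PPD inset over a coarse 15 PPD background field.
uniform = (100 * 60) ** 2                    # 36,000,000 pixels everywhere
foveated = (10 * 60) ** 2 + (100 * 15) ** 2  # 360,000 inset + 2,250,000 field
print(uniform / foveated)  # ~13.8x fewer pixels to compute and drive
```

Even this modest model yields an order-of-magnitude reduction; steeper acuity falloff models and larger fields of view push the ratio higher.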
- high resolution displays can be designed to be even more compact and efficient.
- For most people, the eye's resolving power is higher for the green/yellow spectrum, and lower for the red and blue portions of the spectrum.
- the system displays a higher resolution image in the green/yellow colors than the red/blue colors. This results in the eye perceiving a higher resolution image, because most of the sensors near the fovea are sensitive to green/yellow.
- this color compression of the data stream reduces the amount of data that is processed and displayed and can also simplify the optics used.
- the system splits the focal distance by color, with a red/green combiner at a first focal distance, and a blue/green combiner at a second focal distance.
- the combiners are waveguides. This permits the use of a system with two combiners (red/green and blue/green) instead of six combiners to provide a multi-focal display. It is well known in the art that a combiner design must transmit three colors for a full color image to be perceived. Having combiners with only a subset of the three colors at different distances, designed to produce full color multifocal images, is an unexpected redesign with many benefits, such as lower cost, lighter weight, and reduced power consumption for longer runtimes for head mounted displays.
- Rods and cones are the two main photoreceptor cells in the eye that make sight possible. Rods are very sensitive to light and will respond to a single photon, however, they provide no information about color to the brain; color information is provided by the cones. Each cone has a wavelength sensitive pigment that has a specific spectral response. There are three types of cones in a typical human eye: short (S), medium (M), and long (L).
- Figure 1 illustrates a typical spectral response of the rods and cones of an eye.
- the short pigment’s peak response is at shorter wavelengths in the blue portion of the color spectrum
- the medium pigment’s peak response is at medium wavelengths at the high end of the green portion of the spectrum
- the long pigment’s peak response is at longer wavelengths near the yellow-orange portion of the spectrum.
- the spectral response from each cone is broad and there is significant overlap, especially between the medium and long cones. This means that there is a section of wavelengths from the short green section to the yellow portion of the visible spectrum that will stimulate both medium and long cones, but not the short cones.
- the spatial distribution of each type of cone can be used to design a more efficient, lighter, cheaper head mounted display.
- the fovea, the highest resolution area at the back of the retina, includes three types of cones: long cones sense red, medium cones sense green, and short cones sense blue. There are significantly fewer short cones than medium and long cones in the fovea.
- the typical ratio of M+L cones to S cones is ~14:1.
- Most of the resolving power of the eye comes from the light sensed by medium and long cones because their spatial density is so much higher, with the more sporadically spaced S cones providing spectral information at the smaller end of the visible range.
- Typical displays create a color image by blending the light from separate color sources to create all of the colors in the display.
- a typical display uses one red (R) source, one blue (B) source, and one (sometimes two) green (G) source.
- These sources can be light emitting diodes (LEDs), microLEDs, lasers, a scanning laser, a single light source and a rapidly rotating wheel with sections of different color filters, etc.
- a group of RGB light sources, and/or a single mirror in a digital micromirror device are used to display one pixel.
- the light from each of these sources stimulates the cones and rods in the eye according to the spectral response of the pigment for each of those sensors.
- the vision system translates the response of the cones into the millions of colors a typical human can see. Different hues are created by setting different output intensities for each of the individual colors.
- the intensity of each color is encoded with a certain bit-precision. For 3-bit color, 8 levels of each color can be chosen for 512 distinct colors. On modern displays, this is typically extended to 16.7 million colors by assigning 8 bits to each color channel.
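The color counts above follow from the per-channel bit depth; a quick sketch:

```python
# Distinct colors for a given per-channel bit depth, with three channels
# (R, G, B) chosen independently.
def color_count(bits_per_channel):
    levels = 2 ** bits_per_channel  # intensity levels per channel
    return levels ** 3              # independent R, G, B combinations

print(color_count(3))  # 512
print(color_count(8))  # 16777216 (~16.7 million)
```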
- Figure 2 is a block diagram of one embodiment of the head mounted display system.
- the generation of the virtual image created by a head mounted display starts in the computation system 210.
- This computation system 210 can be a desktop computer with a video card, a system on a chip that includes a processor and graphics processor, similar to those used in cell phones, or a cloud-based system in which distributed computers provide the processing.
- the graphics engine 220 in some embodiments takes in data from sensor inputs 250, such as cameras 252, eye-tracking sensors 254, ambient light sensors 256, and biosensors 258, to encode the appropriate color for each individual pixel into an array of values that constitute one frame of data.
- the graphics engine 220 in one embodiment generates pixel data for all three color values.
- graphics engine 220 includes resolution selector 225, to select the resolution for each of the colors. In one embodiment, the resolution may differ by color. In another embodiment, one color may have a higher resolution than the other colors. In one embodiment, the higher resolution color is green.
- the system in one embodiment includes a modulator 230 which modulates a portion of the light from the graphics engine 220.
- the modulation may be to alter the resolution, focal distance, and/or foveation.
- the modulation may be part of the graphics engine 220.
- the computation system 210 provides the settings for the light data, which may include one or more of: the resolution, focal position for a foveated image, focal distance, and field of view for each of the colors.
- the green light which is perceived at the highest resolution by the human eye has the highest resolution, while blue and red light have a lower resolution. In one embodiment, this may be achieved by using a down sampler 233 to down-sample the blue and red light.
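A minimal sketch of one way a down sampler such as 233 might reduce the red and blue channels while green passes through at full resolution. The stride-subsampling (nearest-neighbor) choice and the `downsample` helper are our own illustration; the patent does not specify the filtering method:

```python
def downsample(channel, factor):
    """Keep every `factor`-th pixel in each dimension (nearest-neighbor)."""
    return [row[::factor] for row in channel[::factor]]

# A 4x4 red channel reduced to 2x2; the green channel would be passed
# through unchanged at its full resolution.
red = [[r * 4 + c for c in range(4)] for r in range(4)]
print(downsample(red, 2))  # [[0, 2], [8, 10]]
```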
- the resolution selector 225 in the graphics engine 220 may be a separate light engine for the first subset of light, which is at a higher resolution than the image data generated for the remaining portion of the light.
- the modulation comprises the positioning of a foveated image, using foveated image positioner 238.
- the foveated image positioner 238 utilizes data from sensors 250 to position the foveated image for the user.
- a subset of the light may have a different focal distance.
- the green light may be at a near distance, while the red and blue light are at an infinite focal distance.
- red/green may be at one focal distance, while blue/green are at another focal distance.
- the focal distance logic 236 selects the focal distance for each of the colors.
- the system includes a subset of the colors of the light which is altered.
- the settings for the light may alter its foveated position, focal distance, field of view, and/or resolution, by color.
- the remaining unaltered light may include all colors as well.
- This data is sent over a high-speed data channel 245 from the computation system 210 to the optics system 260.
- Computing the pixel values and encoding them into this array must be done very quickly to prevent simulator sickness in VR/AR and to present an object locked to the real world in AR.
- Frame rates are typically around 90 Hz, or a new frame every 0.011 seconds.
- This computation is an intensive process that uses a lot of energy and generates a lot of heat. Both of these are challenges for a mobile HMD because batteries 240 to provide the necessary power are heavy and heat around the user’s head is uncomfortable.
- a virtual reality (VR) HMD blocks out the light from the real world and presents an entirely virtual reality to the user.
- the optical architecture of a VR display is, in simple terms, an opaque display 270, such as an organic light emitting diode array, with a magnifying lens 275 in front of it.
- VR HMDs are usually very large because they have a large FOV and need a lot of pixels to create even a blocky image for the user. A large number of pixels requires a large display with a lot of computing power, which requires a lot of energy to drive.
- An augmented reality (AR) HMD creates a virtual image that mixes with incoming light and augments what a user would already see in the world.
- the optical design of an AR system is more complicated than VR because it combines the virtual image with the real image of the world. This can be accomplished in many ways. In one way, the system uses cameras to capture the light coming from the real world, then combines that with the AR images in the graphics processing unit, which is then displayed in the HMD. This is referred to as passthrough AR. Another way is to combine the photons from the real world directly with the generated AR images using a transparent optical combiner, such as a waveguide, birdbath partial mirror, or holographic optical element. This is referred to as see-through AR.
- the optics system 260 may include an opaque virtual reality (VR) display 270 or may include lenses 275 to enable an augmented reality (AR) display.
- the AR system is a see-through system in which the display elements are transparent so that the real world can be perceived directly.
- the optics system includes optical combiner assembly 280 which includes one or more optical combiners.
- the optical combiners in one embodiment, are one or more waveguides.
- the optical combiner assembly 280 directs the light to the user’s eye.
- the system includes one or more display engines 285.
- the optical combiner assembly 280 may determine the focal distance for the portion of the light that utilizes the optical combiner. Thus, with two or more optical combiners, the light may be shown at two or more focal distances.
- the display engines 285 generate the light which is passed through the optical combiner(s).
- the system may include a foveated image, which is a smaller image with a higher resolution.
- foveated display element 290 is provided to move the foveated display within the field of view, to position it.
- Other elements such as positioning mirrors and lenses may be used, as is known in the art.
- Waveguides are one kind of optical combiner that is used to mix the virtual image of the head mounted display with other light. In an AR system, that light is mixed with light coming from the real world. In a VR system, that light could be mixed with another opaque display, such as an OLED or LCD panel.
- One or more waveguides which transmit data associated with a single pixel may be referred to as a waveguide assembly, or optical combiner assembly 280. While the present system generally is discussed with a waveguide, one of skill in the art would understand that other optical combiners may be used, in any of the below embodiments.
- optical combiners may include reflective holographic optical elements (HOEs), curved mirrors, computational holographic displays, birdbath optics including a semi-transparent mirror and beam splitter, or other designs.
- the source display is coupled into the waveguide of optical material by an input coupler.
- the light rays bounce inside the optical material because their angle of incidence is greater than the critical angle for that material. This is known as total internal reflection (TIR).
- the light rays continue to travel via TIR down the waveguide until they interact with an out-coupler that causes the light rays to leave the waveguide and go towards the user’s eye.
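The TIR condition can be computed from Snell's law. The refractive index n = 1.5 here is our assumption of a typical glass waveguide in air, not a value from the patent:

```python
import math

def critical_angle_deg(n_core, n_clad=1.0):
    """Angle of incidence above which TIR occurs at the core/cladding boundary."""
    return math.degrees(math.asin(n_clad / n_core))

# Assuming a typical n ~= 1.5 glass waveguide surrounded by air:
print(round(critical_angle_deg(1.5), 1))  # 41.8
```

Rays coupled in at steeper incidence angles than this stay trapped and propagate down the waveguide until an out-coupler redirects them toward the eye.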
- There may be other elements inside a waveguide that move the light in other directions to make the eyebox of the system larger. These are known as eyebox expanders.
- In-couplers, out-couplers, and eye box expanders are referred to as diffractive optical elements (DOEs).
- Surface relief gratings are one type of DOE structure. They have very small grooves and are placed in the areas where light must be diffracted in a different direction.
- These gratings can be made, for example, by nano-imprinting polymer on top of an optical substrate, they can be etched directly into the substrate, or they can be made in many other ways.
- the gratings can be perpendicular to the surface of the waveguide, or they can be slanted.
- the gratings can be pillars or grooves.
- Another way to make DOEs is with holographic films. These films can be polymers that have been exposed to create diffraction sites inside the polymer.
- When the films are laminated to the waveguide in the in-coupling, expander, or out-coupling regions, the light diffracts off of the sites, turning it in the necessary direction to TIR down the waveguide or be presented to the eye.
- DOEs are known in the art. Other methods of making DOEs in a waveguide or optical combiner may be used.
- the waveguide may have looser tolerances.
- the waveguide may have thickness variation less than 4 µm and warp less than 20 µm.
- other materials such as plastic rather than glass, and other manufacturing methods, such as injection molding, can be used to make the waveguide for lower resolutions.
- the system may also enable the use of magnification to reduce pixel density, as will be described below.
- Minimizing the number of individual waveguides is advantageous because it reduces the cost, complexity, and weight, and will increase the transparency of the HMD.
- a multiresolution optical combiner assembly provides data at two or more resolutions, based on wavelength. That is, the resolution of the image presented in one color will be different than the resolution presented in a different color. In one embodiment, because human eyes perceive green colored data at a higher resolution, the highest resolution portion of the image is in the green wavelength range.
- a higher resolution single color display engine is combined with a lower resolution display of the other colors.
- the higher resolution single color display engine is foveated, meaning it is directed to the user’s fovea.
- the combination provides the perception that the system has the field of view of the VR display and the resolution of the single color display engine.
- the red and blue channels are presented with a first, lower resolution, while the green channel is presented at a second, higher resolution to the user. Despite the lower resolution of two of the three channels, the perceived resolution is the resolution of the green channel. In one embodiment, the lower resolution is 5-40 pixels per degree (PPD), and the higher resolution is 30-60 PPD.
- the blue, red, and green channels are each presented at different resolutions, from lowest to highest. In one embodiment, the blue channel is presented at the lowest resolution (5-20 PPD), the red channel is presented at an intermediate resolution (10-40 PPD), and the green channel is presented at the highest resolution (30-120 PPD).
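The per-channel savings can be sketched with representative values. The 40° square FOV and the specific PPD picks (15/30/60 for blue/red/green) are our own choices from within the ranges stated above:

```python
# Illustrative per-channel pixel budgets for a 40-degree square field,
# with blue lowest, red intermediate, and green highest PPD.
fov = 40  # degrees

pixels = {c: (fov * ppd) ** 2 for c, ppd in
          {"blue": 15, "red": 30, "green": 60}.items()}
uniform_rgb = 3 * (fov * 60) ** 2  # all three channels at the green PPD

print(pixels)                              # per-channel pixel counts
print(sum(pixels.values()) / uniform_rgb)  # fraction of the uniform budget
```

With these numbers the three-resolution design drives well under half the pixels of a uniform full-RGB panel, while the perceived resolution tracks the green channel.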
- each display engine is used for each color.
- the resolutions correspond to the resolution ranges above.
- each single-color light engine directs its image to an in-coupling grating that is not in the path of any other color.
- the three displays are combined together into one image using optical elements, such as an X-cube or X-plate, or other arrangements of dichroic mirrors, or other optical elements, and that image is sent to an optical combiner.
- two display panels may be used, one for red and blue, having the same resolution, and one for green with a higher resolution.
- a single three color display engine may be used.
- the output of a display engine may be separated for input to different optical combiners.
- the configuration of the display engines, whether one, two, or three display engines are used, is not determinative. It may be altered in any of the configurations below.
- the initial image has the resolution of the green channel, and the red and blue channels are down-sampled (reducing the pixel count of a frame) while the green channel is kept at a high resolution.
- the down-sampling is in the range of 1/4 to 1/2 of the green channel resolution.
- the red and blue channels are down-sampled at the same rate. Alternatively, they may be down-sampled at different rates. This reduces the computing power needed to generate each frame, and the power used to present the image to the user.
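The frame-data reduction follows directly. This sketch treats the down-sampling factor as a linear-resolution scale (our interpretation) and picks 1/4 as one point from the range stated above:

```python
# Relative per-frame data when red and blue are down-sampled to a fraction
# of the green linear resolution, with green kept at full resolution.
def frame_fraction(rb_scale):
    green = 1.0
    red = blue = rb_scale ** 2  # pixel count scales with the square
    return (green + red + blue) / 3.0  # vs. three full-resolution channels

print(frame_fraction(0.25))  # 0.375
```

At a 1/4 linear scale, the frame carries only 37.5% of the pixel data of a uniform full-RGB frame, which is where the compute and power savings come from.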
- the resolution of the red/blue channels is reduced by increasing the magnification, and thus having larger pixels (e.g. fewer pixels per degree). In one embodiment, this may be used to increase the field of view of the red/blue channels, providing a larger field of view with the same display engine.
- the magnification may be differential magnification, such that the magnification level varies by distance from the fovea/image focus.
- FIG. 3 illustrates one embodiment of a VR HMD which has a microdisplay, such as a virtual reality display panel 310, which in one embodiment is an OLED panel, with a lower resolution array of LEDs which is combined with a higher resolution display 360.
- this illustration shows only one eyebox, and a single light ray.
- this is a simplification to make the figure easier to understand.
- a waveguide 330 or other optical combiner projects images from the higher resolution single-color pixels 360.
- the images from the panel 310 pass through VR optics 320.
- the light from the panel 310 passes through waveguide 330, but is not directed along the waveguide 330.
- the lower resolution array of the RGB OLED 310 includes two of the three colors. In one embodiment, the colors of the lower resolution array of the OLED are only red and blue.
- the lower resolution array of the OLED 310 may be a standard three color OLED, and the system sets the green channel to not send data.
- the high resolution single color display 360 is green.
- the higher resolution color is another color wavelength that substantially stimulates both the M and L cones.
- the color is yellow or orange. Having a lower resolution display for at least some of the colors reduces the overall HMD power consumption and weight because there are fewer total pixel values to compute, while the perceived resolution is high because of the higher resolution of the green image displayed through the waveguide 330.
- the higher resolution single color display engine has a resolution of 40-60 ppd.
- the high resolution image from display engine 360 is coupled into the waveguide 330 through in-coupler 340, and out-coupled through out-coupler 350.
- the high resolution image sent through the waveguide 330 is dynamically foveated. Dynamic foveation targets a high resolution image to the user’s fovea, which has the highest perceived resolution.
- the system can reduce the field of view of the high resolution image, which lowers the pixel count of the high resolution image, while maintaining the perceived resolution at the high resolution level.
- the reduced field of view reduces the power consumption while maintaining high perceived resolution over the entire field of view.
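The pixel-count savings from reducing the field of view of the high resolution image can be shown with back-of-the-envelope arithmetic. The field-of-view and pixels-per-degree figures below are assumed for illustration only:

```python
def pixel_count(fov_h_deg: float, fov_v_deg: float, ppd: float) -> int:
    """Pixels needed to cover a field of view at a given pixels-per-degree."""
    return int(fov_h_deg * ppd) * int(fov_v_deg * ppd)

full = pixel_count(40, 40, 60)       # high resolution over the whole FOV
foveated = pixel_count(10, 10, 60)   # small high-res inset at the gaze point
field = pixel_count(40, 40, 15)      # low-res field image covers the rest

print(full, foveated + field)        # 5760000 720000 (an 8x reduction)
```

Under these assumed numbers, a foveated inset plus a low-resolution field image needs roughly one eighth the pixels of a uniformly high-resolution display, while the perceived resolution at the gaze point is unchanged.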
- when the high resolution image is dynamically foveated, the opaque VR display panel 310 provides a separate field image that is also displayed.
- the field (or lower resolution) image also includes the green channel.
- the green waveguide transmitting the high resolution foveated image also includes the lower resolution OLED image outside the foveal area.
- the VR display panel 310 includes green for the field image area.
- the field display may have a cutout for the foveated display area.
- Figure 4 illustrates one embodiment of a system in which a single optical propagator is used with separate display engines.
- the system includes three separate display engines 430, 440, 450.
- the light output by each of the display engines 430, 440, 450 may have different resolutions.
- the light from the green display engine 430 has a higher resolution than the light output of the red display engine 440 and blue display engine 450.
- the system includes a waveguide 410, which includes in-couplers 435, 445, 455 for each of the display engines.
- the in-couplers do not overlap, and are physically displaced from each other.
- the waveguide 410 includes a single out-coupler 420, in one embodiment.
- waveguides may be optimized for certain frequency ranges; in one embodiment, the waveguide 410 is optimized for the green light.
- Figure 5 illustrates one embodiment of an augmented reality (AR) head mounted device (HMD), in which one or more colors use a first waveguide 510, while one or more higher frequency colors use a second waveguide 520.
- the red and blue color information 515 is presented through the first waveguide 510 at a lower resolution and the green color information 525 is presented through the second waveguide 520 at a higher resolution.
- a three color display engine 530 may generate the image data for both the red and blue light 515 and the green light 525.
- the appropriate waveguide is selected based on frequency.
- a single display engine 530 may be used, and the system can separate the outputs by frequency (wavelength).
- the in-couplers for the waveguides 510, 520 are frequency selective, and in-couple the appropriate color channels.
- the color channels with lower resolution can be sent through cheaper, lower quality waveguides with the green light sent through a waveguide 520 with better imaging capabilities.
- the waveguides for the lower resolution colors are made of plastic.
- the waveguides for the color channels with the lower resolution are made from glass with looser flatness specifications.
- the green light is dynamically foveated, as discussed above.
- each waveguide has a separate in-coupler.
- the in-couplers are color-selective, such that each set of wavelengths is coupled into the appropriate waveguide. This improves on current products because cheaper, lighter, lower quality imaging materials can be used for the red and blue waveguides, such as plastic.
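The color-selective in-coupling described above amounts to routing each wavelength to the appropriate waveguide. The sketch below is purely illustrative; the wavelength band and waveguide names are assumptions, not part of the disclosure:

```python
GREEN_BAND_NM = (495, 570)  # approximate green wavelength band (assumed)

def route_to_waveguide(wavelength_nm: float) -> str:
    """Send green light to the higher-quality waveguide; red and blue go to
    the cheaper, lower-resolution waveguide (e.g. plastic)."""
    lo, hi = GREEN_BAND_NM
    return "green_waveguide" if lo <= wavelength_nm <= hi else "rb_waveguide"

print(route_to_waveguide(532))  # green_waveguide
print(route_to_waveguide(635))  # rb_waveguide (red)
```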
- Figure 6B illustrates one embodiment of a three-propagator configuration, in which each color channel has a separate display engine 650, 655, 660 and optical propagator 654, 659, 665.
- the three waveguide system utilizes separate display engines 650, 655, 660, for each of the colors.
- the in-couplers for each of the colors are displaced with respect to each other. Having separate display engines enables the green display engine 660 to be a higher resolution, different focal distance, or be foveated.
- Figure 6C illustrates one embodiment of a two-propagator configuration, in which each optical propagator has an associated display engine.
- each optical propagator has an associated display engine.
- the quality of the green waveguide 695 may be higher than the quality of the red/blue waveguide 690.
- the problem compounds when the focus of light is considered.
- the human eye can change its focal depth by distorting its lens; this is called accommodation.
- accommodation distance needs to match the distance of the gaze point, which is the point at which the gaze vectors from both eyes intersect in space.
- vergence-accommodation conflict causes headaches and other adverse physiological effects.
- the head mounted display can only display virtual objects at one focal distance, the range at which these objects can be displayed needs to be severely limited so as not to cause a vergence-accommodation conflict.
- the out-coupler of a waveguide creates a virtual image at an infinite focus.
- optical power can be added to the out-coupler to change the focal point from infinity to bring it closer to the head, however, this change has to be applied to each out-coupler and is fixed for that waveguide.
- an optic is put between the out-coupler and the eye to move the focus in.
- a compensating optic is required on the far side of the waveguide such that the light from the real world isn’t affected by the thin-lens.
- One way to provide a multifocal display is to use two sets of waveguides, one set for RGB at one focal point and another set for RGB at a different focal point. However, this doubles the total waveguide count and increases system complexity as well as weight.
- the present system includes two or more waveguides which have different focal distances.
- Figure 7 illustrates one embodiment in which a first waveguide 710 guides RGB light 715 at a far focus and a second waveguide 720 guides green-only light 725 at a near focus.
- the RGB far focus light is focused at a distance within the range of 0.5-∞ meters
- the green-only near focus light is focused at a distance within the range of 0.25-1 meters.
- a waveguide selector 730 directs the light to the appropriate waveguide.
- the waveguide selector 730 may use polarization to guide a portion of the green light to the RGB waveguide 710 and to the green-only waveguide 720.
- Figure 9 illustrates one embodiment in which a first waveguide 910 guides the blue and green light 915 of the image and creates a virtual image that is focused at a further Z distance.
- the second waveguide 920 guides red and green light 925 and is focused at a nearer Z distance.
- the further (blue-green) Z distance is in the range of 0.5-∞ meters
- the nearer (red-green) Z distance is in the range of 0.25-1 meters.
- a waveguide selector 940 may use optical techniques, such as polarization control, to cause the light to couple into only one of the two waveguides.
- the waveguide selector 940 is a beam splitter.
- a color filter is used to cause the light to couple into the appropriate waveguide.
- the color filter is a reflective filter.
- the display engine 930 alternates displaying red-green frames and blue-green frames and the waveguide selector 940 is a time based selector.
- the red-green and blue-green waveguides are switched such that the waveguide displaying the image data nearer to the user is the blue-green waveguide and the waveguide for the image data further away is the red-green waveguide.
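The time-based selector can be sketched as a simple frame schedule. The waveguide names and the frame-to-waveguide assignments below are illustrative (following the Figure 9 arrangement, with red-green at the nearer focus; the disclosure notes the assignments may also be switched):

```python
from itertools import cycle

# Which waveguide receives each frame type (assumed names).
FRAME_TO_WAVEGUIDE = {
    "red-green": "near_waveguide",   # focused at the nearer Z distance
    "blue-green": "far_waveguide",   # focused at the farther Z distance
}

frames = cycle(["red-green", "blue-green"])  # display engine alternates frames

schedule = []
for _ in range(4):
    frame = next(frames)
    schedule.append((frame, FRAME_TO_WAVEGUIDE[frame]))

print(schedule)
```

Each frame pair together delivers all three colors, so a full-color multifocal image is perceived from only two waveguides.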
- in FIG. 10, there is a waveguide with R, G, and B 1010 displaying data at a first, farther focal length at a distance within the range of 0.5-∞ meters, and another green-only waveguide 1020 displaying data at a second, nearer focal length at a distance within the range of 0.25-1 meters.
- the display data for the RGB light 1015 is produced by three color display engine 1040, while the display data for the green-only light 1025 is produced by green-only display engine 1030.
- the image for the green-only display engine 1030 is dynamically foveated.
- the RGB display engine 1040 is foveated.
- both display engines are foveated.
- more waveguides that are focused at different distances are combined to produce more than two focal lengths, e.g. 3 waveguides could provide focal planes within the ranges of: 0.5-∞ meters, 0.25-1 meters, and 0.1-0.5 meters.
- Each focal plane has at least one wavelength in high resolution, in one embodiment this wavelength is green, to provide the spatial information at that focal plane.
- Some of the other focal lengths will have other colors to provide the color information of the image.
- the pupils are spatially separated. That is, the in-coupler for the first waveguide 1010 is spatially separated from the in-coupler for the second waveguide 1020.
- the human visual system senses colors differently across the field of view (FOV) of the eye, because the distribution of pigmented cones varies across the field of view.
- the design of the optical combiner assembly can take this distribution into account in order to create large fields of view with lower total color pixel count. For instance, the medium and long cones are found in high concentrations near the fovea. The region outside the fovea is dominated by rods and short cones.
- one waveguide could carry green and red light over the field of view that is scanned by the fovea of a rotating eyeball, minimally 30 H by 30 V degrees up to 70 H by 55 V degrees, and another waveguide could display blue light over a much larger field of view, up to 135 H by 180 V degrees, to create the perception of a FOV of 135 H by 180 V degrees, but with fewer overall pixels.
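The pixel budget of such a split-FOV arrangement can be estimated as follows. The pixel densities assumed for each waveguide are illustrative, not values from the disclosure:

```python
def pixels(fov_h_deg: float, fov_v_deg: float, ppd: float) -> int:
    return int(fov_h_deg * ppd) * int(fov_v_deg * ppd)

green_red = pixels(70, 55, 60)   # high resolution over the fovea-scanned FOV
blue = pixels(135, 180, 10)      # assumed low density over the full FOV
uniform = pixels(135, 180, 60)   # 60 ppd everywhere, for comparison

print(green_red + blue, "vs", uniform)  # 16290000 vs 87480000
```

Under these assumptions the split design needs roughly a fifth of the pixels of a uniformly high-resolution display covering the same total field of view.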
- Figure 11 illustrates one embodiment of a multi-FOV display, in which a red-green waveguide 1120 outputs red and green light 1125 with a smaller field of view than the blue light 1115 output by blue only waveguide 1110.
- the in-couplers for the color channels are spatially separated.
- the resolution of the blue channel is lower than the resolution of the red and green channels.
- Figure 12 illustrates another embodiment of the multi-FOV display in which an RGB waveguide 1210 outputs red, blue, and green light 1215 from three color display engine 1240, with a larger field of view, and lower resolution than a green-only display 1230 through green-only waveguide 1220.
- the waveguides are different sizes, with the green-only waveguide 1220 a smaller size.
- the in-couplers are different sizes as well.
- the in-coupler for the green light 1225 is smaller than the RGB incoupler.
- the in-coupler for the green-only waveguide 1220 is also smaller than the out-coupler of the RGB waveguide 1210.
- the relative sizes of the waveguides, in-couplers, and out-couplers may differ between the waveguides.
- Figure 13 is a block diagram of a particular machine that may be used with the present invention. It will be apparent to those of ordinary skill in the art, however, that other alternative systems of various system architectures may also be used.
- the data processing system illustrated in Figure 13 includes a bus or other internal communication means 1340 for communicating information, and a processing unit 1310 coupled to the bus 1340 for processing information.
- the processing unit 1310 may be a central processing unit (CPU), a digital signal processor (DSP), or another type of processing unit 1310.
- the system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 1320 (referred to as memory), coupled to bus 1340 for storing information and instructions to be executed by processor 1310.
- Main memory 1320 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 1310.
- the system also comprises in one embodiment a read only memory (ROM) 1350 and/or static storage device 1350 coupled to bus 1340 for storing static information and instructions for processor 1310.
- the system also includes a data storage device 1330 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system.
- Data storage device 1330 in one embodiment is coupled to bus 1340 for storing information and instructions.
- the system may further be coupled to an output device 1370, such as a cathode ray tube (CRT) or a liquid crystal display (LCD) coupled to bus 1340 through bus 1360 for outputting information.
- the output device 1370 may be a visual output device, an audio output device, and/or tactile output device (e.g. vibrations, etc.)
- An input device 1375 may be coupled to the bus 1360.
- the input device 1375 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 1310.
- An additional user input device 1380 may further be included.
- cursor control device 1380 such as a mouse, a trackball, stylus, cursor direction keys, or touch screen, may be coupled to bus 1340 through bus 1360 for communicating direction information and command selections to processing unit 1310, and for controlling movement on display device 1370.
- Another device which may optionally be coupled to computer system 1300 is a network device 1385 for accessing other nodes of a distributed system via a network.
- the communication device 1385 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices.
- the communication device 1385 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 1300 and the outside world.
- control logic or software implementing the present invention can be stored in main memory 1320, mass storage device 1330, or other storage medium locally or remotely accessible to processor 1310.
- the present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above.
- the handheld device may be configured to contain only the bus 1340, the processor 1310, and memory 1350 and/or 1320.
- the handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 1375 or input device #2 1380.
- the handheld device may also be configured to include an output device 1370 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.
- the present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a kiosk or a vehicle.
- the appliance may include a processing unit 1310, a data storage device 1330, a bus 1340, and memory 1320, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device.
- the more special-purpose the device is, the fewer of the elements need be present for the device to function.
- communications with the user may be through a touch-based screen, or similar mechanism.
- the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network- based connection through network device 1385.
- control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 1310.
- a machine-readable medium includes any mechanism for storing information in a form readable by a machine (e.g. a computer).
- a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage.
- the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).
- the present application describes and illustrates various embodiments of the system.
- the number of display engines, number of waveguides, and colors adjusted may be varied without departing from the scope of the present invention.
- the settings of the color channels may include any combination of differences in resolution, field of view, focal distance, and foveation.
- the system may modify the generated blue, red, and/or green channels, to create the difference in the settings between the color channels, without departing from the scope of the invention.
- the configurations illustrated herein may be mixed and matched.
- the system may include one or more waveguides, one or more display engines, and separate the color channels into any combination of one, two and/or three colors, and remain within the scope of the present disclosure.
Abstract
A head mounted display system to display an image, the head mounted display system comprising a display engine to generate light for a display, the system configured to apply color specific settings to one or more colors of the light. In one embodiment, the color specific settings comprise one or more of: colors having different resolutions, different focal distances, and different fields of view.
Description
A HEAD MOUNTED SYSTEM WITH COLOR SPECIFIC MODULATION
RELATED APPLICATIONS
[0001] The present application claims priority to U.S. Provisional Application No. 62/957,777 filed on January 6, 2020, and incorporates that application by reference in its entirety.
FIELD OF THE INVENTION
[0002] The present application relates to head mounted displays, and more particularly to color specific modulation in head mounted displays.
BACKGROUND
[0003] There is a need for low-weight low-power head mounted displays (HMD). The core challenge is creating a high resolution, full color, large field of view (FOV), low power, high heat dissipation display that can be comfortably worn on the head. To maintain a high resolution (pixel per degree or PPD) over a large field of view, displays with large numbers of pixels are generally required. For instance, 60 pixels per degree is at the limit of the angular resolution of the typical human eye. To provide enough pixels for a head mounted display with a field of view of 40° horizontal (H) by 40° vertical (V), at 60 pixels per degree, requires a display resolution of 2400 x 2400 pixels, or 5.76 Megapixels per eye. A display panel with this resolution is typically very large because individual pixels have a minimum size. This requires compromises in the industrial design of the head mounted display. The display panel also requires a lot of power to drive the pixels and perform the computation for each pixel value at the frame rates for head mounted displays. The tradeoffs get worse as the field of view gets larger.
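The resolution arithmetic in the paragraph above can be reproduced directly:

```python
fov_h_deg, fov_v_deg, ppd = 40, 40, 60  # 40 H x 40 V degrees at 60 ppd

h = fov_h_deg * ppd   # 2400 horizontal pixels
v = fov_v_deg * ppd   # 2400 vertical pixels

print(f"{h} x {v} = {h * v / 1e6:.2f} Megapixels per eye")
# 2400 x 2400 = 5.76 Megapixels per eye
```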
[0004] The field of view of a typical human eye is 135° H by 180° V, but the eye cannot resolve 60 pixels per degree across this field of view. The field of view where the eye can resolve maximum acuity is typically 30° H by 30° V and
maximally 70° H by 55° V. The maximal case would require a display panel with a resolution of 4,200 x 3,300, or ~ 14 Megapixels just to cover the high resolution area of the FOV of the eye. To cover the peripheral space beyond that would require even more pixels, and thus more space, computation, and power. With current technology, the display size and power requirements make comfortable, attractive form factors impossible.
BRIEF DESCRIPTION OF THE FIGURES
[0005] The present invention is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings and in which like reference numerals refer to similar elements and in which:
[0006] Figure 1 illustrates one embodiment of spectral responses to various colors.
[0007] Figure 2 is a block diagram of one embodiment of the system.
[0008] Figure 3 illustrates one embodiment of a virtual reality (VR) HMD system, in which one or more colors are displayed using a VR display in combination with a separate display engine for one or more higher frequency colors.
[0009] Figure 4 illustrates one embodiment of a system in which a single optical propagator is used with separate display engines.
[0010] Figure 5 illustrates one embodiment of an augmented reality (AR) HMD, in which one or more colors use a first optical propagator, while one or more higher frequency colors use a second optical propagator.
[0011] Figure 6A illustrates one embodiment of a three-propagator configuration, in which each color has a separate optical propagator.
[0012] Figure 6B illustrates one embodiment of a three-propagator configuration, in which each color has a separate display engine and optical propagator.
[0013] Figure 6C illustrates one embodiment of a two-propagator configuration, in which each optical propagator has an associated display engine.
[0014] Figure 7 illustrates one embodiment of a multi-focal waveguide in which a green-only waveguide provides a second focal distance.
[0015] Figure 8 illustrates one embodiment of a multi-focal waveguide in which a red-green waveguide and a blue-green waveguide are used.
[0016] Figure 9 illustrates another embodiment of a multi-focal waveguide in which a red-green waveguide and a blue-green waveguide are used.
[0017] Figure 10 illustrates one embodiment of a multi-focal waveguide in which separate inputs that are not in-line are used.
[0018] Figure 11 illustrates one embodiment of a multi-FOV waveguide.
[0019] Figure 12 illustrates another embodiment of the multi-FOV display.
[0020] Figure 13 is a block diagram of one embodiment of a computer system that may be used with the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0021] By optimizing the design of head mounted displays (HMD) to take advantage of the way human color vision works, HMDs can be made smaller, lighter, and more efficient without compromising resolution or field of view. In one embodiment, the system applies color specific modulation based on visual perception of wavelength, such that visual information is treated differently based on its color/wavelength. The system applies settings to one color channel to alter its format. In one embodiment, the settings applied to a subset of colors alter its resolution, focal distance, field of view, and/or foveation. In one embodiment, this change is applied to the green color channel. In one embodiment, the change is applied to another subset of colors. In one embodiment, the visual information comprises an alteration of focal distance, field of view, and/or pixel density by color. Other changes to one or two of the three colors in a display may be applied. In one embodiment, the wavelength based modulation takes advantage of the color perception of the human eye to create a display that has an improved quality, reduced cost, reduced power consumption, and/or reduced weight.
[0022] This improved HMD structure and design utilizes optical elements and color encoding in a new way, which reduces the size, power (battery), and processing requirements, and the heat around the user’s head while retaining the perceived pixel density (PePD) or visual acuity of the images. This improved design can be used with either a virtual reality (VR) system, an augmented reality (AR) system, or any other mixed reality or “XR” system in which virtual objects are generated and displayed.
[0023] One way to address the issues of HMDs is by using dynamic foveated displays. Dynamic foveated displays take advantage of the fact that the eye can only sense at its highest resolution within the foveal region, which is only a few degrees wide near the center of the field of view. The resolving power of the eye drops off very quickly, to ½ resolution at ~2.5° away from the center, all the way to ~1/15 at the edge of the field of view. Dynamic foveated displays place the high resolution image portion where the eye is looking. In this way, they are able to reduce the total number of pixels needed by many orders of magnitude to cover the full field of view of an eye. By further considering the structure of the color sensing of the human visual system, high resolution displays can be designed to be even more compact and efficient.
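The acuity falloff figures quoted above are roughly consistent with a simple hyperbolic eccentricity model. The model below, and its half-resolution eccentricity of 2.5 degrees, are an illustrative approximation, not a fitted clinical curve or part of the disclosure:

```python
def relative_acuity(eccentricity_deg: float, e2_deg: float = 2.5) -> float:
    """Hyperbolic falloff normalized to 1.0 at the fovea; acuity halves at
    the eccentricity e2 (~2.5 degrees here, per the text)."""
    return e2_deg / (e2_deg + eccentricity_deg)

print(relative_acuity(0))             # 1.0 at the fovea
print(relative_acuity(2.5))           # 0.5 at ~2.5 degrees from center
print(round(relative_acuity(35), 3))  # ~0.067, i.e. ~1/15 near the FOV edge
```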
[0024] For most people, the eye’s pixel resolution is higher for the green/yellow spectrum, and lower for the red and blue portions of the spectrum. In one embodiment, the system displays a higher resolution image in the green/yellow colors than the red/blue colors. This results in the eye perceiving a higher resolution image, because most of the sensors near the fovea are sensitive to green/yellow. In one embodiment, this color compression of the data stream reduces the amount of data that is processed and displayed and can also simplify the optics used.
[0025] Having diffractive optical elements with different pixel resolutions for different colors, designed to produce full color images, is an unexpected redesign with many benefits, such as lower cost, lighter weight, and reduced power consumption. This results in enabling smaller batteries, longer runtimes, and/or lower heat dissipation needs for head mounted displays.
[0026] In one embodiment, the system splits the focal distance by color, with a red/green combiner at a first focal distance, and a blue/green combiner at a second focal distance. In one embodiment, the combiners are waveguides. This permits the use of a system with two combiners (red/green and blue/green) instead of six combiners to provide a multi-focal display. It is well known in the art that a combiner design must transmit three colors, for a full color image to be perceived. Having combiners with only a subset of the three colors at different distances, designed to produce full color multifocal images, is an unexpected redesign with many benefits, such as lower cost, lighter weight, and reduced power consumption for longer runtimes for head mounted displays.
[0027] The following detailed description of embodiments of the invention makes reference to the accompanying drawings in which like references indicate similar elements, showing by way of illustration specific embodiments of practicing the invention. Description of these embodiments is in sufficient detail to enable those skilled in the art to practice the invention. One skilled in the art understands that other embodiments may be utilized, and that logical, mechanical, electrical, functional and other changes may be made without departing from the scope of the present invention. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present invention is defined only by the appended claims.
Human Color Vision System
[0028] Rods and cones are the two main photoreceptor cells in the eye that make sight possible. Rods are very sensitive to light and will respond to a single photon, however, they provide no information about color to the brain; color information is provided by the cones. Each cone has a wavelength sensitive pigment that has a specific spectral response. There are three types of cones in a typical human eye: short (S), medium (M), and long (L).
[0029] Figure 1 illustrates a typical spectral response of the rods and cones of an eye. The short pigment’s peak response is at shorter wavelengths in the blue portion of the color spectrum, the medium pigment’s peak response is at medium wavelengths at the high end of the green portion of the spectrum, and the long pigment’s peak response is at longer wavelengths near the yellow-orange portion of the spectrum. The spectral response from each cone is broad and there is significant overlap, especially between the medium and long cones. This means that there is a section of wavelengths from the short green section to the yellow portion of the visible spectrum that will stimulate both medium and long cones, but not the short cones. The spatial distribution of each type of cone can be used to design a more efficient, lighter, cheaper head mounted display.
[0030] The back of the retina in the highest resolution area, the fovea, includes three types of cones: long cones are red, medium cones are green, and short cones are blue. There are significantly fewer short cones than medium and long cones in the fovea. The typical ratio of M+L cones to S cones is ~ 14:1. Most of the resolving power of the eye comes from the light sensed by medium and long cones because their spatial density is so much higher, with the more sporadically spaced S cones providing spectral information at the smaller end of the visible range.
[0031] The improved HMDs described in this application can leverage how the eye works to overcome existing industry challenges.
Micro Displays
[0032] Typical displays create a color image by blending the light from separate color sources to create all of the colors in the display. In one embodiment, a typical display uses one red (R) source, one blue (B) source, and one (sometimes two) green (G) source. These sources can be light emitting diodes (LEDs),
microLEDs, lasers, a scanning laser, a single light source and a rapidly rotating wheel with sections of different color filters, etc.
[0033] In one embodiment, a group of RGB light sources, and/or a single mirror in a digital micromirror device (DMD), are used to display one pixel. The light from each of these sources stimulates the cones and rods in the eye according to the spectral response of the pigment for each of those sensors. The vision system translates the response of the cones into the millions of colors a typical human can see. Different hues are created by setting different output intensities for each of the individual colors. The intensity of each color is encoded with a certain bit-precision. For 3-bit color, 8 levels of each color can be chosen for 512 distinctive colors. On modern displays, this is typically extended to 16.7 million colors by assigning 8 bits to each color channel. A typical way to do this is to give 256 levels of color intensity to the three color channels: 8 bits (2⁸ = 256) each for red, blue, and green. This results in 256³ ≈ 16.8 million color combinations. There are many other ways to encode color data into digital values, such as YUV and its variants. Although the present application discusses using RGB light, one of skill in the art would understand that other ways of encoding color data may be used, without departing from the present invention.
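The color-combination counts given above follow directly from the bit depth per channel:

```python
def color_combinations(bits_per_channel: int, channels: int = 3) -> int:
    """Distinct colors from `channels` independent color sources."""
    return (2 ** bits_per_channel) ** channels

print(color_combinations(3))  # 512 distinct colors at 3 bits per channel
print(color_combinations(8))  # 16777216 (~16.8 million) at 8 bits per channel
```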
[0034] Combining these encoding approaches with the optical architectures listed below would yield further savings in compute power, video bandwidth requirements, and therefore the power consumption, overall size, weight and the industrial design of a product.
[0035] Figure 2 is a block diagram of one embodiment of the head mounted display system.
[0036] The generation of the virtual image created by a head mounted display 200 starts in the computation system 210. This system 210 can be a desktop computer with a video card, a system on a chip that includes a processor and graphics processor, similar to those used in cell phones, or a cloud-based system in which distributed computers provide the processing. The graphics engine 220 in some embodiments takes in data from sensor inputs 250, such as cameras 252, eye-tracking sensors 254, ambient light sensors 256, and biosensors 258, to encode the appropriate color for each individual pixel into an array of values that constitute one frame of data. The graphics engine 220 in one embodiment generates pixel data for all three color values. In one embodiment, graphics engine 220 includes
resolution selector 225, to select the resolution for each of the colors. In one embodiment, the resolution may differ by color. In another embodiment, one color may have a higher resolution than the other colors. In one embodiment, the higher resolution color is green.
[0037] The system in one embodiment includes a modulator 230 which modulates a portion of the light from the graphics engine 220. The modulation may be to alter the resolution, focal distance, and/or foveation. In one embodiment, the modulation may be part of the graphics engine 220. The computation system 210 provides the settings for the light data, which may include one or more of: the resolution, focal position for a foveated image, focal distance, and field of view for each of the colors.
[0038] In one embodiment, the green light, which is perceived at the highest resolution by the human eye, has the highest resolution, while the blue and red light have a lower resolution. In one embodiment, this may be achieved by using a down sampler 233 to down-sample the blue and red light. In another embodiment, the resolution selector 225 in the graphics engine 220 may be a separate light engine for the first subset of light, generating image data at a higher resolution than the image data generated for the remaining portion of the light.
[0039] In one embodiment, the modulation comprises the positioning of a foveated image, using foveated image positioner 238. In one embodiment, the foveated image positioner 238 utilizes data from sensors 250 to position the foveated image for the user.
[0040] In one embodiment, a subset of the light may have a different focal distance. For example, the green light may be at a near distance, while the red and blue light are at an infinite focal distance. Alternatively, red/green may be at one focal distance, while blue/green are at another focal distance. The focal distance logic 236 selects the focal distance for each of the colors.
[0041] In one embodiment, only a subset of the colors of the light is altered. Thus, in one embodiment, the settings for the light may alter its foveated position, focal distance, field of view, and/or resolution, by color. However, in one embodiment, the remaining unaltered light may include all colors as well.
[0042] This data is sent over a high-speed data channel 245 from the computation system 210 to the optics system 260. Computing the pixel values and
encoding them into this array must be done very quickly to prevent simulator sickness in VR/AR and to present an object locked to the real world in AR. Frame rates are typically around 90 Hz, or a new frame every 0.011 seconds. This computation is an intensive process that uses a lot of energy and generates a lot of heat. Both of these are challenges for a mobile HMD because batteries 240 to provide the necessary power are heavy and heat around the user’s head is uncomfortable.
[0043] Reducing the computation requirements reduces power consumption, which allows a smaller battery size, making the headset lighter and more comfortable, and reduces the generated heat, lowering the thermal dissipation requirement. But reducing computational requirements is in direct conflict with other system preferences, like high resolution and large field of view, both of which traditionally have been accomplished by adding more pixels. Because the total pixel count increases with the area of the FOV, it can grow to levels that are impractical to drive in a head mounted display: they require too much computing power and substantially increase the display panel size. To understand more of the system optimization tradeoffs, more detail on the optical architectures for virtual and augmented reality HMDs is helpful.
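The scaling described above, where total pixel count grows with the area of the FOV at a fixed angular resolution, can be illustrated as follows; the FOV and pixels-per-degree values are assumed for illustration:

```python
def total_pixels(h_fov_deg: float, v_fov_deg: float, ppd: float) -> float:
    """Pixel count for a display spanning the given field of view
    at a uniform angular resolution (pixels per degree)."""
    return (h_fov_deg * ppd) * (v_fov_deg * ppd)

# Doubling each FOV dimension at fixed angular resolution
# quadruples the pixel count (and the compute needed to drive it).
narrow = total_pixels(40, 40, 45)   # 3,240,000 pixels
wide = total_pixels(80, 80, 45)     # 12,960,000 pixels
print(wide / narrow)  # -> 4.0
```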
Optical Elements of an HMD
[0044] A virtual reality (VR) HMD blocks out the light from the real world and presents an entirely virtual reality to the user. The optical architecture of a VR display is, in simple terms, an opaque display 270, such as an organic light emitting diode array, with a magnifying lens 275 in front of it. VR HMDs are usually very large because they have a large FOV and need a lot of pixels to create even a blocky image for the user. A large number of pixels requires a large display and a lot of computing power, which requires a lot of energy to drive.
[0045] An augmented reality (AR) HMD creates a virtual image that mixes with incoming light and augments what a user would already see in the world. The optical design of an AR system is more complicated than VR because it combines the virtual image with the real image of the world. This can be accomplished many ways. In one way, the system uses cameras to capture the light coming from the real world, then combines that with the AR images in the graphics processing unit, which is then displayed in the HMD. This is referred to as a passthrough AR. Another way
is to combine the photons from the real world directly with the generated AR images using a transparent optical combiner, such as a waveguide, birdbath partial mirror, or holographic optical element. This is referred to as see-through AR.
[0046] The optics system 260 may include an opaque virtual reality (VR) display 270 or may include lenses 275 to enable an augmented reality (AR) display.
In one embodiment, the AR system is a see-through system in which the display elements are transparent so that the real world can be perceived directly. The optics system includes optical combiner assembly 280 which includes one or more optical combiners. The optical combiners, in one embodiment, are one or more waveguides. The optical combiner assembly 280 directs the light to the user’s eye.
In one embodiment, the system includes one or more display engines 285. In one embodiment, the optical combiner assembly 280 may determine the focal distance for the portion of the light that utilizes the optical combiner. Thus, with two or more optical combiners, the light may be shown at two or more focal distances. The display engines 285 generate the light which is passed through the optical combiner(s). In one embodiment, the system may include a foveated image, which is a smaller image with a higher resolution. For such configurations, foveated display element 290 is provided to move the foveated display within the field of view, to position it. Other elements such as positioning mirrors and lenses may be used, as is known in the art.
[0047] Waveguides are one kind of optical combiner that is used to mix the virtual image of the head mounted display with other light. In an AR system, that light is mixed with light coming from the real world. In a VR system, that light could be mixed with another opaque display, such as an OLED or LCD panel. One or more waveguides which transmit data associated with a single pixel may be referred to as a waveguide assembly, or optical combiner assembly 280. While the present system generally is discussed with a waveguide, one of skill in the art would understand that other optical combiners may be used, in any of the below embodiments.
[0048] The above benefits of lower resolution requirements apply to such alternative optical combiners as well. For example, optical combiners may include reflective holographic optical elements (HOEs), curved mirrors, computational holographic displays, birdbath optics including a semi-transparent mirror and beam splitter, or other designs. For these types of optical combiners as
well, the reduction in resolution provides flexibility in tolerances, weights, and materials used. Thus, one of skill in the art would understand that the present improvement may be utilized with any type of optical combiner assembly 280, not just waveguides.
A VR System Using an Optical Combiner
[0049] In a standard VR system using an optical combiner, the source display is coupled into a waveguide of optical material by an input coupler. The light rays bounce inside the optical material because their angle of incidence, measured from the surface normal, is greater than the critical angle for that material. This is known as total internal reflection (TIR). The light rays continue to travel via TIR down the waveguide until they interact with an out-coupler that causes the light rays to leave the waveguide and go towards the user's eye. There may be other elements inside a waveguide to move the light in other directions to make the eyebox of the system larger. These are known as eyebox expanders. In-couplers, out-couplers, and eyebox expanders are referred to as diffractive optical elements (DOEs).
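The TIR condition can be checked numerically from Snell's law; the refractive index below is a typical value for high-index waveguide glass, assumed here only for illustration:

```python
import math

def critical_angle_deg(n_core: float, n_outer: float = 1.0) -> float:
    """Critical angle at a core/outer-medium interface, from Snell's law:
    sin(theta_c) = n_outer / n_core. Rays striking the surface at angles
    greater than this (measured from the normal) undergo TIR."""
    return math.degrees(math.asin(n_outer / n_core))

theta_c = critical_angle_deg(1.8)   # glass (n ~ 1.8) against air
print(round(theta_c, 1))            # ~33.7 degrees
```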
[0050] Many different structures and materials can be used as DOEs in a waveguide. Surface relief gratings are one type of DOE structure. Surface relief gratings have very small grooves and are placed in the regions where light is to be diffracted in a different direction. These gratings can be made, for example, by nano-imprinting polymer on top of an optical substrate, they can be etched directly into the substrate, or they can be made in many other ways. The gratings can be perpendicular to the surface of the waveguide, or they can be slanted. The gratings can be pillars or grooves. Another way to make DOEs is with holographic films. These films can be polymers that have been exposed to create diffraction sites inside the polymer.
When the films are laminated to the waveguide in the in-coupling, expander, or outcoupling regions, the light diffracts off of the sites, turning it in the necessary direction to TIR down the waveguide or be presented to the eye.
[0051] The use of various types of DOEs is known in the art. Other methods of making DOEs in a waveguide or optical combiner may be used.
[0052] To maintain high resolution of the final image sent to the user, tight tolerances are required for the flatness of the waveguide; for instance, in one embodiment, a thickness variation of the material of less than 1 µm and a warp of less than 5 µm are used. These tight tolerances increase the production cost of the materials for waveguides. If the resolution requirements for the waveguide are lower, the waveguide may have looser tolerances. In one embodiment, for the lower resolution colors, the waveguide may have a thickness variation of less than 4 µm and a warp of less than 20 µm. In one embodiment, other materials, such as plastic rather than glass, and other manufacturing methods, such as injection molding, can be used to make the waveguide for lower resolutions. Thus, by making some of the waveguides lower-resolution waveguides, the overall product cost can be lowered and/or the product may be made lighter because of the increased flexibility in the waveguide requirements at lower resolutions. In one embodiment, the system may also enable the use of magnification to reduce pixel density, as will be described below.
[0053] Minimizing the number of individual waveguides is advantageous because it reduces the cost, complexity, and weight, and will increase the transparency of the HMD.
Multi-Resolution Waveguides
[0054] A multiresolution optical combiner assembly provides data at two or more resolutions, based on wavelength. That is, the resolution of the image presented in one color will be different than the resolution presented in a different color. In one embodiment, because human eyes perceive green colored data at a higher resolution, the highest resolution portion of the image is in the green wavelength range.
[0055] In one embodiment, a higher resolution single color display engine is combined with a lower resolution display of the other colors. In one embodiment, the higher resolution single color display engine is foveated, meaning it is directed to the user’s fovea. In one embodiment, the combination provides the perception that the system has the field of view of the VR display and the resolution of the single color display engine.
[0056] In one embodiment, the red and blue channels are presented at a first, lower resolution, while the green channel is presented at a second, higher resolution. Despite the lower resolution of two of the three channels, the perceived resolution is that of the green channel. In one embodiment, the lower resolution is 5-40 pixels per degree (ppd), and the higher resolution is 30-60 ppd.
[0057] In another embodiment, the blue, red, and green channels are each presented at different resolutions, from lowest to highest. In one embodiment, the blue channel is presented at the lowest resolution (5-20 ppd), the red channel is presented at an intermediate resolution (10-40 ppd), and the green channel is presented at the highest resolution (30-120 ppd).
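Using representative values from the ranges above, the pixel savings of such a per-channel scheme can be estimated; the field of view and the specific ppd values are chosen purely for illustration:

```python
FOV_H, FOV_V = 50, 40  # degrees, assumed for illustration

def channel_pixels(ppd: int) -> int:
    """Pixel count for one color channel at the given angular resolution."""
    return (FOV_H * ppd) * (FOV_V * ppd)

uniform = 3 * channel_pixels(60)  # all three channels at 60 ppd
multi = channel_pixels(60) + channel_pixels(30) + channel_pixels(15)  # G, R, B
print(f"multi-resolution uses {multi / uniform:.1%} of the uniform pixel count")
```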
[0058] To generate the multi-resolution display, in one embodiment, separate display engines are used for each color. In one embodiment, there are three different display engines 285, one for each color, each one with a different resolution and optionally a different field of view. In one embodiment the resolutions correspond to the resolution ranges above. In one embodiment, each single-color light engine directs its image to an in-coupling grating that is not in the path of any other color. In another embodiment, the three displays are combined together into one image using optical elements, such as an X-cube or X-plate, or other arrangements of dichroic mirrors, or other optical elements, and that image is sent to an optical combiner. In another embodiment, two display panels may be used, one for red and blue, having the same resolution, and one for green with a higher resolution. In another embodiment, a single three color display engine may be used. In one embodiment, the output of a display engine may be separated for input to different optical combiners. The configuration of the display engines, whether one, two, or three display engines are used, is not determinative. It may be altered in any of the configurations below.
[0059] In one embodiment, the initial image has the resolution of the green channel, and the red and blue channels are down-sampled (reducing the pixel count of a frame) while the green channel is kept at a high resolution. In one embodiment, the down-sampling is in the range of ¼-½ of the green channel resolution. In one embodiment, the red and blue channels are down-sampled at the same rate. Alternatively, they may be down-sampled at different rates. This reduces the computing power needed to generate each frame, and the power used to present the image to the user.
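A minimal sketch of down-sampling by decimation, assuming a channel is represented as a 2-D list of intensities; a real pipeline would low-pass filter first to avoid aliasing, and the names here are illustrative:

```python
def downsample(channel, factor):
    """Keep every `factor`-th sample in each dimension, reducing
    the pixel count by factor**2 (simple decimation)."""
    return [row[::factor] for row in channel[::factor]]

# A 4x4 red channel decimated by 2 becomes 2x2 (1/4 the pixels);
# the green channel would be passed through at full resolution.
red = [[r * 4 + c for c in range(4)] for r in range(4)]
print(downsample(red, 2))  # [[0, 2], [8, 10]]
```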
[0060] Other ways of generating lower resolution red/blue and/or higher resolution green image data may be used. In one embodiment, the resolution of the red/blue channels is reduced by increasing the magnification, and thus having larger pixels (e.g. fewer pixels per degree). In one embodiment, this may be used to increase the field of view of the red/blue channels, providing a larger field of view with the same display engine. In one embodiment, the magnification may be
differential magnification, such that the magnification level varies by distance from the fovea/image focus.
[0061] Figure 3 illustrates one embodiment of a VR HMD which has a microdisplay, such as a virtual reality display panel 310, which in one embodiment is an OLED panel, with a lower resolution array of LEDs which is combined with a higher resolution display 360. For simplicity, this illustration shows only one eyebox, and a single light ray. One of skill in the art would understand that this is a simplification to make the figure easier to understand.
[0062] In one embodiment, a waveguide 330 or other optical combiner projects images from the higher resolution single-color pixels 360. The images from the panel 310 pass through VR optics 320. In one embodiment the light from the panel 310 passes through waveguide 330, but is not directed along the waveguide 330.
[0063] In one embodiment, the lower resolution array of the RGB OLED 310 includes two of the three colors. In one embodiment, the colors of the lower resolution array of the OLED are only red and blue.
[0064] In another embodiment, the lower resolution array of the OLED 310 may be a standard three color OLED, and the system sets the green channel to not send data. In one embodiment, the high resolution single color display 360 is green. In another embodiment, the higher resolution color is another color wavelength that substantially stimulates both the M and L cones. In one embodiment, the color is yellow or orange. Having a lower resolution display for at least some of the colors reduces the overall HMD power consumption and weight because there are fewer total pixel values to compute, while the perceived resolution is high because of the higher resolution of the green image displayed through the waveguide 330. In one embodiment, the higher resolution single color display engine has a resolution of 40-60 ppd.
[0065] The high resolution image from display engine 360 is coupled into the waveguide 330 through in-coupler 340, and out-coupled through out-coupler 350. In one embodiment, the high resolution image sent through the waveguide 330 is dynamically foveated. Dynamic foveation targets a high resolution image to the user’s fovea, which has the highest perceived resolution. By having a high resolution image that is foveated, the system can reduce the field of view of the high resolution image, which lowers the pixel count of the high resolution image, while maintaining
the perceived resolution at the high resolution level. The reduced field of view reduces the power consumption while maintaining high perceived resolution over the entire field of view. In one embodiment, when the high resolution image is dynamically foveated, the opaque VR display panel 310 provides a separate field image that is also displayed. In this embodiment, the field (or lower resolution) image also includes the green channel. Thus, in one embodiment, the green waveguide transmitting the high resolution foveated image also includes the lower resolution OLED image outside the foveal area. In another embodiment, the VR display panel 310 includes green for the field image area. In one embodiment, the field display may have a cutout for the foveated display area. One embodiment of implementing such a foveated image display is described in U.S. Patent No. 10,514,546, issued on December 24, 2019, which is incorporated herein by reference.
[0066] Figure 4 illustrates one embodiment of a system in which a single optical propagator is used with separate display engines. The system includes three separate display engines 430, 440, 450. In one embodiment, the light output by each of the display engines 430, 440, 450 may have a different resolution. In another embodiment, the light from the green display engine 430 has a higher resolution than the light output of the red display engine 440 and blue display engine 450.
[0067] The system includes a waveguide 410, which includes in-couplers 435, 445, 455 for each of the display engines. In one embodiment, the in-couplers do not overlap and are physically displaced from each other. The waveguide 410 includes a single out-coupler 420, in one embodiment. Waveguides may be optimized for certain frequency ranges; in one embodiment, the waveguide 410 is optimized for the green channel.
[0068] Figure 5 illustrates one embodiment of an augmented reality (AR) head mounted device (HMD), in which one or more colors use a first waveguide 510, while one or more higher frequency colors use a second waveguide 520. In one embodiment, the red and blue color information 515 is presented through the first waveguide 510 at a lower resolution and the green color information 525 is presented through the second waveguide 520 at a higher resolution. By putting a majority of the image spatial information into a green channel that stimulates both the M and L cones, and leaving the red and blue channels at a lower resolution, the
structure of the image comes from the green channel 525 and the rest of the color gamut comes from the lower resolution blue and red channels 515. This lowers the total pixel count, lowering the power requirement and thus the weight and expense of creating an HMD. However, because of how the eye perceives images, the perceived resolution of the resulting image is similar to the higher resolution of the green channel. In one embodiment, a three color display engine 530 may generate the image data for both the red and blue light 515 and the green light 525. The appropriate waveguide is selected based on frequency. In one embodiment, a single display engine 530 may be used, and the system can separate the outputs by frequency (wavelength). In another embodiment, the in-couplers for the waveguides 510, 520 are frequency selective, and in-couple the appropriate color channels.
[0069] In one embodiment, the color channels with lower resolution, typically red and blue, can be sent through cheaper, lower quality waveguides with the green light sent through a waveguide 520 with better imaging capabilities. In one embodiment, the waveguides for the lower resolution colors are made of plastic. In one embodiment the waveguides for the color channels with the lower resolution are made from glass with looser flatness specifications.
[0070] In some embodiments, the green light is dynamically foveated, as discussed above.
[0071] In another embodiment, illustrated in Figure 6A, rather than presenting the red and blue data in a single waveguide, three waveguides are used, one for each color. The high resolution green data is presented in one waveguide 630, and the red and blue data are presented in separate waveguides 620, 610. In one embodiment, in this configuration, the red and blue light may have different resolutions. In one embodiment, red light 625 is medium resolution, and the blue light 615 is low resolution. In one embodiment, a three color display engine 640 is used. Each of the waveguides has a separate in-coupler. In one embodiment, the in-couplers are color-selective, such that each set of wavelengths is coupled into the appropriate waveguide. This improves on current products because cheaper, lighter, lower quality imaging materials can be used for the red and blue waveguides, such as plastic.
[0072] Figure 6B illustrates one embodiment of a three-propagator configuration, in which each color channel has a separate display engine 650, 655, 660 and optical propagator 654, 659, 665. The three waveguide system utilizes
separate display engines 650, 655, 660, for each of the colors. The in-couplers for each of the colors are displaced with respect to each other. Having separate display engines enables the green display engine 660 to be a higher resolution, different focal distance, or be foveated.
[0073] Figure 6C illustrates one embodiment of a two-propagator configuration, in which each optical propagator has an associated display engine. In this configuration, there is a green display engine 670 and a blue/red display engine 680. This allows adjustment of the resolution of the green channel 675 compared to the red/blue light 685. Furthermore, the quality of the green waveguide 695 may be higher than the quality of the red/blue waveguide 690.
Multi-Focal Waveguides
[0074] The challenge compounds when the focus of the light is considered. The human eye can change its focal depth by distorting its lens; this is called accommodation. For head mounted displays, the accommodation distance needs to match the distance of the gaze point, which is the point at which the gaze vectors from both eyes intersect in space. When these depths don't match, there is a vergence-accommodation conflict that causes headaches and other adverse physiological effects. If the head mounted display can only display virtual objects at one focal distance, the range at which these objects can be displayed needs to be severely limited so as not to cause a vergence-accommodation conflict.
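The mismatch described above can be quantified: focal demand is conventionally expressed in diopters (1/distance in meters), and the vergence angle follows from simple geometry. The interpupillary distance and the example distances below are assumed typical values, not values from the specification:

```python
import math

def vergence_angle_deg(distance_m: float, ipd_m: float = 0.063) -> float:
    """Angle between the two eyes' gaze vectors for a fixation point
    at the given distance (symmetric geometry; IPD assumed typical)."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# Content rendered at 0.5 m on a display with a fixed 2 m focal plane:
# the eyes converge for 0.5 m while accommodation is driven to 2 m.
mismatch_diopters = 1 / 0.5 - 1 / 2   # 1.5 D of vergence-accommodation conflict
print(round(vergence_angle_deg(0.5), 2))  # ~7.21 degrees of convergence
```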
[0075] Generally, the out-coupler of a waveguide creates a virtual image at an infinite focus. In one embodiment, optical power can be added to the out-coupler to change the focal point from infinity and bring it closer to the user; however, this change has to be applied to each out-coupler and is fixed for that waveguide. In another embodiment, an optic is put between the out-coupler and the eye to move the focus in. In this case, a compensating optic is required on the far side of the waveguide so that the light from the real world isn't affected by the thin lens. One way to provide a multifocal display is to use two sets of waveguides, one set for RGB at one focal point and another set for RGB at a different focal point. However, this doubles the total waveguide count and increases system complexity as well as weight.
[0076] In one embodiment, the present system includes two or more waveguides which have different focal distances.
[0077] Figure 7 illustrates one embodiment in which a first waveguide 710 guides RGB light 715 at a far focus and a second waveguide 720 guides green-only light 725 at a near focus. In one embodiment, the RGB far-focus light is focused at a distance within the range of 0.5-∞ meters, and the green-only near-focus light is focused at a distance within the range of 0.25-1 meters. In one embodiment, because green light is present in both waveguides 710, 720, a waveguide selector 730 directs the light to the appropriate waveguide. In one embodiment, the waveguide selector 730 may use polarization to guide a portion of the green light to the RGB waveguide 710 and the remainder to the green-only waveguide 720.
[0078] Figure 9 illustrates one embodiment in which a first waveguide 910 guides the blue and green light 915 of the image and creates a virtual image that is focused at a further Z distance. The second waveguide 920 guides red and green light 925 and is focused at a nearer Z distance. In one embodiment, the further (blue-green) Z distance is in the range of 0.5-∞ meters, and the nearer (red-green) Z distance is in the range of 0.25-1 meters. In one embodiment, a waveguide selector 940 may use optical techniques, such as polarization control, to cause the light to couple into only one of the two waveguides. In one embodiment, the waveguide selector 940 is a beam splitter. In one embodiment, a color filter is used to cause the light to couple into the appropriate waveguide. In one embodiment, the color filter is a reflective filter. In one embodiment, the display engine 930 alternates displaying red-green frames and blue-green frames and the waveguide selector 940 is a time-based selector.
[0079] In another embodiment, shown in Figure 8, the red-green and blue-green waveguides are switched, such that the waveguide displaying the image data nearer to the user is the blue-green waveguide and the waveguide for the image data further away is the red-green waveguide.
[0080] In one embodiment, illustrated in Figure 10, there is a waveguide with R, G, and B 1010 displaying data at a first, farther focal length at a distance within the range of 0.5-∞ meters, and another green-only waveguide 1020 displaying data at a second, nearer focal length at a distance within the range of 0.25-1 meters.
In one embodiment, the display data for the RGB light 1015 is produced by three color display engine 1040, while the display data for the green-only light 1025 is produced by green-only display engine 1030. In one embodiment, the image for the
green-only display engine 1030 is dynamically foveated. In one embodiment, the RGB display engine 1040 is foveated. In one embodiment, both display engines are foveated. In one embodiment, more waveguides that are focused at different distances are combined to produce more than two focal lengths; e.g., three waveguides could provide focal planes within the ranges of 0.5-∞ meters, 0.25-1 meters, and 0.1-0.5 meters. Each focal plane has at least one wavelength in high resolution; in one embodiment this wavelength is green, to provide the spatial information at that focal plane. Some of the other focal planes will have other colors to provide the color information of the image.
[0081] In the embodiment illustrated in Figure 10, the pupils are spatially separated. That is, the in-coupler for the first waveguide 1010 is spatially separated from the in-coupler for the second waveguide 1020.
Multi FOV Waveguides
[0082] The human visual system senses colors differently across the field of view (FOV) of the eye, because the distribution of pigmented cones varies across the field of view. The design of the optical combiner assembly can take this distribution into account in order to create large fields of view with a lower total color pixel count. For instance, the medium and long cones are found in high concentrations near the fovea. The region outside the fovea is dominated by rods and short cones. In one embodiment, one waveguide could carry green and red light over the field of view that is scanned by the fovea of a rotating eyeball, minimally 30 H by 30 V degrees and up to 70 H by 55 V degrees, and another waveguide could display blue light over a much larger field of view, up to 135 H by 180 V degrees, to create the perception of a FOV of 135 H by 180 V degrees but with fewer overall pixels.
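The pixel-count advantage of that split can be estimated with assumed angular resolutions (30 ppd for the foveal red/green region, 10 ppd for the wide blue field; both values are illustrative, not from the specification):

```python
def pixels(h_deg: int, v_deg: int, ppd: int) -> int:
    """Pixel count over a field of view at a uniform angular resolution."""
    return (h_deg * ppd) * (v_deg * ppd)

full_rgb = 3 * pixels(135, 180, 30)                     # all colors, full FOV
split = 2 * pixels(70, 55, 30) + pixels(135, 180, 10)   # R+G foveal, B wide
print(f"split design needs {split / full_rgb:.1%} of the full-RGB count")
```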
[0083] Figure 11 illustrates one embodiment of a multi-FOV display, in which a red-green waveguide 1120 outputs red and green light 1125 with a smaller field of view than the blue light 1115 output by blue only waveguide 1110. In this configuration, the in-couplers for the color channels are spatially separated. In one embodiment, the resolution of the blue channel is lower than the resolution of the red and green channels.
[0084] Figure 12 illustrates another embodiment of the multi-FOV display in which an RGB waveguide 1210 outputs red, blue, and green light 1215 from three
color display engine 1240, with a larger field of view and lower resolution than a green-only display 1230 through green-only waveguide 1220. In this configuration, the waveguides are different sizes, with the green-only waveguide 1220 being smaller. In one embodiment, the in-couplers are different sizes as well. In one embodiment, the in-coupler for the green light 1225 is smaller than the RGB in-coupler. In one embodiment, the in-coupler for the green-only waveguide 1220 is also smaller than the out-coupler of the RGB waveguide 1210. In various combinations, the relative sizes of the waveguides, in-couplers, and out-couplers may differ between the waveguides.
[0085] Figure 13 is a block diagram of a particular machine that may be used with the present invention. It will be apparent to those of ordinary skill in the art, however, that other alternative systems of various system architectures may also be used.
[0086] The data processing system illustrated in Figure 13 includes a bus or other internal communication means 1340 for communicating information, and a processing unit 1310 coupled to the bus 1340 for processing information. The processing unit 1310 may be a central processing unit (CPU), a digital signal processor (DSP), or another type of processing unit 1310.
[0087] The system further includes, in one embodiment, a random access memory (RAM) or other volatile storage device 1320 (referred to as memory), coupled to bus 1340 for storing information and instructions to be executed by processor 1310. Main memory 1320 may also be used for storing temporary variables or other intermediate information during execution of instructions by processing unit 1310.
[0088] The system also comprises in one embodiment a read only memory (ROM) 1350 and/or static storage device 1350 coupled to bus 1340 for storing static information and instructions for processor 1310. In one embodiment, the system also includes a data storage device 1330 such as a magnetic disk or optical disk and its corresponding disk drive, or Flash memory or other storage which is capable of storing data when no power is supplied to the system. Data storage device 1330 in one embodiment is coupled to bus 1340 for storing information and instructions.
[0089] The system may further be coupled to an output device 1370, such as a cathode ray tube (CRT) or a liquid crystal display (LCD), coupled to bus 1340 through bus 1360 for outputting information. The output device 1370 may be a visual output device, an audio output device, and/or a tactile output device (e.g., vibrations, etc.).
[0090] An input device 1375 may be coupled to the bus 1360. The input device 1375 may be an alphanumeric input device, such as a keyboard including alphanumeric and other keys, for enabling a user to communicate information and command selections to processing unit 1310. An additional user input device 1380 may further be included. One such user input device 1380 is a cursor control device, such as a mouse, a trackball, stylus, cursor direction keys, or touch screen, which may be coupled to bus 1340 through bus 1360 for communicating direction information and command selections to processing unit 1310, and for controlling movement on display device 1370.
[0091] Another device, which may optionally be coupled to computer system 1300, is a network device 1385 for accessing other nodes of a distributed system via a network. The communication device 1385 may include any of a number of commercially available networking peripheral devices such as those used for coupling to an Ethernet, token ring, Internet, or wide area network, personal area network, wireless network or other method of accessing other devices. The communication device 1385 may further be a null-modem connection, or any other mechanism that provides connectivity between the computer system 1300 and the outside world.
[0092] Note that any or all of the components of this system illustrated in Figure 13 and associated hardware may be used in various embodiments of the present invention.
[0093] It will be appreciated by those of ordinary skill in the art that the particular machine that embodies the present invention may be configured in various ways according to the particular implementation. The control logic or software implementing the present invention can be stored in main memory 1320, mass storage device 1330, or other storage medium locally or remotely accessible to processor 1310.
[0094] It will be apparent to those of ordinary skill in the art that the system, method, and process described herein can be implemented as software stored in main memory 1320 or read only memory 1350 and executed by processor 1310. This control logic or software may also be resident on an article of manufacture comprising a computer readable medium having computer readable program code embodied therein, readable by the mass storage device 1330, and causing the processor 1310 to operate in accordance with the methods and teachings herein.
[0095] The present invention may also be embodied in a handheld or portable device containing a subset of the computer hardware components described above. For example, the handheld device may be configured to contain only the bus 1340, the processor 1310, and memory 1350 and/or 1320.
[0096] The handheld device may be configured to include a set of buttons or input signaling components with which a user may select from a set of available options. These could be considered input device #1 1375 or input device #2 1380. The handheld device may also be configured to include an output device 1370 such as a liquid crystal display (LCD) or display element matrix for displaying information to a user of the handheld device. Conventional methods may be used to implement such a handheld device. The implementation of the present invention for such a device would be apparent to one of ordinary skill in the art given the disclosure of the present invention as provided herein.
[0097] The present invention may also be embodied in a special purpose appliance including a subset of the computer hardware components described above, such as a kiosk or a vehicle. For example, the appliance may include a processing unit 1310, a data storage device 1330, a bus 1340, and memory 1320, and no input/output mechanisms, or only rudimentary communications mechanisms, such as a small touch-screen that permits the user to communicate in a basic manner with the device. In general, the more special-purpose the device is, the fewer of the elements need be present for the device to function. In some devices, communications with the user may be through a touch-based screen, or similar mechanism. In one embodiment, the device may not provide any direct input/output signals, but may be configured and accessed through a website or other network-based connection through network device 1385.
[0098] It will be appreciated by those of ordinary skill in the art that any configuration of the particular machine implemented as the computer system may be used according to the particular implementation. The control logic or software implementing the present invention can be stored on any machine-readable medium locally or remotely accessible to processor 1310. A machine-readable medium includes any mechanism for storing information in a form readable by a machine
(e.g. a computer). For example, a machine readable medium includes read-only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory devices, or other storage media which may be used for temporary or permanent data storage. In one embodiment, the control logic may be implemented as transmittable data, such as electrical, optical, acoustical or other forms of propagated signals (e.g. carrier waves, infrared signals, digital signals, etc.).
[0099] The present application describes and illustrates various embodiments of the system. The number of display engines, number of waveguides, and colors adjusted may be varied without departing from the scope of the present invention. Furthermore, the settings of the color channels may include any combination of differences in resolution, field of view, focal distance, and foveation. Additionally, the system may modify the generated blue, red, and/or green channels, to create the difference in the settings between the color channels, without departing from the scope of the invention. Also, the configurations illustrated herein may be mixed and matched. Thus, the system may include one or more waveguides, one or more display engines, and separate the color channels into any combination of one, two and/or three colors, and remain within the scope of the present disclosure.
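The per-channel settings enumerated above (resolution, field of view, focal distance, and foveation) can be sketched as a small data structure. The class, field names, and numeric values below are illustrative assumptions for one possible configuration, not part of the disclosed system.

```python
# Illustrative sketch of per-color-channel settings as described in the
# disclosure: each channel may differ in resolution, field of view, focal
# distance, and whether it is foveated. Names and values are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class ChannelSettings:
    resolution_px: tuple    # (horizontal, vertical) pixel count
    fov_deg: tuple          # (horizontal, vertical) field of view in degrees
    focal_distance_m: float # apparent focal distance of the channel in meters
    foveated: bool          # whether the channel is driven by a foveal display

# One example configuration: a high-resolution, foveated, near-focus green
# channel; a lower-resolution red channel; and a low-resolution, wide-field
# blue channel, echoing several of the embodiments above.
settings = {
    "green": ChannelSettings((2800, 2200), (70.0, 55.0), 0.5, True),
    "red":   ChannelSettings((1400, 1100), (70.0, 55.0), 1.0, False),
    "blue":  ChannelSettings((1350, 1800), (135.0, 180.0), 1.0, False),
}

for name, ch in settings.items():
    print(name, ch.resolution_px, ch.fov_deg, ch.focal_distance_m, ch.foveated)
```

Any combination of these fields may differ between channels, matching the statement that resolution, field of view, focal distance, and foveation may be mixed and matched across one, two, or three colors.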
[00100] In the foregoing specification, the invention has been described with reference to specific exemplary embodiments thereof. It will, however, be evident that various modifications and changes may be made thereto without departing from the broader spirit and scope of the invention as set forth in the appended claims.
The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims
1. A head mounted display system to display an image, the head mounted display system comprising: a display engine to generate light for a display; the system configured to apply color specific settings to a subset of colors of the light, such that the subset of colors has different settings than another portion of the light; and an optical combiner to output the one or more colors of light to generate the image.
2. The head mounted display system of claim 1, wherein the settings comprise a different resolution for the subset of colors.
3. The head mounted display system of claim 2, wherein the subset of colors comprises green light, and the green light has a higher resolution than red light and blue light.
4. The head mounted display system of claim 2, further comprising: the optical combiner to output green light, the green light having a first resolution; and a second optical combiner to output red light, the red light having a second, lower resolution.
5. The head mounted display system of claim 4, further comprising:
a third optical combiner to output blue light, the blue light having a third, lowest resolution.
6. The head mounted display system of claim 4, wherein blue light is passed through the second optical combiner.
7. The head mounted display system of claim 1, wherein the settings comprise having different focal distances for the one or more colors.
8. The head mounted display system of claim 7, wherein green light has a closer focal distance than red light and blue light.
9. The head mounted display system of claim 8, further comprising: the optical combiner to output the green light at a first focal distance; and a second optical combiner to output red light at a second, further focal distance.
10. The head mounted display system of claim 9, wherein blue light is passed through the second optical combiner.
11. The head mounted display system of claim 7, wherein green-blue light is displayed at a first focal distance, and red-green light is displayed at a second focal distance.
12. The head mounted display system of claim 1, wherein the display engine comprises: a first display engine to generate a foveal image with the subset of colors; and a second display engine to generate a field display.
13. The head mounted display system of claim 12, wherein the subset of colors comprises a green color channel, and the field display comprises red, blue, and green channels.
14. A head mounted display system to display an image, the head mounted display system comprising: a display engine to generate a spectrum of light for a display; the system configured to apply a color specific setting to a green color channel; and an optical combiner assembly to output the spectrum of light, wherein the green color channel has a setting different from other color channels.
15. The system of claim 14, wherein the color specific setting comprises resolution, and the green color channel has a higher resolution than the other color channels.
16. The system of claim 14, wherein the color specific setting comprises focal distance, and the green color channel has a nearer focal distance than the other color channels.
17. The system of claim 14, wherein the color specific setting comprises foveating, and the green color channel is displayed with a foveated display.
18. The system of claim 14, wherein the optical combiner assembly comprises one or more waveguides.
19. The system of claim 18, wherein when the optical combiner assembly comprises two or more waveguides, a waveguide guiding the green color channel has higher tolerances than the waveguide for the other color channels.
20. The system of claim 14, further comprising: a display engine to generate a full spectrum of light for a display; the system configured to apply a color specific setting to a green color channel; and an optical combiner assembly to output the full spectrum of light, wherein the green color channel has a setting different from other color channels.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180008255.0A CN114930226A (en) | 2020-01-06 | 2021-01-06 | Head-mounted system with color specific modulation |
KR1020227025158A KR20220120615A (en) | 2020-01-06 | 2021-01-06 | Head-mounted system with color-specific modulation |
EP21738820.6A EP4062225A4 (en) | 2020-01-06 | 2021-01-06 | A head mounted system with color specific modulation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202062957777P | 2020-01-06 | 2020-01-06 | |
US62/957,777 | 2020-01-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021142486A1 true WO2021142486A1 (en) | 2021-07-15 |
Family
ID=76654347
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/070008 WO2021142486A1 (en) | 2020-01-06 | 2021-01-06 | A head mounted system with color specific modulation |
Country Status (5)
Country | Link |
---|---|
US (2) | US11624921B2 (en) |
EP (1) | EP4062225A4 (en) |
KR (1) | KR20220120615A (en) |
CN (1) | CN114930226A (en) |
WO (1) | WO2021142486A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FI20216043A1 (en) * | 2021-10-08 | 2023-04-09 | Dispelix Oy | Waveguide arrangement |
US20230194870A1 (en) * | 2021-12-16 | 2023-06-22 | Google Llc | Tricolor waveguide exit pupil expansion system with optical power |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130100176A1 (en) * | 2011-10-21 | 2013-04-25 | Qualcomm Mems Technologies, Inc. | Systems and methods for optimizing frame rate and resolution for displays |
US20130208003A1 (en) | 2012-02-15 | 2013-08-15 | David D. Bohn | Imaging structure emitter configurations |
US20160026253A1 (en) * | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
US20160327789A1 (en) | 2013-11-27 | 2016-11-10 | Magic Leap, Inc. | Separated pupil optical systems for virtual and augmented reality and methods for displaying images using same |
US20170255766A1 (en) | 2016-03-07 | 2017-09-07 | Magic Leap, Inc. | Blue light adjustment for biometric security |
WO2018175625A1 (en) | 2017-03-22 | 2018-09-27 | Magic Leap, Inc. | Depth based foveated rendering for display systems |
US20190179149A1 (en) | 2017-12-11 | 2019-06-13 | Magic Leap, Inc. | Waveguide illuminator |
US20200271932A1 (en) * | 2019-02-21 | 2020-08-27 | Microsoft Technology Licensing, Llc | Micro led display system |
- 2021-01-06: US application US 17/248,049, patent US 11624921 B2, active
- 2021-01-06: CN application CN 202180008255.0A, publication CN 114930226 A, pending
- 2021-01-06: KR application KR 1020227025158, publication KR 20220120615 A, application discontinued
- 2021-01-06: WO application PCT/US2021/070008, publication WO 2021142486 A1, status unknown
- 2021-01-06: EP application EP 21738820.6A, publication EP 4062225 A4, pending
- 2023-04-06: US application US 18/296,965, patent US 12092828 B2, active
Also Published As
Publication number | Publication date |
---|---|
US20210208407A1 (en) | 2021-07-08 |
KR20220120615A (en) | 2022-08-30 |
US12092828B2 (en) | 2024-09-17 |
CN114930226A (en) | 2022-08-19 |
US20230375839A1 (en) | 2023-11-23 |
EP4062225A4 (en) | 2023-12-27 |
US11624921B2 (en) | 2023-04-11 |
EP4062225A1 (en) | 2022-09-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21738820 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2021738820 Country of ref document: EP Effective date: 20220624 |
ENP | Entry into the national phase |
Ref document number: 20227025158 Country of ref document: KR Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |