US20170108697A1 - Dual-Mode Augmented/Virtual Reality (AR/VR) Near-Eye Wearable Displays - Google Patents
- Publication number
- US20170108697A1 (U.S. application Ser. No. 15/294,447)
- Authority
- US
- United States
- Prior art keywords
- image
- viewer
- mode
- dual
- eye
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0081—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. enlarging, the entrance or exit pupil
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B6/00—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
- G02B6/10—Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings of the optical waveguide type
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03547—Touch pads, in which fingers can move on a surface
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
- G09G5/026—Control of mixing and/or overlay of colours in general
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0147—Head-up displays characterised by optical features comprising a device modifying the resolution of the displayed image
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/18—Diffraction gratings
- G02B5/1842—Gratings for image generation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0339—Touch strips, e.g. orthogonal touch strips to control cursor movement or scrolling; single touch strip to adjust parameter or to implement a row of soft keys
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0666—Adjustment of display parameters for control of colour parameters, e.g. colour temperature
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
Definitions
- the invention relates generally to wearable electronics and more particularly, to a dual-mode augmented/virtual reality near-eye wearable display.
- Wearable optical electronics are becoming commonplace as integrated circuit size, weight and power (SWaP) and cost scale downward. Wearable optical electronics have a wide number of commercial, military and consumer applications. Prior art exists with respect to wearable optical electronics, but none of it addresses the need for a high-resolution, dual-mode augmented/virtual reality near-eye wearable display using curved lenses with a non-planar profile and surface, the lens form that is used almost exclusively in consumer and other applications and is considered fashionable and aesthetically pleasing. The invention disclosed herein addresses the need for, and enables, such a near-eye wearable display.
- FIG. 1 depicts a perspective view of the dual-mode augmented/virtual reality near-eye wearable display of the invention.
- FIG. 2 depicts a top plan view of the dual-mode augmented/virtual reality near-eye wearable display of the invention.
- FIG. 3A depicts an optical lens of the invention.
- FIG. 3B depicts a cross-section of the lens of FIG. 3A .
- FIG. 3C depicts a top view of the lens element of FIG. 3A .
- FIG. 4 depicts the lens of the dual-mode augmented/virtual reality near-eye wearable display of the invention and illustrates the optical waveguide structures of the lens.
- FIG. 5 depicts a perspective view of the dual-mode augmented/virtual reality near-eye wearable display of the invention showing the battery and connector of the temple of the display frame.
- Disclosed herein is a dual-mode augmented/virtual reality near-eye wearable display for use with, but not limited to, curved optical lenses.
- a dual-mode augmented/virtual reality near-eye wearable display may comprise an optical lens comprising a first (scene-facing) surface, a lens thickness, and a lens peripheral edge or surface.
- the first surface may comprise an electro-tinting layer comprising a variably optically transmissive layer disposed between a first and a second electrically conductive transparent thin film layer.
- Each of the first and second conductive transparent thin film layers may be coupled to control circuitry configured to vary an optical transmissivity of the variably optically transmissive layer.
- One or more optical waveguide structures are provided within the lens thickness and may comprise at least one input image aperture and at least one exit aperture, which exit aperture may be divided into a plurality of exit aperture sub-regions.
- One or more image sources are optically coupled to their respective input image apertures.
- the image sources may be disposed on the peripheral (i.e., edge or side) surface of the lens and configured to optically couple a displayed optical image directly from the image source into an input image aperture and then to an exit aperture, or from a plurality of input image apertures to a plurality of respective exit aperture sub-regions.
- the exit aperture's or exit aperture sub-region's optical characteristics are preferably configured to match a predetermined area and predetermined angle of divergence of the respective input image aperture.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the variably optically transmissive layer is comprised of a polymer dispersed liquid crystal (PDLC) material.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures disposed within the lens element are each individually “piecewise flat”.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures that are piecewise flat provide image portions that are collectively combined in a tiled arrangement to define optical lenses having a curved or non-planar surface and profile.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures each are configured to redirect an image that is coupled from its respective input image aperture into its respective exit aperture or exit aperture sub-region.
- Alternatives to the use of optical waveguide structures that are piecewise flat are also disclosed.
- the dual-mode augmented/virtual reality near-eye wearable display may further be provided wherein a plurality of optical waveguide structures collectively define an output eye box of the dual-mode augmented/virtual reality near-eye wearable display.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures each have a dedicated input image aperture and exit aperture sub-region that are coupled to a respective dedicated individual image source.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the image source comprises an emissive micro-scale pixel array comprising of pixels that are individually spatially, chromatically and temporally addressable.
- the dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the plurality of optical waveguide structures each have a dedicated image source coupled into a dedicated input image aperture that is configured to display a portion of a collective image for display to a viewer.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures are each optically configured to relay and magnify an image portion coupled from a separate image source into its corresponding exit aperture sub-region of the dual-mode augmented/virtual reality near-eye wearable display.
- the dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the waveguide structure is in optical or electronic communication with an image detection sensor that is configured to track a position of a viewer's eye or eyes.
- the dual-mode augmented/virtual reality near-eye wearable display may be provided wherein at least one thin film layer is comprised of an indium tin oxide material.
- the dual-mode augmented/virtual reality near-eye wearable display may further comprise processing circuitry configured to sense when a viewer recognizes a displayed image and to supplement or modify the recognized and displayed image with predetermined image data or to modify or supplement some or all of the displayed scene in the viewer's field of view.
- the dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the optical waveguide structure includes a micro-imprinted facet structure as a waveguide layer.
- the optical waveguide layer may comprise a micro-imprinted facet structure.
- the micro-imprinted facet structure may comprise a surface relief optical element or a volume relief diffractive waveguide.
- the micro-imprinted facet structure may comprise a diffractive grating waveguide, a blazed grating waveguide, a multi-level grating waveguide or a Bragg grating waveguide.
- the near-eye wearable display 1 of the invention is preferably configured as a conventional-looking eyewear frame and lens assembly having at least one optical lens 5 .
- the lenses 5 may comprise non-planar surfaces or piecewise planar surfaces and be configured to operate in an augmented reality (AR), a virtual reality (VR) or a hybrid AR/VR mode.
- lenses 5 comprise a lens thickness 5 ′ and a lens peripheral or edge surface 5 ′′.
- the front side, scene-facing surfaces 10 of lenses 5 of the disclosed dual-mode AR/VR near-eye wearable display 1 may be provided with an electro-tinting layer 15 .
- Electro-tinting layer 15 may comprise multiple thin-film layers 20 designed to electrically control the transmissivity (or tinting level) through lenses 5 .
- Multiple thin-film layers 20 may comprise at least one variably optically transmissive layer 25 of a variably optically transmissive material such as a polymer-dispersed liquid crystal (PDLC) material or equivalent suitable material that is sandwiched between thin-film layers 20 .
- Thin film layers 20 may comprise an electrically conductive, optically transparent material such as indium tin oxide (ITO).
- the thin-film layers 20 on opposing sides of the variably optically transmissive layer 25 are preferably electrically isolated and separately electrically coupled to appropriate control circuitry to enable multi-level or continuously variable control of the effective transmissivity of each lens 5 , and are capable of being varied from transparent or clear to non-transparent or dark.
- electro-tinting layers 15 of lenses 5 are designed to permit the coupling of viewer-defined multi-voltage level electrical signals through transparent, electrically-conductive ITO thin-film layers 20 to control the crystal alignment of the variably optically transmissive PDLC layer 25 and thus permit the tint level of lenses 5 to be controllably variable from clear to dark across a discrete or continuous range of tinting levels.
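The multi-level tint control described above can be sketched as a simple mapping from a requested tint level to a PDLC drive voltage. This is a minimal illustrative sketch: the function name, voltage range, and linear response are assumptions not taken from the patent (real PDLC cells are typically driven with AC waveforms and have a nonlinear voltage-transmissivity curve):

```python
def pdlc_drive_voltage(tint_level, v_clear=5.0, v_dark=0.0):
    """Map a requested tint level (0.0 = clear, 1.0 = dark) to a PDLC
    drive voltage. A PDLC film scatters light when unpowered and clears
    as the applied field aligns its liquid-crystal droplets, so a darker
    tint corresponds to a lower drive voltage. The linear mapping here
    is an illustrative assumption."""
    if not 0.0 <= tint_level <= 1.0:
        raise ValueError("tint level must be in [0, 1]")
    return v_dark + (v_clear - v_dark) * (1.0 - tint_level)
```

A controller would quantize `tint_level` into the discrete or continuous range of tinting levels the display exposes to the viewer.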
- back side, viewer facing surfaces 30 of lenses 5 are provided with one or more optical waveguide structures 40 ( FIG. 4 ).
- back-side surface 30 may be provided with an optical thin-film layer of a polymer comprised of a plurality of waveguide layers 50 that are disposed within lens thickness 5 ′ each defining a respective exit aperture.
- the waveguide layers 50 may be provided as a plurality of micro-imprinted facets or equivalent optical structures. These structures permit light received into the waveguide structures 40 located proximal to peripheral surface 5 ′′ (preferably outside the viewer's eye pupil viewing region) of each of lenses 5 to be totally internally reflected (TIR), or “wave-guided”, through a respective portion of the lens thickness 5 ′ to a predetermined exit aperture sub-region 45 ′ defined by the respective waveguide layer 50 and located within the viewing region of the viewer's eye pupil, as depicted in FIG. 4 .
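The TIR condition that the waveguide layers rely on follows directly from Snell's law. A small helper (hypothetical name; angles measured from the surface normal, which is the standard optics convention) gives the critical angle for a typical lens material against air:

```python
import math

def critical_angle_deg(n_core, n_clad=1.0):
    """Critical angle, measured from the surface normal, beyond which
    light inside a medium of index n_core is totally internally
    reflected at its interface with a medium of index n_clad."""
    if n_core <= n_clad:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# A typical ophthalmic polymer (n ~ 1.5) against air: ~41.8 degrees.
```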
- the waveguide layers 50 generally span the entire respective exit aperture sub-region 45 ′ except for a boundary region adjacent the periphery region of the respective lens 5 so as to enable the creation of a collective image with no gaps or dead regions as a result of the tiling of the individual image portions of each aperture sub-region 45 ′.
- Waveguide structures 40 may be fabricated, for instance, using either surface relief or volume relief diffractive optical structures (DOC) within lens thickness 5 ′ and may be provided as, for example, a diffractive grating, blazed grating, multi-level or Bragg grating or equivalent structure as is known in the optical arts.
- Waveguide layers 50 may be designed to diffract broadband light preferably covering the visible light spectrum.
- Waveguide layers 50 are preferably designed to optically couple light that is emitted from an image source or sources 55 into each lens 5 and to the viewer's eye pupil region.
- the waveguide structures together with an appropriate micro-lens array forming part of the image sources 55 are configured to appropriately optically magnify and redirect the image coupled into each lens sub-region 45 ′.
- the micro-imprinted facets on the reflective prism-like grating elements can be provided with separate, viewer-defined facet angles that are different from one another, e.g. which become progressively larger or smaller, over the waveguide layer 50 itself to redirect the light at the exit aperture defining the respective image portion to converge toward the viewer's eye.
- the dual-mode AR/VR near-eye wearable display 1 of the invention may further comprise at least one image source 55 directly optically coupled to a respective waveguide structure 40 of each of lenses 5 whereby each image source 55 is capable of generating and outputting a digital optical image portion comprising a 2D array of multi-color pixels.
- each image source 55 provides an image portion to a respective input aperture 40 , to be presented at the exit aperture of a respective exit aperture sub-region 45 ′ of the respective lens 5 . Each image portion fills its exit aperture sub-region 45 ′ except for a small portion at the outer edge of the lenses 5 , so that the tiling of the respective image portions provides a single combined image in each lens.
- Image sources 55 that are optically coupled to lenses 5 may be provided with the ability to modulate either single-view images or multi-view light field images in the dual-mode augmented/virtual reality near-eye wearable display.
- the image sources 55 that are optically coupled to lenses 5 are preferably sufficiently compact to be coupled to lenses 5 without obstructing the dual-mode AR/VR wearable display viewer's field of view.
- Image sources 55 are provided to enable the requisite compactness of a wearable display by, in a preferred embodiment, being of the emissive type (as opposed to “back-lit” or “transmissive” image sources) and are capable of generating an image that substantially matches the display area and required angle of divergence of input image aperture 40 of lenses 5 .
- Emissive imagers may be optically coupled directly from their emissive surfaces through the micro-lens array of the emissive imagers without the need for bulky optical interface or relay elements that undesirably obstruct a viewer's field of view.
- the image sources 55 that are optically coupled to lenses 5 may be provided, for example, from a class of emissive display devices called Quantum Photonic Imagers (“QPITM”, a trademark of Ostendo Technologies, Inc.) described in, for instance, U.S. Pat. Nos. 7,623,560; 7,829,902; 8,567,960; 7,767,479; 8,049,231; and 8,243,770, which are the subject of multiple patents and patent applications assigned to Ostendo Technologies, Inc., assignee of the instant application.
- Exemplary emissive display elements suitable for use as image sources 55 with the instant invention include, without limitation, light field emissive display devices as taught in, for instance U.S. Pat. Nos. 9,195,053; 8,854,724 and 8,928,969, each entitled “Spatio-temporal Directional Light Modulator” or emissive display elements taught in U.S. Pat. Nos. 7,623,560; 7,829,902; 8,567,960; 7,767,479; 8,049,231; and 8,243,770; each entitled “Quantum Photonic Imagers And Methods Of Fabrication Thereof”; each assigned to Applicant herein and the entire contents of each of which are incorporated herein by reference.
- image source or image sources as used herein encompasses any optoelectronic device that comprises an array of emissive micro-scale solid-state light-emitting (SSL) pixels of a suitable size.
- the SSL light-emitting pixels of such devices may be either a light-emitting diode (LED) or laser diode (LD) structure or any solid state light-emitting (preferably multicolor) structure whose on-off state is controlled by drive circuitry, and alternatively may comprise, as an example, an image source 55 comprising an OLED imager device.
- the pixels within the emissive micro-scale array of the image sources of the above-referenced U.S. patents are beneficially provided as individually addressable, spatially, chromatically and temporally, through associated drive CMOS circuitry, enabling such image sources to emit light that is modulated spatially, chromatically and temporally.
- the multiple colors emitted by the image sources disclosed in the above-referenced patents desirably share the same pixel aperture.
- the pixel apertures emit multi-colored and collimated (or non-Lambertian) light with an angle of divergence ranging from about ±5° to about ±45°.
- the size of the pixels comprising the emissive array of the image sources of the above-referenced patents is typically in the range of approximately 5-20 microns, with a typical emissive surface area of the image sources being in the range of approximately 15-150 square millimeters.
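The pixel-pitch and emissive-area figures above determine the achievable pixel counts. A quick sketch of the arithmetic (hypothetical helper; square pixels at a uniform pitch are assumed):

```python
def emissive_pixel_count(area_mm2, pixel_pitch_um):
    """Approximate number of pixels in an emissive array of the given
    active area, assuming square pixels at the given pitch (microns)."""
    pixels_per_mm2 = (1000.0 / pixel_pitch_um) ** 2
    return int(area_mm2 * pixels_per_mm2)

# 10-micron pixels over a 50 mm^2 emissive surface -> 500,000 pixels.
```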
- the image sources that are the subject of the above patents are provided with a minimal gap or boundary between its emissive pixel array and the physical edge of the device, enabling a multiplicity of image source devices to be “tiled” to create a viewer-defined arbitrary size display area.
- the image sources 55 that are optically coupled to the lenses 5 of the invention are capable of generating video images with a brightness that is digitally controllable within a range that extends preferably from 1-15 lumens, preferably at a minimal power consumption so as to enable practical integration within the compact configuration of the disclosed dual-mode AR/VR wearable display 1 .
- the image sources' 55 controllable brightness level enables generating the brightness level appropriate to each of the multiple operational modes of dual-mode AR/VR wearable display 1 .
- the image sources 55 that are optically coupled to lenses 5 may be configured to generate an image size and shape (in terms of the number and boundary of the pixels being modulated and coupled into input image aperture 40 ) that can be digitally controlled whereby the controllable image size and shape are used to couple an image with a variably-controlled size and shape into exit aperture 45 or exit aperture sub-regions 45 ′ of the lenses 5 .
- the image sources 55 that are optically coupled to the lenses 5 preferably comprise at least one image source 55 dedicated to each lens 5 as described above or a plurality of image sources 55 coupled into multiple waveguide structures of each lens 5 whereby each image source 55 is coupled to a different sub-region 45 ′ as is depicted in FIG. 4 .
- because each image source 55 is effectively coupled to a separate, dedicated exit aperture sub-region 45 ′, the waveguide flatness condition (typically required to sustain the TIR waveguiding condition) need hold only across a small portion of lens 5 within lens thickness 5 ′, thus requiring lens 5 to be only “piecewise flat” over the individual exit aperture sub-regions 45 ′.
- This, in turn, enables the use of overall curved lenses 5 having a non-planar surface and a curved cross-sectional profile.
- the ability to provide lenses 5 as “piecewise flat” enables the use of curved-shaped lenses rather than substantially planar lenses required when typical waveguide optics are used.
- the piecewise flat portions of a curved lens allow the use of a more aesthetically-appealing eyeglass lens shape and a streamlined look for the dual-mode AR/VR near-eye wearable display 1 of the invention.
- in the dual-mode AR/VR near-eye wearable display 1 , it might be possible to directly project the images from the image sources 55 onto the waveguide layer 50 .
- because total internal reflection only requires the angle of incidence of the light at the internal surface (measured from the surface normal) to exceed the critical angle, the number of internal reflections will normally not be large, such as in the range of one to three, and the curvature of a lens 5 need not be large to achieve the desired aesthetic effect, it may be possible to use a continuously curved lens 5 rather than a piecewise flat lens 5 .
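The TIR condition referenced above is set by the refractive indices of the waveguide core and its surroundings. A minimal sketch of the relationship (the material indices are illustrative assumptions, not values from this disclosure):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Angle of incidence (from the surface normal) beyond which light
    striking the core/cladding interface is totally internally reflected."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# Illustrative values: a polycarbonate-like lens material (n ~ 1.59) in air
theta_c = critical_angle_deg(1.59, 1.0)   # roughly 39 degrees
```

Light striking the internal surface at more than theta_c from the normal stays guided, which is why only a handful of reflections are needed over the short path from an input image aperture to its exit aperture sub-region.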
- where the image portion displayed to a viewer would be distorted, the image portion could be oppositely pre-distorted, such as by an appropriate micro-lens layer of the image sources 55 , and/or corrected electronically to remove that distortion.
- total internal reflection, if used, is only needed adjacent the edges of each lens 5 . Otherwise the lenses 5 may be gently and continuously curved like normal glasses, with the waveguide layer 50 changing accordingly, and if desired, the edges of the eyeglass frames could be covered by an overhanging edge portion so only the continuously curved portion would normally be visible.
- coupling each image source 55 to a different and dedicated exit aperture sub-region 45 ′ further allows the waveguide optical paths from the plurality of image sources 55 to the exit aperture sub-regions 45 ′ to have light rays that converge upon each of the viewer's eye pupils from different directions.
- the use of a plurality of image sources 55 coupled into multiple input image apertures 40 with each being coupled to a different exit aperture sub-region 45 ′ and the respective waveguide optical paths from the plurality of respective image sources 55 through the plurality of respective exit aperture sub-regions 45 ′ causes light emitted from different image sources 55 to converge upon each of the viewer's eye pupils from different directions with the image sources 55 associated with each exit aperture sub-region 45 ′ preferably modulating a different perspective view and enabling the dual-mode AR/VR near-eye wearable display 1 to display a multi-view light field scene.
- the use of a plurality of multi-view light field image sources 55 coupled into multiple waveguide structures 40 with each being coupled to a different sub-region 45 ′ and the waveguide optical path from the plurality of image sources 55 through respective exit aperture sub-regions 45 ′ causes the multi-view light field emitted from different image sources 55 to converge upon each of the viewer's eye pupils from different directions with the image sources 55 associated with each exit aperture sub-regions 45 ′ modulating a different multi-view perspective.
- this enables the dual-mode AR/VR near-eye wearable display 1 to modulate a fine (small) angular (pitch) resolution light field over a wide field of view (FOV), whereby the coarse directional modulation (for example, 15° angular separation between chief rays within the total FOV) is accomplished by the plurality of image sources' 55 chief ray angles of convergence into the viewer's eyes, and the fine directional modulation of the light field (for example, 0.5° angular separation between views within the sub-region FOV) is accomplished by each image source 55 modulating a set of different perspectives separated by the fine angular separation pitch within its respective exit aperture sub-region 45′ directions.
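As a worked example of this coarse/fine split, using the angular pitches quoted above (15° between chief rays, 0.5° between views within a sub-region) and an assumed 60° total FOV (the total FOV is an illustrative assumption):

```python
def light_field_directions(total_fov_deg, coarse_pitch_deg, fine_pitch_deg):
    """Count view directions in a two-level light field: coarse directions
    come from the chief rays of the separate exit aperture sub-regions,
    fine directions from the perspectives each image source modulates
    within its own sub-region."""
    coarse = round(total_fov_deg / coarse_pitch_deg)
    fine_per_region = round(coarse_pitch_deg / fine_pitch_deg)
    return coarse, fine_per_region, coarse * fine_per_region

coarse, fine, total = light_field_directions(60, 15, 0.5)
# 4 coarse chief-ray directions, each carrying 30 fine views: 120 directions
```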
- the use of a plurality of image sources 55 coupled into multiple waveguide structures 40 enables increasing the display resolution (in terms of the number of pixels being displayed to the viewer) either by increasing the number of image sources 55 optically coupled to each of the display lenses 5 (for example and not by way of limitation, using eight image sources 55 , each having 125,000 10-micron pixels, to enable one million pixels per eye) or by decreasing the image sources' 55 pixel size (for example and not by way of limitation, using eight image sources 55 of the same physical size as the above example but each having 500,000 five-micron pixels, to enable the display of four million pixels per eye).
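The pixel-count arithmetic above can be checked directly; the per-source emissive area implied by the quoted figures (125,000 ten-micron pixels) is 12.5 mm²:

```python
def pixels_per_eye(num_sources, source_area_mm2, pixel_pitch_um):
    """Pixels per eye for tiled image sources of a fixed emissive area;
    halving the pixel pitch quadruples each source's pixel count."""
    pixels_per_source = (source_area_mm2 * 1e6) / (pixel_pitch_um ** 2)
    return round(num_sources * pixels_per_source)

# Eight 12.5 mm^2 sources: 10-micron pixels give 125,000 pixels per source,
# one million per eye; 5-micron pixels give 500,000 per source, four million.
assert pixels_per_eye(8, 12.5, 10) == 1_000_000
assert pixels_per_eye(8, 12.5, 5) == 4_000_000
```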
- Image sources 55 having electronically controllable image size and shape may be used to generate the appropriate image size and shape that matches the various operational modes of the dual-mode AR/VR wearable display 1 and optical image distortions.
- the dual-mode AR/VR near-eye wearable display 1 may comprise at least one eye tracking sensor 65 per eye, the output of eye tracking sensor 65 being configured to detect multiple predetermined parameters of the viewer's eyes, including but not limited to the angular position (or look angle) of each eye, the iris diameter, and the distance between the two pupils.
- Eye tracking sensors 65 may comprise a plurality of image detection sensors, such as a CMOS detector array device, that are coupled to an input image aperture 40 of each of the lenses 5 whereby each eye tracking sensor 65 is positioned in close proximity to the image source 55 to take advantage of the optical transfer function of the optical waveguide structure 40 of each lens 5 .
- This enables each lens' 5 optical waveguide structure 40 to serve two functions: first, as an optical path from the plurality of image sources 55 to the waveguide layer and from there to each eye; and second, as a reverse optical path from each eye to the one or more image detection eye tracking sensors 65 .
- the multiple images captured by the plurality of image detection eye tracking sensors 65 may be blended (or fused) together to form captured images of each pupil and to also form an image of the display exit aperture 45 or exit aperture sub-region 45 ′ to be used to infer the color and brightness uniformity across multiple exit aperture sub-regions 45 ′.
- Eye tracking sensors 65 may be utilized to detect the brightness and color uniformity across multiple display exit aperture sub-regions 45 ′ whereby the images captured by the eye tracking sensor(s) 65 are analyzed to determine the brightness and color of each of the display exit aperture sub-regions 45 ′. Then the determined values are compared and the brightness and/or color of the plurality of image sources 55 that are coupled into multiple waveguide structures 40 may be adjusted accordingly to cause the color and brightness across the entire set of exit aperture sub-regions 45 ′ to become uniform within a given, viewer-defined threshold, for example 10%.
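The uniformity adjustment described above amounts to a simple closed-loop gain correction across sub-regions; a minimal sketch (the measured values and the 10% threshold are illustrative):

```python
def uniformity_gains(measured_brightness, threshold=0.10):
    """Per-sub-region drive gains that pull all exit aperture sub-regions
    toward the mean brightness; a gain of 1.0 is kept where a region is
    already within the viewer-defined threshold of the mean."""
    mean = sum(measured_brightness) / len(measured_brightness)
    gains = []
    for b in measured_brightness:
        if abs(b - mean) / mean <= threshold:
            gains.append(1.0)          # already uniform enough
        else:
            gains.append(mean / b)     # scale drive level toward the mean
    return gains

# Example: four sub-regions, one noticeably dim and one noticeably bright
gains = uniformity_gains([100.0, 98.0, 80.0, 122.0])
```

The resulting gains would be applied to the brightness control signals of the image sources 55 feeding the corresponding sub-regions.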
- the eye parameter outputs of eye tracking sensors 65 may subsequently be utilized to adjust the display parameters for each eye by adjusting the parameters of the plurality of multi-view light field image sources 55 that are coupled into multiple input image apertures 40 of each lens 5 ; for example, adjusting the display resolution to its highest level within a region of 1° to 2° around the eye-look direction, selecting the light field compression reference holographic elements (hogels) at the depth inferred from the detected eye parameters, adjusting the depth of the synthesized holographic 3D image to match the depth where the eye is focused, adjusting the brightness or color within the 1° to 2° eye-look region, or blurring, reducing and/or adjusting the perspective, resolution, brightness and/or color within the image region outside the 1° to 2° eye-look region.
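The gaze-dependent adjustments above amount to foveated rendering: full fidelity within the roughly 1° to 2° eye-look region and degraded fidelity outside it. A hedged two-zone sketch (the scale factors are illustrative assumptions, not values from this disclosure):

```python
def region_params(angle_from_gaze_deg, foveal_radius_deg=2.0):
    """Per-region display parameters as a function of angular distance
    from the tracked eye-look direction (a simple two-zone model)."""
    if angle_from_gaze_deg <= foveal_radius_deg:
        return {"resolution_scale": 1.0, "brightness_scale": 1.0}
    # outside the foveal region: reduced resolution and brightness
    return {"resolution_scale": 0.25, "brightness_scale": 0.5}

assert region_params(1.5)["resolution_scale"] == 1.0
assert region_params(10.0)["resolution_scale"] == 0.25
```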
- the image sources 55 and one or more eye tracking sensors 65 configured to perform an image uniformity function that are optically coupled to the lenses 5 may be electrically coupled to an interface control and processing element (ICPE) configured as a compact printed circuit, preferably integrated within the glasses' frame temple assembly 75 of the dual-mode AR/VR wearable display 1 such as is illustrated in FIGS. 1 and 2 .
- the ICPE normally would operate under program control.
- the electrical coupling from the dual-mode AR/VR wearable display interface, control and processing element (ICPE) to the image sources 55 may incorporate, for instance, digital video image input signals, brightness control and image size and shape control signals.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise both a wireless and wired interface in the glasses' frame temple assembly 75 and connectivity capabilities that enable the dual-mode AR/VR wearable display 1 to interface and be connected either wirelessly or by wire to an image storage source or a control host processor and/or server such as is seen in FIG. 2 .
- the image processing capabilities required for the processing feedback input from eye tracking sensors 65 may be implemented within the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 .
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise the capability of synchronizing the images being displayed to both eyes, both in the perspective as well as the temporal aspects.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise tilt and orientation sensors 80 preferably implemented using micro-scale gyros and accelerometers to enable sensing of the dual-mode AR/VR wearable display 1 tilt and orientation (head tracking capabilities) as depicted in FIG. 2 .
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise one or more ambient light sensors 85 to enable sensing the brightness of the ambient light environment of the dual-mode AR/VR wearable display.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise the interface capability to output the sensed ambient light, tilt and orientation of the dual-mode AR/VR wearable display 1 (ambient light, tilt and orientation sensors output data) to the connected image source 55 and a control host processor and/or server.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise a power converter circuit and power management circuitry 90 that is used to convert, regulate and manage the input power provided to the dual-mode AR/VR wearable display 1 .
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise a battery pack as part of the power management circuitry that is coupled to the power converter and power management circuits to enable an autonomous (or unplugged) operational mode.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise an input power interface that is coupled to power converter and power management circuitry 90 to enable a plugged operational mode.
- the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1 may further comprise a compact input connector to enable power, data and control interfaces to the dual-mode AR/VR wearable display that are preferably located at the terminal portion of at least one of the wearable display frame's temple assemblies 95 of the dual-mode AR/VR wearable display 1 as depicted in FIG. 5 .
- the dual-mode AR/VR wearable display assembly 1 may be curved to match the viewer's frontal head profile, with the temple assembly 75 and lens frame being extended in the vertical axis to sufficiently minimize leakage of excessive ambient light within the viewing region of the dual-mode AR/VR wearable display 1 .
- the dual-mode AR/VR wearable display 1 may be configured to operate in either a virtual reality (VR) mode, an augmented reality (AR) mode or a hybrid AR/VR mode, as commanded either by the display viewer (by touching the display temple or by voice command) or by a command embedded within the interface, control and processing element data input from the image source host processor and/or server.
- the tinting of lenses 5 of the dual-mode AR/VR wearable display 1 may be increased to a maximum (or the transmissivity reduced to a minimum) by appropriately setting the level of the electrical signal coupled into the electro-tinting layers 15 of lenses 5 , thus allowing the lens image sources' 55 output image brightness to be reduced to match the VR brightness level set forth as a preference by the display viewer.
- the dual-mode AR/VR wearable display 1 may provide the sensed tilt and orientation data to the image source for the latter to provide the dual-mode AR/VR wearable display 1 with the appropriately generated VR images depending on the viewer head tilt and orientation. Particularly in the AR mode, tilting or changing the position of the viewer's head will tilt or change the apparent position of the augmented images and not the real images, if not electronically corrected responsive to the tilt and orientation data.
- the lens tinting may be reduced to the minimum level defined by the display viewer when the viewer commands the display to do so (by either touching a touch sensor located on the outside surface of the display arm, the display temple or by voice command) or when the sensed tilt and orientation data indicates the viewer's head as being outside a default viewing volume (or box) set forth by the viewer as a preference.
- This enables the display viewer to safely move about while still wearing the dual-mode AR/VR wearable display 1 by setting a certain physical viewing box outside of which the display lens tinting is reduced to allow the display viewer to move about safely.
- the dual-mode AR/VR wearable display 1 may be reconfigurable to display either 2D or 3D light field images whereby the tilt and orientation data sensed by the display is used by the image source device to expand the displayable field of view, in both the 2D and 3D modes of operations, or enables a full-parallax 3D viewing experience in the 3D mode of operation.
- the tinting of lenses 5 of the dual-mode AR/VR wearable display 1 may be reduced to a desired viewing level that matches the ambient light level sensed by the display by appropriately setting the level of the electrical signal coupled into tinting layers 15 of the lenses using an ambient light sensor 85 .
- the displayed image's brightness may be adjusted by changing the lens image sources' 55 output image brightness to match the sensed ambient light level and the brightness level set forth as a preference by the display viewer.
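One way the sensed ambient level could drive both the electro-tinting signal and the image-source brightness; a sketch with assumed sensor and drive ranges (none of the numeric ranges come from this disclosure):

```python
def match_ambient(ambient_lux, max_lux=10000.0,
                  min_transmissivity=0.05, max_transmissivity=0.9):
    """Map an ambient light reading to a lens transmissivity setting and a
    relative image-source brightness so the displayed image keeps contrast.
    Brighter surroundings -> darker tint, brighter image source."""
    level = min(max(ambient_lux / max_lux, 0.0), 1.0)   # normalize to 0..1
    transmissivity = max_transmissivity - level * (max_transmissivity - min_transmissivity)
    brightness = 0.2 + 0.8 * level                      # 20%..100% drive
    return transmissivity, brightness

t_dim, b_dim = match_ambient(100.0)       # dim room
t_sun, b_sun = match_ambient(20000.0)     # bright daylight, clamps to max
```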
- the dual-mode AR/VR wearable display 1 may provide tilt and orientation data to the image sources 55 through the ICPE for the latter to provide the dual-mode AR/VR wearable display 1 with the appropriately-generated AR images depending on the viewer head tilt and orientation. Since in the AR mode, both the image visible through the lenses 5 (the real world) and the augmented image, computer generated or otherwise, are visible to a viewer in a coordinated, overlaid manner, if not compensated, the tilt and orientation of the viewer's head would disturb the relative positions of the two images.
- the image displayed at any one time can be considered part of a larger augmented image, such as may be stored in memory or generated on demand as needed (or both); in effect, head movement is corrected by merely moving or twisting the viewable area around that larger image in a compensating manner. Consequently, while the physical size of any one image portion displayed in any sub-region 45 ′ won't change, the part of the larger image that is displayed, and how it is displayed in any one sub-region 45 ′, is changed by the ICPE with head movement to maintain the alignment of the real and augmented images. This feature may also be valuable in the VR mode to fix the spatial position of the viewable image and add to the viewer's VR experience.
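The head-movement compensation described above can be modeled as sliding a viewport over the larger stored image; a minimal sketch (the linear yaw/pitch-to-pixel mapping, sign conventions, and sizes are illustrative assumptions):

```python
def viewport_origin(yaw_deg, pitch_deg, px_per_deg=20,
                    full_size=(4000, 3000), view_size=(1000, 800)):
    """Top-left corner of the sub-image to display so the augmented image
    stays registered with the real scene as the head turns."""
    fx, fy = full_size
    vx, vy = view_size
    cx = (fx - vx) // 2 + int(yaw_deg * px_per_deg)    # pan with yaw
    cy = (fy - vy) // 2 - int(pitch_deg * px_per_deg)  # pan with pitch
    # clamp so the viewport stays inside the stored image
    cx = min(max(cx, 0), fx - vx)
    cy = min(max(cy, 0), fy - vy)
    return cx, cy

assert viewport_origin(0, 0) == (1500, 1100)   # centered at rest
assert viewport_origin(10, 0) == (1700, 1100)  # 10 deg of yaw -> 200 px pan
```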
- the image displayed in any sub-region of a lens 5 by an image source will have one or more image source pixels that are not used (black or off) around the edges. This allows accurate electronic sizing, location and angular positioning of the image displayed in the sub-region, in pixel increments, to avoid unreasonable mechanical alignment requirements.
- the lens tinting may be increased to the level set by the display viewer when the viewer commands the display to do so by either touching a touch sensor 100 located on the outside surface of the display temple or by voice command, or when the sensed ambient light data indicates the ambient light has increased to a point that would reduce contrast of the displayed image.
- the dual-mode AR/VR wearable display 1 can display either 2D or 3D light field images whereby the tilt and orientation data sensed by the display may be used by the image source 55 to expand the displayable field of view, in both the 2D and 3D modes of operations, or enable full-parallax 3D viewing experience in the 3D mode of operation.
- the operational mode may be controlled by the display content which contains embedded mode control command data packets that, depending on the content of the scene being displayed and/or where the viewer's eye is directed and focused, causes the dual-mode AR/VR wearable display 1 to emphasize certain objects or focal planes within the displayed image by modifying the tinting levels and/or the contrast level of such scene objects or focal depth.
- the dual-mode AR/VR wearable display 1 may comprise both touch and voice control capabilities that are used, as explained above, to switch between the AR and VR modes of operation and to control various operational parameters of each mode.
- the touch control capabilities may be implemented as a touch sensor 100 integrated on the temple assembly outside casing (or enclosure).
- Touch sensor 100 may be designed to respond to single touch, multiple touches or touch and drag type of commands.
- by default, the exemplary right-side touch sensor 100 is a drag-to-control sensor for the display lens tinting level, and the left-side touch sensor 100 is a drag-to-control sensor for the display brightness level.
- in the VR mode, a single touch to either side may change the display lenses' tinting to allow the display viewer to safely move about. Multiple touches may be used to change the default control of the touch sensor based on a programmable menu that allows the display viewer to set and change each operational mode parameter to match their needs.
- Voice control capabilities may be provided to enable the display viewer to control the display mode such as AR versus VR or hybrid AR/VR and the display operational parameters such as brightness or image size.
- the dual-mode AR/VR wearable display 1 enables an interface with a hand gesture sensor 105 that allows the display viewer to control the display mode such as AR versus VR or hybrid AR/VR and the display operational parameters such as brightness or image size and to also control and/or select the display contents using soft buttons or icons that may be added or removed from the viewer display area by either hand, voice or touch gestures.
- the dual-mode AR/VR wearable display 1 may further comprise at least one “reality” sensor 110 (preferably a light field camera) that preferably captures ambient light field content and couples the captured images to the interface, control and processing element (ICPE) which then blends or fuses the images being displayed to fit and optically match the reality perspective being viewed in the AR mode or to integrate the images captured by the reality sensors into the displayed content in the VR mode.
- the dual-mode AR/VR wearable display 1 may further comprise the capabilities to accept input image data or video data in a compressed format (such as MPEG or JPEG for example) and either first decompress the input images, then display them to the viewer, or directly display them to the viewer using Visual Decompression techniques as discussed below to reduce the decompression processing and memory requirements and reduce power consumption.
- the plurality of image sources 55 of the dual-mode AR/VR wearable display 1 may further comprise Visual Decompression capabilities to modulate images using high order bases of (n×n) pixels (instead of the standard one-pixel modulation basis), then modulating the coefficients of a commensurate discrete wavelet transform (DWT) or discrete cosine transform (DCT) representation of the image (which are typically the coefficients used by MPEG and JPEG compression techniques), thus enabling the dual-mode AR/VR wearable display 1 to modulate images using the compressed image data directly.
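To make the idea concrete, modulating n×n bases weighted by DCT coefficients is mathematically equivalent to reconstructing a pixel block directly from the coefficient representation JPEG and MPEG already use. An illustrative numpy sketch (not the patented implementation):

```python
import numpy as np

def dct_basis(n=8):
    """Orthonormal DCT-II basis of size n x n (rows are 1-D basis vectors)."""
    k = np.arange(n)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def block_from_coefficients(coeffs):
    """Reconstruct an n x n pixel block as a weighted sum of basis images,
    the way a display modulating (n x n) bases would render JPEG/MPEG-style
    coefficients without a separate full decompression pass."""
    c = dct_basis(coeffs.shape[0])
    return c.T @ coeffs @ c   # inverse 2-D DCT

# Round trip: forward 2-D DCT of a block, then reconstruct it
rng = np.random.default_rng(0)
block = rng.random((8, 8))
c = dct_basis(8)
coeffs = c @ block @ c.T
assert np.allclose(block_from_coefficients(coeffs), block)
```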
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to accept input images or videos compressed using light field compression techniques and formats and applying compressed light field rendering in order to decompress and synthesize the light field to be displayed from a set of compressed reference holographic elements (hogels) in order to reduce image interface bandwidth, decompression processing and memory requirements and to reduce power consumption.
- the dual-mode AR/VR wearable display 1 may further comprise the capability of interfacing with a cloud server 115 and to query that server to download a selected set of compressed light field holographic elements (hogels) based on the detected viewer's eyes and head position and orientation, then to accept from the server a set of requested light field holographic elements (hogels), then applying compressed light field rendering in order to decompress and synthesize the light field to be displayed from a set of compressed reference holographic elements (hogels).
- This beneficially further reduces image interface bandwidth, as well as decompression processing and memory requirements and power consumption.
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to interface with a cloud server 115 and query the server to download a selected set of compressed light field holographic elements (hogels), herein referred to as reference hogels, based on the detected viewer's eyes focus depth or distance, then to accept from that server a set of requested reference light field hogels, then applying compressed light field rendering in order to decompress and synthesize the light field to be displayed from a set of compressed reference hogels in order to further reduce the image interface bandwidth, as well as decompression processing and memory requirements and reduce power consumption.
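A hedged sketch of the selection step in the loop above, picking which reference hogels to request based on the detected focus depth; the hogel depth planes, the nearest-depth rule, and the request size are all illustrative assumptions:

```python
def select_reference_hogels(gaze_depth_m, hogel_depths_m, count=4):
    """Indices of the compressed reference hogels whose depth planes are
    nearest the depth at which the viewer's eyes are focused, so only
    those need to be fetched from the cloud server."""
    ranked = sorted(range(len(hogel_depths_m)),
                    key=lambda i: abs(hogel_depths_m[i] - gaze_depth_m))
    return sorted(ranked[:count])

# Depth planes of a stored hogel set (illustrative), viewer focused at 1.2 m
depths = [0.25, 0.5, 1.0, 2.0, 4.0, 8.0]
assert select_reference_hogels(1.2, depths, count=2) == [1, 2]
```

The returned indices would drive the query to cloud server 115, after which compressed light field rendering synthesizes the displayed light field from the fetched reference hogels.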
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to interface with a cloud server 115 configured as a Networked Light Field Photography cloud server, then interact with the server to upload the ambient light field images being captured by its reality sensors 110 and download images of a viewer-extended light field, allowing the viewer to view the contents of the ambient light field beyond the viewer's visual reach, i.e., an extended light field, or allowing the display viewer to browse through a downloaded light field using either the VR or AR modes of the display.
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to interface and query a cloud server 115 to download video content of a selected portion of a video data set depending on the eye parameters (look angle and depth of focus, for example) detected by one or more eye tracking sensors 65 .
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to interface with an audio interface 120 , which may comprise an audio speaker and a microphone, both integrated within the volumetric perimeter of temple assembly 75 , whereby the microphone is electrically coupled to the interface, control and processing element (ICPE) and is used to convey the viewer's voice commands to the voice recognition processing element (software) of the ICPE, and the speaker is electrically coupled to the ICPE and used for audio content interface to the viewer.
- the plurality of image sources 55 and eye tracking sensors 65 , reality sensors 110 and the interface, control and processing element (ICPE) may be embedded within the volumetric perimeters of the rims of the lens bezel and the temples of the near-eye wearable display glasses frame, respectively, to create streamlined-looking near-eye wearable display glasses that are aesthetically and cosmetically appealing when worn in public, such as is illustrated in FIG. 1 .
- the dual-mode AR/VR wearable display 1 may be expected to display reference images of objects, icons and/or markers from time to time, and a processing element of the device may further comprise capabilities to keep track, in its interface, control and processing element memory, of the subset of reference images of objects, icons and/or markers that frequently appear within the displayed content, then subsequently abbreviate the fine details or lower the resolution of this subset of reference images in order to reduce processing and memory requirements and reduce power consumption.
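The frequently-appearing-reference tracking described above can be sketched as a simple occurrence counter; the familiarity threshold and identifiers are illustrative assumptions:

```python
from collections import Counter

class ReferenceImageTracker:
    """Tracks how often reference objects/icons/markers appear in displayed
    frames; once an item has been seen often enough, it is flagged so a
    lower-resolution version can be substituted, letting the viewer's
    visual system fill in the familiar detail."""
    def __init__(self, familiar_after=5):
        self.counts = Counter()
        self.familiar_after = familiar_after

    def observe(self, reference_ids):
        self.counts.update(reference_ids)

    def use_low_resolution(self, reference_id) -> bool:
        return self.counts[reference_id] >= self.familiar_after

tracker = ReferenceImageTracker(familiar_after=3)
for _ in range(3):
    tracker.observe(["battery_icon", "compass_marker"])
assert tracker.use_low_resolution("battery_icon") is True
assert tracker.use_low_resolution("new_object") is False
```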
- This feature leverages the cognitive perception capabilities of the human visual system (HVS) by virtually filling in the details required to recognize and/or identify familiar or previously visually-sensed objects and images in order to maximize the efficiency, in terms of response latency, processing throughput and memory requirement and power consumption of the dual-mode AR/VR near-eye wearable display 1 .
- the dual-mode AR/VR wearable display 1 may further comprise capabilities to analyze the content to be displayed on the device on a frame-by-frame basis to deduce the color gamut size, in terms of the coordinates of the gamut color primaries, then command the plurality of image sources 55 to synthesize the deduced color gamut using the measured gamut color primaries in the modulation of the images being displayed to the viewer.
- This feature leverages the fact that the color gamut of image content from frame-to-frame is typically much smaller than the full color gamut that can be synthesized by laser diode or LED based image sources 55 referred to above in order to maximize efficiency in terms of brightness, color content, processing throughput and memory requirement and power consumption of the dual-mode AR/VR near-eye wearable display 1 .
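The frame-by-frame gamut deduction can be approximated by per-channel extrema over a frame's colors; a toy sketch (a real implementation would work in chromaticity coordinates and locate actual gamut primaries, which this does not):

```python
def frame_gamut_extent(pixels):
    """Per-channel min/max of a frame's colors: a crude stand-in for
    deducing the gamut actually needed by the content, which the image
    sources can then synthesize instead of driving the full device gamut."""
    mins = [min(p[c] for p in pixels) for c in range(3)]
    maxs = [max(p[c] for p in pixels) for c in range(3)]
    coverage = 1.0
    for lo, hi in zip(mins, maxs):
        coverage *= (hi - lo)   # fraction of each channel's range used
    return mins, maxs, coverage

# A muted frame occupies only a small slice of the displayable gamut
frame = [(0.2, 0.3, 0.1), (0.4, 0.35, 0.2), (0.3, 0.5, 0.15)]
mins, maxs, cov = frame_gamut_extent(frame)
```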
- Referred to herein are an ICPE and a host processor, shown in FIG. 2 , both of which would include one or more processing elements, which in turn would include memory as required. It should be understood that any processing may be done on or off the device of FIG. 1 , or both, using wireless or wired communication, and references to a processing element in the claims to follow are to be understood to refer to one or more processing elements on and/or off the device of FIG. 1 .
Abstract
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 62/242,963 filed Oct. 16, 2015, the contents of which are hereby incorporated by reference as if fully stated herein.
- 1. Field of the Invention
- The invention relates generally to wearable electronics and more particularly, to a dual-mode augmented/virtual reality near-eye wearable display.
- 2. Prior Art
- Wearable optical electronics are becoming commonplace as integrated circuit size, weight and power (SWaP) and cost scale downward. Wearable optical electronics have a wide number of commercial, military and consumer applications. With respect to wearable optical electronics, there exists prior art, none of which address the need for a high resolution, dual-mode, augmented/virtual reality near-eye wearable display having a form of curved lenses with a non-planar profile and surface, which curved lens profile is used almost exclusively in consumer and other applications and is considered fashionable and aesthetically pleasing. The invention disclosed herein addresses the need for, and enables, such a near-eye wearable display.
-
FIG. 1 depicts a perspective view of the dual-mode augmented/virtual reality near-eye wearable display of the invention. -
FIG. 2 depicts a top plan view of the dual-mode augmented/virtual reality near-eye wearable display of the invention. -
FIG. 3A depicts an optical lens of the invention. -
FIG. 3B depicts a cross-section of the lens of FIG. 3A. -
FIG. 3C depicts a top view of the lens element of FIG. 3A. -
FIG. 4 depicts the lens of the dual-mode augmented/virtual reality near-eye wearable display of the invention and illustrates the optical waveguide structures of the lens. -
FIG. 5 depicts a perspective view of the dual-mode augmented/virtual reality near-eye wearable display of the invention showing the battery and connector of the temple of the display frame. - The invention and various of its embodiments are set forth in the following description of the preferred embodiments, which are presented as illustrative examples of the invention defined in the subsequent claims. It is expressly noted that the invention as defined by such claims may be broader than the illustrated embodiments described below.
- Turning to the description and the various Figs. wherein like references denote like elements among the several views, disclosed is a dual-mode augmented/virtual reality near-eye wearable display for use with, but not limited to, curved optical lenses.
- In a first aspect of the invention, a dual-mode augmented/virtual reality near-eye wearable display is disclosed and may comprise an optical lens comprising a first (scene-facing) surface, a lens thickness, and a lens peripheral edge or surface. The first surface may comprise an electro-tinting layer comprising a variably optically transmissive layer disposed between a first and a second electrically conductive transparent thin film layer. Each of the first and second conductive transparent thin film layers may be coupled to control circuitry configured to vary an optical transmissivity of the variably optically transmissive layer. One or more optical waveguide structures are provided within the lens thickness and may comprise at least one input image aperture and at least one exit aperture, which exit aperture may be divided into a plurality of exit aperture sub-regions. One or more image sources, such as electronic display elements, are optically coupled to their respective input image apertures. The image sources may be disposed on the peripheral (i.e., edge or side) surface of the lens and configured to directly optically couple a displayed optical image from the image source into an input image aperture and then to an exit aperture, or from a plurality of input image apertures to a plurality of respective exit aperture sub-regions. The exit aperture's or exit aperture sub-region's optical characteristics are preferably configured to match a predetermined area and predetermined angle of divergence of the respective input image aperture.
- The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the variably optically transmissive layer is comprised of a polymer dispersed liquid crystal (PDLC) material. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures disposed within the lens element are each individually “piecewise flat”.
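The electrically controlled PDLC tinting recited above can be pictured with a small sketch. This is purely illustrative: the drive voltages, the inverted clear/dark mapping and the 0-255 tint scale are editor's assumptions, not values from the disclosure:

```python
def pdlc_drive_voltage(tint_level, v_clear=12.0, v_dark=0.0, levels=255):
    """Map a tint command (0 = clear .. 255 = dark) to an AC drive
    amplitude across the transparent conductive (e.g. ITO) layers.
    A PDLC film is clear at full drive and scattering/dark with no
    drive, so the mapping is inverted.  All constants are assumed."""
    if not 0 <= tint_level <= levels:
        raise ValueError("tint level out of range")
    frac = tint_level / levels
    return v_clear * (1.0 - frac) + v_dark * frac
```

Multi-level or continuously variable transmissivity then amounts to stepping `tint_level` through its discrete range or sweeping it continuously.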
- The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures that are piecewise flat provide image portions that are collectively combined in a tiled arrangement to define optical lenses having a curved or non-planar surface and profile. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures each are configured to redirect an image that is coupled from its respective input image aperture into its respective exit aperture or exit aperture sub-region. Alternatives to the use of optical waveguide structures that are piecewise flat are also disclosed.
- The dual-mode augmented/virtual reality near-eye wearable display may further be provided wherein a plurality of optical waveguide structures collectively define an output eye box of the dual-mode augmented/virtual reality near-eye wearable display. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures each have a dedicated input image aperture and exit aperture sub-region that are coupled to a respective dedicated individual image source. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the image source comprises an emissive micro-scale pixel array comprised of pixels that are individually spatially, chromatically and temporally addressable.
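A minimal model of an individually addressable emissive pixel array, as recited above, might look like the following; the class and method names are hypothetical and only illustrate the spatial (x, y), chromatic (r, g, b) and temporal (per-frame) addressing:

```python
class EmissivePixelArray:
    """Toy model of an emissive micro-pixel array whose pixels are
    individually addressable spatially, chromatically and temporally.
    Purely illustrative; real devices expose this through drive CMOS
    circuitry, not a Python API."""

    def __init__(self, width, height):
        self.width, self.height = width, height
        # one (r, g, b) tuple per pixel, all initially off
        self.frame = [[(0, 0, 0)] * width for _ in range(height)]

    def set_pixel(self, x, y, rgb):
        """Spatial + chromatic addressing of a single pixel."""
        if not (0 <= x < self.width and 0 <= y < self.height):
            raise IndexError("pixel out of range")
        self.frame[y][x] = rgb

    def update(self, next_frame):
        """Temporal addressing: present an entirely new frame."""
        self.frame = next_frame
```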
- The dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the plurality of optical waveguide structures each have a dedicated image source coupled into a dedicated input image aperture that is configured to display a portion of a collective image for display to a viewer. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein the plurality of optical waveguide structures are each optically configured to relay and magnify an image portion coupled from a separate image source into its corresponding exit aperture sub-region of the dual-mode augmented/virtual reality near-eye wearable display.
- The dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the waveguide structure is in optical or electronic communication with an image detection sensor that is configured to track a position of a viewer's eye or eyes. The dual-mode augmented/virtual reality near-eye wearable display may be provided wherein at least one thin film layer is comprised of an indium tin oxide material. The dual-mode augmented/virtual reality near-eye wearable display may further comprise processing circuitry configured to sense when a viewer recognizes a displayed image and to supplement or modify the recognized and displayed image with predetermined image data or to modify or supplement some or all of the displayed scene in the viewer's field of view.
- The dual-mode augmented/virtual reality near-eye wearable display may yet further be provided wherein the optical waveguide structure includes a micro-imprinted facet structure as a waveguide layer. The micro-imprinted facet structure may comprise a surface relief optical element or a volume relief diffractive waveguide, and may comprise a diffractive grating waveguide, a blazed grating waveguide, a multi-level grating waveguide or a Bragg grating waveguide.
- As depicted, for instance in
FIG. 1, the near-eye wearable display 1 of the invention is preferably configured as a conventional-looking eyewear frame and lens assembly having at least one optical lens 5. The lenses 5 may comprise non-planar surfaces or piecewise planar surfaces and be configured to operate in an augmented reality (AR), a virtual reality (VR) or a hybrid AR/VR mode. - Turning to
FIGS. 2, 3A-C and 4, lenses 5 are comprised of a lens thickness 5′ and a lens peripheral or edge surface 5″. As detailed in the lens 5 cross-section of FIG. 3B, in a preferred embodiment, the front side, scene-facing surfaces 10 of lenses 5 of the disclosed dual-mode AR/VR near-eye wearable display 1 may be provided with an electro-tinting layer 15. Electro-tinting layer 15 may comprise multiple thin-film layers 20 designed to electrically control the transmissivity (or tinting level) through lenses 5. Multiple thin-film layers 20 may comprise at least one variably optically transmissive layer 25 of a variably optically transmissive material, such as a polymer-dispersed liquid crystal (PDLC) material or an equivalent suitable material, that is sandwiched between thin-film layers 20. Thin film layers 20 may comprise an electrically conductive, optically transparent material such as indium tin oxide (ITO). Thin film layers 20 are configured to enable the coupling of an electrical signal or potential across variably optically transmissive layer 25 for the purpose of electrically varying (or controlling) the tinting or transmissivity level of lenses 5. The thin-film layers 20 on opposing sides of the variably optically transmissive layer 25 are preferably electrically isolated and separately electrically coupled to appropriate control circuitry to enable multi-level or continuously variable control of the effective transmissivity of each lens 5, which is capable of being varied from transparent or clear to non-transparent or dark. - In a preferred embodiment, electro-tinting
layers 15 of lenses 5 are designed to permit the coupling of viewer-defined multi-voltage level electrical signals through the transparent, electrically-conductive ITO thin-film layers 20 to control the crystal alignment of the variably optically transmissive PDLC layer 25 and thus permit the tint level of lenses 5 to be controllably varied from clear to dark across a discrete or continuous range of tinting levels. - The back side,
viewer facing surfaces 30 of lenses 5 are provided with one or more optical waveguide structures 40 (FIG. 4). In a preferred embodiment, back-side surface 30 may be provided with an optical thin-film layer of a polymer comprised of a plurality of waveguide layers 50 that are disposed within lens thickness 5′, each defining a respective exit aperture. The waveguide layers 50 may be provided as a plurality of micro-imprinted facets or equivalent optical structures that are configured to permit light received into the waveguide structures 40, which are located proximal peripheral surface 5″ (preferably outside the viewer's eye pupil viewing region) of each of lenses 5, to be totally internally reflected (TIR) or "wave-guided" through a respective portion of lens thickness 5′ to a predetermined exit aperture sub-region 45′ defined by the respective waveguide layer 50, which is located within the viewing region of the viewer's eye pupil as depicted in FIG. 4. (Not all waveguide layers are shown so as to not unnecessarily obscure other aspects of the Figure.) FIG. 4 generally illustrates a plurality of waveguide structures 40 coupled to a plurality of exit aperture sub-regions 45′ within the lens 5 viewing area. The waveguide layers 50 generally span the entire respective exit aperture sub-region 45′ except for a boundary region adjacent the periphery region of the respective lens 5, so as to enable the creation of a collective image with no gaps or dead regions as a result of the tiling of the individual image portions of each aperture sub-region 45′. -
Waveguide structures 40 may be fabricated, for instance, using either surface relief or volume relief diffractive optical structures (DOC) within lens thickness 5′ and may be provided as, for example, a diffractive grating, blazed grating, multi-level or Bragg grating or equivalent structure as is known in the optical arts. - Waveguide layers 50 may be designed to diffract broadband light, preferably covering the visible light spectrum.
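The TIR and diffraction-grating behavior underlying these waveguide structures follows standard optics, not anything specific to this disclosure. A sketch using the usual critical-angle relation and grating equation (the lens index of 1.5 and the grating geometry in the example are assumed values):

```python
import math

def critical_angle_deg(n_lens=1.5, n_air=1.0):
    """Angle of incidence (measured from the surface normal) above
    which light inside the lens is totally internally reflected.
    n = 1.5 is an assumed typical lens index."""
    return math.degrees(math.asin(n_air / n_lens))

def diffracted_angle_deg(wavelength_nm, pitch_nm, incidence_deg, order=1):
    """Standard grating equation m*lam = d*(sin(th_i) + sin(th_m)),
    solved for the diffracted angle th_m of the given order."""
    s = order * wavelength_nm / pitch_nm - math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        raise ValueError("this order is evanescent at this geometry")
    return math.degrees(math.asin(s))
```

For an index of 1.5 the critical angle is about 41.8°, which is why the waveguiding condition constrains the local flatness of each lens sub-region.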
- Waveguide layers 50 are preferably designed to optically couple light that is emitted from an image source or
sources 55 into each lens 5 and to the viewer's eye pupil region. The waveguide structures, together with an appropriate micro-lens array forming part of the image sources 55, are configured to appropriately optically magnify and redirect the image coupled into each lens sub-region 45′. In particular, the micro-imprinted facets on the reflective prism-like grating elements can be provided with separate, viewer-defined facet angles that differ from one another, e.g., become progressively larger or smaller across the waveguide layer 50 itself, to redirect the light at the exit aperture defining the respective image portion to converge toward the viewer's eye. - The dual-mode AR/VR near-eye
wearable display 1 of the invention may further comprise at least one image source 55 directly optically coupled to a respective waveguide structure 40 of each of lenses 5, whereby each image source 55 is capable of generating and outputting a digital optical image portion comprising a 2D array of multi-color pixels. As stated before, each image source 55 provides an image portion to a respective input aperture 40, to be presented to the respective exit aperture of a respective exit aperture sub-region 45′ of the respective lens 5, so that each image portion will fill the respective exit aperture sub-region 45′ except for a small portion at the outer edge of the lenses 5, to be able to provide a single combined image in each lens by the tiling of the respective image portions. -
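The source-to-sub-region tiling bookkeeping described above might be sketched as follows; the 2x4 sub-region grid is an assumed example layout, not taken from the disclosure:

```python
def assign_sources_to_subregions(num_sources, grid=(2, 4)):
    """Bookkeeping for the tiling described above: image source k feeds
    one input aperture, whose waveguide delivers its image portion to
    exit-aperture sub-region (row, col); the portions butt together to
    form one combined image per lens.  The 2x4 grid is an assumed
    layout for illustration."""
    rows, cols = grid
    if num_sources != rows * cols:
        raise ValueError("this sketch assumes one source per sub-region")
    return {k: (k // cols, k % cols) for k in range(num_sources)}
```

With eight sources, source 0 would feed the top-left sub-region and source 7 the bottom-right one.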
Image sources 55 that are optically coupled to lenses 5 may be provided with the ability to modulate either single-view images or multi-view light field images in the dual-mode augmented/virtual reality near-eye wearable display. - The image sources 55 that are optically coupled to
lenses 5 are preferably sufficiently compact to be coupled to lenses 5 without obstructing the dual-mode AR/VR wearable display viewer's field of view. -
Image sources 55 are provided to enable the requisite compactness of a wearable display by, in a preferred embodiment, being of the emissive type (as opposed to "back-lit" or "transmissive" image sources) and are capable of generating an image that substantially matches the display area and required angle of divergence of input image aperture 40 of lenses 5. Emissive imagers may be optically coupled directly from their emissive surfaces through the micro-lens array of the emissive imagers without the need for bulky optical interface or relay elements that would undesirably obstruct a viewer's field of view. - The image sources 55 that are optically coupled to
lenses 5 may be provided, for example, from a class of emissive display devices called Quantum Photonic Imagers (“QPI™”, a trademark of Ostendo Technologies, Inc.) described in, for instance, U.S. Pat. Nos. 7,623,560; 7,829,902; 8,567,960; 7,767,479; 8,049,231; and 8,243,770, which are the subject of multiple patents and patent applications assigned to Ostendo Technologies, Inc., assignee of the instant application. - Exemplary emissive display elements suitable for use as
image sources 55 with the instant invention include, without limitation, light field emissive display devices as taught in, for instance U.S. Pat. Nos. 9,195,053; 8,854,724 and 8,928,969, each entitled “Spatio-temporal Directional Light Modulator” or emissive display elements taught in U.S. Pat. Nos. 7,623,560; 7,829,902; 8,567,960; 7,767,479; 8,049,231; and 8,243,770; each entitled “Quantum Photonic Imagers And Methods Of Fabrication Thereof”; each assigned to Applicant herein and the entire contents of each of which are incorporated herein by reference. - The above-referenced image sources that are the subject of the above-referenced respective U.S. patents desirably feature high brightness, high resolution and very fast response with multi-color light, some with spatial modulation capabilities, in a single emissive display device that includes all necessary display drive circuitry. While the devices disclosed in the above-referenced patents are well-suited for use in the invention, it is expressly contemplated that within the context of this invention, the term image source or image sources as used herein encompasses any optoelectronics device that comprises an array of emissive micro-scale solid state light-(SSL) emitting pixels of a suitable size. The SSL light-emitting pixels of such devices, hereinafter referred to collectively as image sources, may be either a light-emitting diode (LED) or laser diode (LD) structure or any solid state light-emitting (preferably multicolor) structure whose on-off state is controlled by drive circuitry, and alternatively may comprise, as an example, an
image source 55 comprising an OLED imager device. - The pixels within the emissive micro-scale array of the image sources of the above-referenced U.S. patents are beneficially provided as individually addressable, spatially, chromatically and temporally, through associated drive CMOS circuitry, enabling such image sources to emit light that is modulated spatially, chromatically and temporally. The multiple colors emitted by the image sources disclosed in the above-referenced patents desirably share the same pixel aperture. The pixel apertures emit multi-colored and collimated (or non-Lambertian) light with an angle of divergence ranging from about ±5° to about ±45°. The size of the pixels comprising the emissive array of the image sources of the above-referenced patents is typically in the range of approximately 5-20 microns, with a typical emissive surface area of the image sources being in the range of approximately 15-150 square millimeters. The image sources that are the subject of the above patents are provided with a minimal gap or boundary between the emissive pixel array and the physical edge of the device, enabling a multiplicity of image source devices to be "tiled" to create a viewer-defined arbitrary size display area. However, when dispersed individually around the periphery of the lenses of the present invention as shown in
FIGS. 3A, 3C and 4 and as described above, it is the image portions that are tiled, not the image sources themselves, so that a boundary on the image sources themselves is of no consequence unless the image sources themselves are somehow to be tiled. - The image sources 55 that are optically coupled to the
lenses 5 of the invention are capable of generating video images with a brightness that is digitally controllable within a range that extends preferably from 1-15 lumens, at a minimal power consumption so as to enable practical integration within the compact configuration of the disclosed dual-mode AR/VR wearable display 1. - The image sources' 55 controllable brightness level enables generation of the appropriate brightness level to match the multiple operational modes of the dual-mode AR/VR
wearable display 1. - The image sources 55 that are optically coupled to
lenses 5 may be configured to generate an image size and shape (in terms of the number and boundary of the pixels being modulated and coupled into input image aperture 40) that can be digitally controlled, whereby the controllable image size and shape are used to couple an image with a variably-controlled size and shape into exit aperture 45 or exit aperture sub-regions 45′ of the lenses 5. - The image sources 55 that are optically coupled to the
lenses 5 preferably comprise at least one image source 55 dedicated to each lens 5 as described above, or a plurality of image sources 55 coupled into multiple waveguide structures of each lens 5, whereby each image source 55 is coupled to a different sub-region 45′ as is depicted in FIG. 4. - The use of a plurality of
image sources 55 coupled into multiple input image apertures 40 of each lens 5, whereby each image source 55 is effectively coupled to a separate, dedicated exit aperture sub-region 45′, permits a waveguide flatness condition (typically required to sustain the TIR waveguiding condition) to be required only across a small portion of lens 5 within lens thickness 5′, thus requiring lens 5 to be only "piecewise flat" over the individual exit aperture sub-regions 45′. This, in turn, enables the use of overall curved lenses 5 having a non-planar surface and curved cross-sectional profile. - The ability to provide
lenses 5 as "piecewise flat" enables the use of curved-shaped lenses rather than the substantially planar lenses required when typical waveguide optics are used. The piecewise flat portions of a curved lens allow the use of a more aesthetically-appealing eyeglass lens shape and a streamlined look for the dual-mode AR/VR near-eye wearable display 1 of the invention. - As possible alternatives, depending on the overall design of the dual-mode AR/VR near-eye
wearable display 1, it might be possible to directly project the images from the image sources 55 onto the waveguide layer 50. As a further alternative, since total internal reflection merely requires the angle of incidence of the light at the internal surface, measured from the surface normal, to exceed the critical angle, and the number of internal reflections will normally not be large, such as in the range of one to three, and the curvature of a lens 5 need not be large to obtain the aesthetic effect desired, it may be possible to use a continuously curved lens 5 rather than a piecewise flat lens 5. While the image portion displayed to a viewer would be distorted, the image portion could be oppositely pre-distorted, such as by an appropriate micro-lens layer of the image sources 55, and/or corrected electronically to remove that distortion. Also, it should be noted that the total internal reflection, if used, is only needed where the internal reflection occurs, namely adjacent the edges of each lens 5. Otherwise the lenses 5 may be gently and continuously curved like normal glasses, with the waveguide layer 50 changing accordingly, and, if desired, the edges of the eyeglass frames could be covered by an overhanging edge portion so only the continuously curved portion would be normally visible. - The use of a plurality of
image sources 55 coupled onto multiple input waveguide structures 40 of each lens 5, whereby each image source 55 is coupled to a different and dedicated exit aperture sub-region 45′, further allows the waveguide optical path from the plurality of image sources 55 to exit aperture sub-regions 45′ to have light rays that converge upon each of the viewer's eye pupils from different directions. - The use of a plurality of
image sources 55 coupled into multiple input image apertures 40, with each being coupled to a different exit aperture sub-region 45′, and the respective waveguide optical paths from the plurality of respective image sources 55 through the plurality of respective exit aperture sub-regions 45′, causes light emitted from different image sources 55 to converge upon each of the viewer's eye pupils from different directions, with the image sources 55 associated with each exit aperture sub-region 45′ preferably modulating a different perspective view, enabling the dual-mode AR/VR near-eye wearable display 1 to display a multi-view light field scene. - The use of a plurality of multi-view light
field image sources 55 coupled into multiple waveguide structures 40, with each being coupled to a different sub-region 45′, and the waveguide optical path from the plurality of image sources 55 through respective exit aperture sub-regions 45′, causes the multi-view light field emitted from different image sources 55 to converge upon each of the viewer's eye pupils from different directions, with the image sources 55 associated with each exit aperture sub-region 45′ modulating a different multi-view perspective. This enables the dual-mode AR/VR near-eye wearable display 1 to modulate a fine (small) angular (pitch) resolution light field over a wide field of view (FOV), whereby the coarse directional modulation (for example, 15° angular separation between chief rays within the total FOV) is accomplished by the plurality of image sources' 55 chief ray angles of convergence into the viewer's eyes, and the fine directional modulation of the light field (for example, 0.5° angular separation between views within the sub-region FOV) is accomplished by an image source 55 modulating a set of different perspectives separated by the fine angular separation pitch within their respective exit aperture sub-region 45′ directions. - The use of a plurality of multi-view light
field image sources 55 coupled into multiple waveguide structures 40 enables the modulation of a light field that provides a sufficient number of views to each of the viewer's pupils (preferably 8-12 views per pupil, with at least six views along the horizontal parallax) to the extent that it substantially eliminates the so-called "vergence accommodation conflict" (VAC) effect (which causes severe viewer discomfort and is commonly encountered in prior art near-eye autostereoscopic displays), thus making the disclosed dual-mode AR/VR near-eye wearable display 1 a VAC-free display. - The use of a plurality of
image sources 55 coupled into multiple waveguide structures 40 enables increasing the display resolution (in terms of the number of pixels being displayed to the viewer) either by increasing the number of image sources 55 being optically coupled to each of the display lenses 5, for example and not by way of limitation, using eight image sources 55, each having 125,000 ten-micron pixels, to enable one million pixels per eye, or by decreasing the image sources' 55 pixel size, for example and not by way of limitation, using eight image sources 55 of the same physical size as the above example but each having 500,000 five-micron pixels, to enable the display of two million pixels per eye. - The use of a plurality of
image sources 55 coupled into respective multiple waveguide structures 40 of each lens 5 enables a high pixel resolution per eye modulating a sufficient number of views to each of the viewer's pupils, making it possible to modulate digital holographic images or light field images to the viewer. -
Image sources 55 having an electronically controllable image size and shape may be used to generate the appropriate image size and shape matching the various operational modes of the dual-mode AR/VR wearable display 1 and compensating for optical image distortions. - Turning back to
FIG. 2, the dual-mode AR/VR near-eye wearable display 1 may comprise at least one eye tracking sensor 65 per eye, the output of eye tracking sensor 65 being configured to detect multiple predetermined parameters of the viewer's eyes, including but not limited to the angular position (or look angle) of each eye, the iris diameter, and the distance between the two pupils. -
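One use of the tracked look angle, developed further in this description in connection with gaze-directed display adjustment, is to weight rendering quality by angular distance from the eye-look direction. A hypothetical sketch; the linear falloff shape and its constants are editor's assumptions, with only the 1°-2° foveal region taken from the text:

```python
def foveation_weight(angle_from_gaze_deg, fovea_deg=2.0, falloff_deg=10.0):
    """Relative rendering-quality weight for a display region at a
    given angular distance from the tracked eye-look direction: full
    quality within the ~1-2 degree foveal region, a linear taper
    outside it, floored at 0.1.  Constants are illustrative."""
    if angle_from_gaze_deg <= fovea_deg:
        return 1.0
    excess = angle_from_gaze_deg - fovea_deg
    return max(0.1, 1.0 - excess / falloff_deg)
```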
Eye tracking sensors 65 may comprise a plurality of image detection sensors, such as a CMOS detector array device, that are coupled to an input image aperture 40 of each of the lenses 5, whereby each eye tracking sensor 65 is positioned in close proximity to the image source 55 to take advantage of the optical transfer function of the optical waveguide structure 40 of each lens 5. This enables the optical waveguide structure 40 of each lens 5 to serve two functions: one as an optical path from the plurality of image sources 55 to the waveguide layer and from there to each eye, and the second as a reverse optical path from each eye to the one or more image detection eye tracking sensors 65. - The multiple images captured by the plurality of image detection
eye tracking sensors 65 may be blended (or fused) together to form captured images of each pupil and to also form an image of the display exit aperture 45 or exit aperture sub-region 45′, to be used to infer the color and brightness uniformity across multiple exit aperture sub-regions 45′. -
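The brightness/color uniformity correction inferred from these fused captures might be sketched as a per-sub-region gain adjustment. The 10% threshold matches the example given in this description; the scale-to-the-mean scheme is an editor's assumption:

```python
def uniformity_correction(measured, threshold=0.10):
    """Per-sub-region brightness values (inferred from the eye-tracking
    sensors' captured images) -> per-image-source gain factors.  If the
    relative spread is already within the viewer-defined threshold
    (e.g. 10%), leave the sources untouched; otherwise scale each
    sub-region toward the mean.  Illustrative only."""
    mean = sum(measured) / len(measured)
    spread = (max(measured) - min(measured)) / mean
    if spread <= threshold:
        return [1.0] * len(measured)
    return [mean / m for m in measured]
```

The same scheme extends per color channel for chromatic uniformity.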
Eye tracking sensors 65 may be utilized to detect the brightness and color uniformity across multiple display exit aperture sub-regions 45′, whereby the images captured by the eye tracking sensor(s) 65 are analyzed to determine the brightness and color of each of the display exit aperture sub-regions 45′. The determined values are then compared, and the brightness and/or color of the plurality of image sources 55 that are coupled into multiple waveguide structures 40 may be adjusted accordingly to cause the color and brightness across the entire set of exit aperture sub-regions 45′ to become uniform within a given, viewer-defined threshold, for example 10%. - The eye parameter outputs of
eye tracking sensors 65 may be subsequently utilized to adjust the display parameters for each eye by adjusting the parameters of the plurality of multi-view light field image sources 55 that are coupled into multiple input image apertures 40 of each lens 5; for example, adjusting the display resolution to its highest level in the "eye-look" direction within a region of 1° to 2°, selecting the light field compression reference holographic elements (hogels) at the depth inferred from the detected eye parameters, adjusting the depth of the synthesized holographic 3D image to match the depth where the eye is focused, adjusting the brightness or color within the eye-look direction region of 1° to 2°, or blurring, reducing and/or adjusting the perspective, resolution, brightness and/or color within the image region outside the eye-look direction region of 1° to 2°. - The image sources 55 and one or more
eye tracking sensors 65 configured to perform an image uniformity function that are optically coupled to the lenses 5 may be electrically coupled to an interface control and processing element (ICPE) configured as a compact printed circuit, preferably integrated within the glasses' frame temple 75 assembly of the dual-mode AR/VR wearable display 1 such as is illustrated in FIGS. 1 and 2, or in a temple of the glasses. The ICPE normally would operate under program control.
- The interface, control and interface element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise both wireless and wired interfaces in the glasses' frame temple assembly 75 and connectivity capabilities that enable the dual-mode AR/VR wearable display 1 to interface and be connected, either wirelessly or by wire, to an image storage source or a control host processor and/or server such as is seen in FIG. 2. - The image processing capabilities required for processing feedback input from
eye tracking sensors 65, may be implemented within the interface, control and processing element (ICPE) of the dual-mode AR/VRwearable display 1. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise the capability of synchronizing the images being displayed to both eyes, in both the perspective as well as the temporal aspects. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise tilt and orientation sensors 80, preferably implemented using micro-scale gyros and accelerometers, to enable sensing of the dual-mode AR/VR wearable display 1 tilt and orientation (head tracking capabilities) as depicted in FIG. 2. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise one or more ambient light sensors 85 to enable sensing the brightness of the ambient light environment of the dual-mode AR/VR wearable display. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise the interface capability to output the sensed ambient light, tilt and orientation of the dual-mode AR/VR wearable display 1 (the ambient light, tilt and orientation sensors' output data) to the connected image source 55 and a control host processor and/or server. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise a power converter circuit and power management circuitry 90 that is used to convert, regulate and manage the input power provided to the dual-mode AR/VR wearable display 1. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise a battery pack as part of the power management circuitry that is coupled to the power converter and power management circuits to enable an autonomous (or unplugged) operational mode. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise an input power interface that is coupled to the power converter and power management circuitry 90 to enable a plugged operational mode. - The interface, control and processing element (ICPE) of the dual-mode AR/VR
wearable display 1 may further comprise a compact input connector to enable power, data and control interfaces to the dual-mode AR/VR wearable display, preferably located at the terminal portion of at least one of the wearable display frame's temple assemblies 95 of the dual-mode AR/VR wearable display 1 as depicted in FIG. 5 . - Making use of its
curved lens 5 and optical waveguide structure 40 feature, the dual-mode AR/VR wearable display assembly 1 may be curved to match the viewer's frontal head profile, with the temple assembly 75 and lens frame being extended in the vertical axis to sufficiently minimize leakage of excessive ambient light within the viewing region of the dual-mode AR/VR wearable display 1. - The dual-mode AR/VR
wearable display 1 may be configured to operate in either a virtual reality (VR) mode, an augmented reality (AR) mode or a hybrid AR/VR mode, as commanded by either the display viewer (by either touching the display temple or by voice command) or by a command embedded within the interface, control and processing element data input from the image source host processor and/or server. - In the VR mode, the tinting of
lenses 5 of the dual-mode AR/VR wearable display 1 may be increased to a maximum (or the transmissivity reduced to a minimum) by appropriately setting the level of the electrical signal coupled into the electro-tinting layers 15 of lenses 5, thus reducing the lens image sources' 55 output image brightness to match the VR brightness level set forth as a preference by the display viewer. - In the VR mode, the dual-mode AR/VR
wearable display 1 may provide the sensed tilt and orientation data to the image source for the latter to provide the dual-mode AR/VR wearable display 1 with appropriately generated VR images depending on the viewer's head tilt and orientation. Particularly in the AR mode, tilting or changing the position of the viewer's head will tilt or change the apparent position of the augmented images but not the real images, if not electronically corrected responsive to the tilt and orientation data. - In the VR mode of the dual-mode AR/VR
wearable display 1, the lens tinting may be reduced to the minimum level defined by the display viewer when the viewer commands the display to do so (by either touching a touch sensor located on the outside surface of the display arm or temple, or by voice command) or when the sensed tilt and orientation data indicates the viewer's head is outside a default viewing volume (or box) set forth by the viewer as a preference. This enables the display viewer to safely move about while still wearing the dual-mode AR/VR wearable display 1: outside the set physical viewing box, the display lens tinting is reduced so the viewer can see their surroundings. - In the VR mode, the dual-mode AR/VR
wearable display 1 may be reconfigurable to display either 2D or 3D light field images, whereby the tilt and orientation data sensed by the display is used by the image source device to expand the displayable field of view in both the 2D and 3D modes of operation, or to enable a full-parallax 3D viewing experience in the 3D mode of operation. - In the AR mode, the tinting of
lenses 5 of the dual-mode AR/VR wearable display 1 may be reduced to a desired viewing level that matches the ambient light level sensed by the display, by appropriately setting the level of the electrical signal coupled into the tinting layers 15 of the lenses using an ambient light sensor 85. The displayed image's brightness may be adjusted by setting the lens image sources' 55 output image brightness to match the sensed ambient light level and the brightness level set forth as a preference by the display viewer. - In the AR mode, the dual-mode AR/VR
wearable display 1 may provide tilt and orientation data to the image sources 55 through the ICPE for the latter to provide the dual-mode AR/VR wearable display 1 with appropriately generated AR images depending on the viewer's head tilt and orientation. Since in the AR mode both the image visible through the lenses 5 (the real world) and the augmented image, computer generated or otherwise, are visible to a viewer in a coordinated, overlaid manner, if not compensated, the tilt and orientation of the viewer's head would disturb the relative positions of the two images. In a preferred embodiment, the image displayed at any one time can be considered part of a larger augmented image, such as may be stored in memory or generated on demand as needed (or both); head movement is then effectively corrected by merely moving or twisting the viewable area around that larger image in a compensating manner. Consequently, while the physical size of any one image portion displayed in any sub-region 45′ won't change, the part of the larger image that is displayed, and how it is displayed, in any one sub-region 45′ is changed by the ICPE with head movement to maintain the alignment of the real and augmented images. This feature may also be used in the VR mode to fix the spatial position of the viewable image, adding to the viewer's VR experience. - Note that in either the AR or VR mode, typically the image displayed in any sub-region of a
lens 5 by an image source will have one or more image source pixels that are not used (black or off) around the edges. This allows accurate electronic sizing, positioning and angular alignment of the image displayed in the sub-region in pixel increments, avoiding unreasonable mechanical alignment requirements. - In the AR mode of the dual-mode AR/VR
wearable display 1, the lens tinting may be increased to the level set by the display viewer when the viewer commands the display to do so, by either touching a touch sensor 100 located on the outside surface of the display temple or by voice command, or when the sensed ambient light data indicates the ambient light has increased to a point that would reduce the contrast of the displayed image. - In the AR mode, the dual-mode AR/VR
wearable display 1 can display either 2D or 3D light field images, whereby the tilt and orientation data sensed by the display may be used by the image source 55 to expand the displayable field of view in both the 2D and 3D modes of operation, or to enable a full-parallax 3D viewing experience in the 3D mode of operation. - In the hybrid AR/VR mode of the dual-mode AR/VR
wearable display 1, the operational mode may be controlled by the display content, which contains embedded mode control command data packets that, depending on the content of the scene being displayed and/or where the viewer's eye is directed and focused, cause the dual-mode AR/VR wearable display 1 to emphasize certain objects or focal planes within the displayed image by modifying the tinting levels and/or the contrast level of such scene objects or focal depths. - The dual-mode AR/VR
wearable display 1 may comprise both touch and voice control capabilities that are used, as explained above, to switch between the AR and VR modes of operation and to control various operational parameters of each mode. - The touch control capabilities may be implemented as a
touch sensor 100 integrated on the temple assembly outside casing (or enclosure). Touch sensor 100 may be designed to respond to single touch, multiple touches or touch-and-drag types of commands. In both the AR and VR modes, the default setting of the exemplary right side touch sensor 100 is drag-to-control of the display lens tinting level, and of the left side touch sensor 100 is drag-to-control of the display brightness level. In the VR mode, a single touch to either side may change the display lenses' tinting to allow the display viewer to safely move about. Multiple touches may be used to change the default control of the touch sensor based on a programmable menu that allows the display viewer to set and change each operational mode parameter to match their needs. - Voice control capabilities may be provided to enable the display viewer to control the display mode, such as AR versus VR or hybrid AR/VR, and the display operational parameters, such as brightness or image size.
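The default touch bindings above can be sketched as a small event dispatcher. This is a minimal illustration only: the event tuples, dictionary state keys, and normalized 0-to-1 tint/brightness levels are hypothetical glue, not an interface defined in the patent.

```python
def handle_touch(event: tuple, state: dict) -> dict:
    """Apply one touch event against the default AR/VR bindings: a drag on
    the right temple adjusts the lens tinting level, a drag on the left
    adjusts display brightness, a single tap in VR mode drops the tint so
    the viewer can move about safely, and multiple taps open the
    programmable menu for rebinding defaults. Levels are in [0, 1]."""
    kind = event[0]
    if kind == 'drag':
        _, side, delta = event
        key = 'tint' if side == 'right' else 'brightness'
        # clamp the dragged level into the valid range
        state[key] = min(1.0, max(0.0, state[key] + delta))
    elif kind == 'tap' and state.get('mode') == 'VR':
        state['tint'] = state['safe_tint']   # clear lenses for safe movement
    elif kind == 'multi_tap':
        state['menu_open'] = True            # viewer may rebind defaults here
    return state

# Hypothetical usage:
state = {'mode': 'VR', 'tint': 0.9, 'brightness': 0.5,
         'safe_tint': 0.1, 'menu_open': False}
handle_touch(('drag', 'right', 0.05), state)   # tint nudged upward
handle_touch(('tap', 'right'), state)          # VR: tint drops to safe level
```

A real implementation would debounce the sensor and distinguish tap counts in hardware or firmware; the dispatcher above only shows how the documented defaults map touches to the two controlled parameters.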
- Through its wireless or wired interface capabilities in the glasses'
frame temple 75 assembly, the dual-mode AR/VR wearable display 1 enables an interface with a hand gesture sensor 105 that allows the display viewer to control the display mode, such as AR versus VR or hybrid AR/VR, and the display operational parameters, such as brightness or image size, and to also control and/or select the display contents using soft buttons or icons that may be added or removed from the viewer display area by either hand, voice or touch gestures. - The dual-mode AR/VR
wearable display 1 may further comprise at least one "reality" sensor 110 (preferably a light field camera) that captures ambient light field content and couples the captured images to the interface, control and processing element (ICPE), which then blends or fuses the images being displayed to fit and optically match the reality perspective being viewed in the AR mode, or integrates the images captured by the reality sensors into the displayed content in the VR mode. - The dual-mode AR/VR
wearable display 1 may further comprise the capability to accept input image data or video data in a compressed format (such as MPEG or JPEG) and either first decompress the input images and then display them to the viewer, or directly display them to the viewer using Visual Decompression techniques, as discussed below, to reduce decompression processing and memory requirements and reduce power consumption. - The plurality of
image sources 55 of the dual-mode AR/VR wearable display 1 may further comprise Visual Decompression capabilities to modulate images using high-order bases of (n×n) pixels (instead of the standard one-pixel modulation basis), modulating the coefficients of a commensurate discrete wavelet transform (DWT) or discrete cosine transform (DCT) representation of the image (typically the coefficients used by MPEG and JPEG compression techniques), thus enabling the dual-mode AR/VR wearable display 1 to modulate images using the compressed image data directly. This, in turn, results in efficiencies in data processing throughput and memory usage, consequently reducing the volumetric and power consumption requirements of the interface, control and processing element (ICPE) of the dual-mode AR/VR wearable display 1. - The dual-mode AR/VR
wearable display 1 may further comprise the capability to accept input images or videos compressed using light field compression techniques and formats, and to apply compressed light field rendering in order to decompress and synthesize the light field to be displayed from a set of compressed reference holographic elements (hogels), reducing image interface bandwidth, decompression processing and memory requirements, and power consumption. - The dual-mode AR/VR
wearable display 1 may further comprise the capability of interfacing with a cloud server 115 and querying that server to download a selected set of compressed light field holographic elements (hogels) based on the detected position and orientation of the viewer's eyes and head, then accepting from the server the set of requested hogels and applying compressed light field rendering to decompress and synthesize the light field to be displayed from the set of compressed reference hogels. This beneficially further reduces image interface bandwidth, as well as decompression processing and memory requirements and power consumption. - The dual-mode AR/VR
wearable display 1 may further comprise capabilities to interface with a cloud server 115 and query the server to download a selected set of compressed light field hogels, herein referred to as reference hogels, based on the detected focus depth or distance of the viewer's eyes, then accept from that server the set of requested reference hogels and apply compressed light field rendering to decompress and synthesize the light field to be displayed from the set of compressed reference hogels, further reducing image interface bandwidth, decompression processing and memory requirements, and power consumption. - The dual-mode AR/VR
wearable display 1 may further comprise capabilities to interface with a cloud server 115 configured as a Networked Light Field Photography cloud server, then interact with the server to upload the ambient light field images being captured by its reality sensors 110 and download images of the viewer's extended light field, allowing the viewer to view the contents of the ambient light field beyond its visual reach (the extended light field), or to browse through a downloaded light field using either the VR or AR modes of the display. - The dual-mode AR/VR
wearable display 1 may further comprise capabilities to interface with and query a cloud server 115 to download video content of a selected portion of a video data set depending on the eye parameters (look angle and depth of focus, for example) detected by one or more eye tracking sensors 65. - The dual-mode AR/VR
wearable display 1 may further comprise capabilities to interface with an audio interface 120, which may comprise an audio speaker and a microphone both integrated within the volumetric perimeter of temple assembly 75, whereby the microphone is electrically coupled to the interface, control and processing element (ICPE) and is used to interface the viewer's voice commands to the voice recognition processing element (software) of the ICPE, and the speaker is electrically coupled to the ICPE and used for audio content interface to the viewer. - The plurality of
image sources 55, eye tracking sensors 65, reality sensors 110 and the interface, control and processing element (ICPE) may be embedded within the volumetric perimeters of the rims of the lens bezel and the temples of the near-eye wearable display glasses frame, respectively, to create streamlined-looking near-eye wearable display glasses that are aesthetically and cosmetically appealing when worn in public, such as is illustrated in FIG. 1 . - The dual-mode AR/VR
wearable display 1 may be expected to display reference images of objects, icons and/or markers from time to time, and a processing element of the device may further comprise capabilities to keep track, in its interface, control and processing element memory, of the subset of reference images of objects, icons and/or markers that frequently appear within the displayed content, and subsequently abbreviate the fine details or lower the resolution of this subset of reference images in order to reduce processing and memory requirements and power consumption. This feature leverages the cognitive perception capabilities of the human visual system (HVS), which virtually fills in the details required to recognize and/or identify familiar or previously visually-sensed objects and images, in order to maximize the efficiency, in terms of response latency, processing throughput, memory requirements and power consumption, of the dual-mode AR/VR near-eye wearable display 1. - The dual-mode AR/VR
wearable display 1 may further comprise capabilities to analyze the content to be displayed on the device on a frame-by-frame basis to deduce the color gamut size, in terms of the coordinates of the gamut color primaries, then command the plurality of image sources 55 to synthesize the deduced color gamut using the measured gamut color primaries in the modulation of the images being displayed to the viewer. This feature leverages the fact that the color gamut of image content from frame to frame is typically much smaller than the full color gamut that can be synthesized by the laser diode or LED based image sources 55 referred to above, in order to maximize efficiency in terms of brightness, color content, processing throughput, memory requirements and power consumption of the dual-mode AR/VR near-eye wearable display 1. - Referred to herein are an ICPE and a host processor shown in
FIG. 2 , both of which would include one or more processing elements, which in turn would include memory as required. It should be understood that any processing may be done on or off the device of FIG. 1 , or both, using wireless or wired communication, and references to a processing element in the claims to follow are to be understood to refer to one or more processing elements on and/or off the device of FIG. 1 . - Many alterations and modifications may be made by those having ordinary skill in the art without departing from the spirit and scope of the invention. Therefore, it must be understood that the illustrated embodiment has been set forth only for the purposes of example and that it should not be taken as limiting the invention as defined by any claims in any subsequent application claiming priority to this application.
- For example, notwithstanding the fact that the elements of such a claim may be set forth in a certain combination, it must be expressly understood that the invention includes other combinations of fewer, more or different elements, which are disclosed above even when not initially claimed in such combinations.
- The words used in this specification to describe the invention and its various embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification structure, material or acts beyond the scope of the commonly defined meanings. Thus, if an element can be understood in the context of this specification as including more than one meaning, then its use in a subsequent claim must be understood as being generic to all possible meanings supported by the specification and by the word itself.
- The definitions of the words or elements of any claims in any subsequent application claiming priority to this application should be, therefore, defined to include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense, it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in such claims below or that a single element may be substituted for two or more elements in such a claim.
- Although elements may be described above as acting in certain combinations and even subsequently claimed as such, it is to be expressly understood that one or more elements from a claimed combination can in some cases be excised from the combination and that such claimed combination may be directed to a subcombination or variation of a subcombination.
- Insubstantial changes from any subsequently claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalently within the scope of such claims. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements.
- Any claims in any subsequent application claiming priority to this application are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, what can be obviously substituted and also what essentially incorporates the essential idea of the invention.
Claims (25)
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/294,447 US11609427B2 (en) | 2015-10-16 | 2016-10-14 | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
KR1020187013560A KR20180070626A (en) | 2015-10-16 | 2016-10-17 | Dual Mode Augmented / Virtual Reality Near-Eye Wearable Display |
JP2018519398A JP7198663B2 (en) | 2015-10-16 | 2016-10-17 | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable display |
CN201680073919.0A CN108369339B (en) | 2015-10-16 | 2016-10-17 | Dual mode augmented/virtual reality (AR/VR) near-eye wearable display |
EP16856441.7A EP3362838A4 (en) | 2015-10-16 | 2016-10-17 | Dual-mode augmented/virtual reality (ar/vr) near-eye wearable displays |
TW105133472A TWI767891B (en) | 2015-10-16 | 2016-10-17 | Dual-mode augmented/virtual reality (ar/vr) near-eye wearable displays |
PCT/US2016/057418 WO2017066802A1 (en) | 2015-10-16 | 2016-10-17 | Dual-mode augmented/virtual reality (ar/vr) near-eye wearable displays |
HK19101840.9A HK1259436A1 (en) | 2015-10-16 | 2019-02-01 | Dual-mode augmented/virtual reality (ar/vr) near-eye wearable displays |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562242963P | 2015-10-16 | 2015-10-16 | |
US15/294,447 US11609427B2 (en) | 2015-10-16 | 2016-10-14 | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170108697A1 true US20170108697A1 (en) | 2017-04-20 |
US11609427B2 US11609427B2 (en) | 2023-03-21 |
Family
ID=58518430
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/294,447 Active US11609427B2 (en) | 2015-10-16 | 2016-10-14 | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
Country Status (8)
Country | Link |
---|---|
US (1) | US11609427B2 (en) |
EP (1) | EP3362838A4 (en) |
JP (1) | JP7198663B2 (en) |
KR (1) | KR20180070626A (en) |
CN (1) | CN108369339B (en) |
HK (1) | HK1259436A1 (en) |
TW (1) | TWI767891B (en) |
WO (1) | WO2017066802A1 (en) |
Cited By (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170038836A1 (en) * | 2015-08-03 | 2017-02-09 | Oculus Vr, Llc | Display with an Embedded Eye Tracker |
US20170111723A1 (en) * | 2015-10-20 | 2017-04-20 | Bragi GmbH | Personal Area Network Devices System and Method |
US20170212351A1 (en) * | 2016-01-07 | 2017-07-27 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
US9842433B2 (en) * | 2016-04-15 | 2017-12-12 | Superd Co. Ltd. | Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality |
CN107889074A (en) * | 2017-10-20 | 2018-04-06 | 深圳市眼界科技有限公司 | Dodgem data processing method, apparatus and system for VR |
US10104464B2 (en) * | 2016-08-25 | 2018-10-16 | Bragi GmbH | Wireless earpiece and smart glasses system and method |
US10129984B1 (en) | 2018-02-07 | 2018-11-13 | Lockheed Martin Corporation | Three-dimensional electronics distribution by geodesic faceting |
CN109085711A (en) * | 2017-06-13 | 2018-12-25 | 深圳市光场视觉有限公司 | A kind of vision conversion equipment of adjustable light transmittance |
WO2019013865A1 (en) * | 2017-07-13 | 2019-01-17 | Google Llc | Non-planar computational displays |
WO2019023040A1 (en) * | 2017-07-27 | 2019-01-31 | Command Sight, Inc. | Animal wearable head mountable display system |
US10203566B2 (en) | 2015-12-21 | 2019-02-12 | Facebook Technologies, Llc | Enhanced spatial resolution using a segmented electrode array |
US10247858B2 (en) | 2015-10-25 | 2019-04-02 | Facebook Technologies, Llc | Liquid crystal half-wave plate lens |
US20190107723A1 (en) * | 2016-12-20 | 2019-04-11 | Facebook Technologies, Llc | Waveguide display with a small form factor, a large field of view, and a large eyebox |
WO2019089094A1 (en) * | 2017-10-31 | 2019-05-09 | Google Llc | Multi-perspective eye-tracking for vr/ar systems |
US10297180B2 (en) | 2015-08-03 | 2019-05-21 | Facebook Technologies, Llc | Compensation of chromatic dispersion in a tunable beam steering device for improved display |
US20190196209A1 (en) * | 2016-10-31 | 2019-06-27 | Boe Technology Group Co., Ltd. | Display Panel and Display Apparatus |
US10338400B2 (en) | 2017-07-03 | 2019-07-02 | Holovisions LLC | Augmented reality eyewear with VAPE or wear technology |
US10338451B2 (en) | 2015-08-03 | 2019-07-02 | Facebook Technologies, Llc | Devices and methods for removing zeroth order leakage in beam steering devices |
US10416454B2 (en) | 2015-10-25 | 2019-09-17 | Facebook Technologies, Llc | Combination prism array for focusing light |
US20190324276A1 (en) * | 2018-04-19 | 2019-10-24 | Magic Leap, Inc. | Systems and methods for operating a display system based on user perceptibility |
US10459305B2 (en) | 2015-08-03 | 2019-10-29 | Facebook Technologies, Llc | Time-domain adjustment of phase retardation in a liquid crystal grating for a color display |
US10481321B1 (en) | 2018-09-06 | 2019-11-19 | Facebook Technologies, Llc | Canted augmented reality display for improved ergonomics |
DE102018209377A1 (en) | 2018-06-12 | 2019-12-12 | Volkswagen Aktiengesellschaft | A method of presenting AR / VR content on a mobile terminal and mobile terminal presenting AR / VR content |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
US10552676B2 (en) | 2015-08-03 | 2020-02-04 | Facebook Technologies, Llc | Methods and devices for eye tracking based on depth sensing |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US10630873B2 (en) | 2017-07-27 | 2020-04-21 | Command Sight, Inc. | Animal-wearable first person view system |
US10627565B1 (en) | 2018-09-06 | 2020-04-21 | Facebook Technologies, Llc | Waveguide-based display for artificial reality |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
WO2020094479A1 (en) * | 2018-11-07 | 2020-05-14 | Robert Bosch Gmbh | Spectacle lens for data glasses, data glasses, and method for operating a spectacle lens or data glasses |
WO2020117459A1 (en) * | 2018-12-03 | 2020-06-11 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
JP2020519960A (en) * | 2017-05-17 | 2020-07-02 | ビュージックス コーポレーションVuzix Corporation | Fixed-focus image light guide with zoned grating |
US10718942B2 (en) | 2018-10-23 | 2020-07-21 | Microsoft Technology Licensing, Llc | Eye tracking systems and methods for near-eye-display (NED) devices |
US20200257065A1 (en) * | 2019-02-11 | 2020-08-13 | Facebook Technologies, Llc | Dispersion compensation for light coupling through slanted facet of optical waveguide |
US10838250B2 (en) | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10838490B2 (en) | 2018-10-23 | 2020-11-17 | Microsoft Technology Licensing, Llc | Translating combinations of user gaze direction and predetermined facial gestures into user input instructions for near-eye-display (NED) devices |
US10852823B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | User-specific eye tracking calibration for near-eye-display (NED) devices |
US10855979B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | Interpreting eye gaze direction as user input to near-eye-display (NED) devices for enabling hands free positioning of virtual items |
US10859834B2 (en) | 2017-07-03 | 2020-12-08 | Holovisions | Space-efficient optical structures for wide field-of-view augmented reality (AR) eyewear |
CN112365861A (en) * | 2020-10-26 | 2021-02-12 | 深圳Tcl新技术有限公司 | Display image adjusting method, electronic device and computer readable storage medium |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
WO2021061448A1 (en) * | 2019-09-23 | 2021-04-01 | Akalana Management Llc | Optical systems with switchable lenses for mitigating variations in ambient brightness |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US10996746B2 (en) | 2018-10-23 | 2021-05-04 | Microsoft Technology Licensing, Llc | Real-time computational solutions to a three-dimensional eye tracking framework |
US11209681B2 (en) * | 2018-12-03 | 2021-12-28 | Disney Enterprises, Inc. | Virtual reality and/or augmented reality viewer having variable transparency |
US11209650B1 (en) | 2018-09-06 | 2021-12-28 | Facebook Technologies, Llc | Waveguide based display with multiple coupling elements for artificial reality |
US11210832B2 (en) | 2018-04-24 | 2021-12-28 | Hewlett-Packard Development Company, L.P. | Animated gazes on head mounted displays |
US11247607B1 (en) * | 2016-03-16 | 2022-02-15 | Deepstone LLC | Extended perception system |
US20220078398A1 (en) * | 2020-09-04 | 2022-03-10 | Samsung Display Co., Ltd. | Light field display device and method of processing image of the same |
US20220146822A1 (en) * | 2019-08-15 | 2022-05-12 | Ostendo Technologies, Inc. | Wearable Display Systems and Design Methods Thereof |
WO2022177334A1 (en) * | 2021-02-18 | 2022-08-25 | 삼성전자 주식회사 | Wearable electronic device |
US11448918B2 (en) | 2019-01-30 | 2022-09-20 | Samsung Electronics Co., Ltd. | Grating device, screen including the grating device, method of manufacturing the screen and display apparatus for augmented reality and/or virtual reality including the screen |
US11454815B2 (en) | 2017-06-01 | 2022-09-27 | NewSight Reality, Inc. | Transparent optical module using pixel patches and associated lenslets |
US20220342219A1 (en) * | 2021-04-26 | 2022-10-27 | Meta Platforms Technologies, Llc | Apparatus, system, and method for disposing photonic integrated circuits on surfaces |
US11526013B2 (en) | 2019-12-30 | 2022-12-13 | Acer Incorporated | Wearable display device |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US11686943B2 (en) | 2019-03-05 | 2023-06-27 | Samsung Display Co., Ltd. | Display device |
US20230204958A1 (en) * | 2021-12-28 | 2023-06-29 | David Fliszar | Eyewear electronic tinting lens with integrated waveguide |
US20230213762A1 (en) * | 2021-12-31 | 2023-07-06 | Beijing Ned+Ar Display Technology Co., Ltd. | Ultra-thin lens, virtual image display device using same, and near-eye display |
US11852830B2 (en) | 2020-06-18 | 2023-12-26 | Samsung Electronics Co., Ltd. | Augmented reality glass and operating method therefor |
EP4343406A1 (en) * | 2022-09-20 | 2024-03-27 | Rockwell Collins, Inc. | Method for creating uniform contrast on a headworn display against high dynamic range scene |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11609427B2 (en) | 2015-10-16 | 2023-03-21 | Ostendo Technologies, Inc. | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
CN109387939B (en) * | 2017-08-09 | 2021-02-12 | 中强光电股份有限公司 | Near-to-eye display device and correction method of display image thereof |
TWI691907B (en) * | 2018-06-12 | 2020-04-21 | 網銀國際股份有限公司 | Mobile apparatus and positioning method in space |
KR20200084498A (en) * | 2019-01-02 | 2020-07-13 | 삼성디스플레이 주식회사 | Device for providing augmented reality |
US10785473B2 (en) * | 2019-01-10 | 2020-09-22 | Honeywell International Inc. | Virtual window display |
KR20200110489A (en) * | 2019-03-13 | 2020-09-24 | 삼성디스플레이 주식회사 | Flexible display device and augmented reality providing device including the same |
KR20220049548A (en) * | 2019-08-30 | 2022-04-21 | 엘지전자 주식회사 | Electronic devices that can be worn on the head |
CN114424110A (en) * | 2019-08-30 | 2022-04-29 | Pcms控股公司 | Creating 3D multiview displays with elastic optical layer buckling |
US11275250B2 (en) | 2019-11-19 | 2022-03-15 | Apple Inc. | Optical alignment for head-mountable device |
CN111458880B (en) * | 2020-05-09 | 2022-04-22 | 三生万物(北京)人工智能技术有限公司 | Waveguide light field display device and head-mounted augmented reality glasses |
KR20220078093A (en) * | 2020-12-03 | 2022-06-10 | 삼성전자주식회사 | Wearable electronic device including light emitting unit |
CN115240820A (en) | 2021-04-23 | 2022-10-25 | 中强光电股份有限公司 | Wearable device and method for adjusting display state based on environment |
CN113629014A (en) * | 2021-08-05 | 2021-11-09 | 安徽熙泰智能科技有限公司 | Near-to-eye micro display and preparation method thereof |
KR20230029434A (en) * | 2021-08-24 | 2023-03-03 | 삼성전자주식회사 | Electronic device and controlling method of electronic device |
CN114280789B (en) * | 2021-12-29 | 2024-02-27 | Oppo广东移动通信有限公司 | Near-eye display optical system and near-eye display optical apparatus |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6804066B1 (en) * | 2001-05-23 | 2004-10-12 | University Of Central Florida | Compact lens assembly for the teleportal augmented reality system |
US20090199900A1 (en) * | 2008-02-12 | 2009-08-13 | Qualcomm Mems Technologies, Inc. | Thin film holographic solar concentrator/collector |
US20100046070A1 (en) * | 2008-08-21 | 2010-02-25 | Sony Corporation | Head-mounted display |
US20120154441A1 (en) * | 2010-12-16 | 2012-06-21 | Electronics And Telecommunications Research Institute | Augmented reality display system and method for vehicle |
US20120154277A1 (en) * | 2010-12-17 | 2012-06-21 | Avi Bar-Zeev | Optimized focal area for augmented reality displays |
US20130077049A1 (en) * | 2011-09-26 | 2013-03-28 | David D. Bohn | Integrated eye tracking and display system |
US20130258451A1 (en) * | 2012-03-27 | 2013-10-03 | Ostendo Technologies, Inc. | Spatio-Temporal Directional Light Modulator |
US20140049983A1 (en) * | 2010-11-18 | 2014-02-20 | Anthony John Nichol | Light emitting device comprising a lightguide film and aligned coupling lightguides |
US20140124173A1 (en) * | 2009-10-29 | 2014-05-08 | Wistron Corporation | Heat dissipating device and heat dissipating fin |
US20150035832A1 (en) * | 2011-12-01 | 2015-02-05 | Microsoft Corporation | Virtual light in augmented reality |
US20150235467A1 (en) * | 2013-11-27 | 2015-08-20 | Magic Leap, Inc. | Waveguide assembly to display images at multiple focal planes |
US20170116897A1 (en) * | 2014-05-15 | 2017-04-27 | Samsung Electronics Co., Ltd. | Image display device and method using unidirectional beam |
Family Cites Families (207)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4427912A (en) | 1982-05-13 | 1984-01-24 | Ausonics Pty. Ltd. | Ultrasound transducer for enhancing signal reception in ultrasound equipment |
US4757714A (en) | 1986-09-25 | 1988-07-19 | Insight, Inc. | Speed sensor and head-mounted data display |
JPH05500461A (en) | 1989-06-09 | 1993-02-04 | オニール,ジョン,エル | Biofeedback device that monitors muscle movement |
CA2030874C (en) | 1989-11-30 | 1994-11-01 | Ronald G. Hegg | Dual-mirror virtual image display for vehicle instrument cluster |
US5696521A (en) | 1994-06-22 | 1997-12-09 | Astounding Technologies (M) Sdn. Bhd. | Video headset |
US5544268A (en) | 1994-09-09 | 1996-08-06 | Deacon Research | Display panel with electrically-controlled waveguide-routing |
US5986811A (en) | 1995-06-07 | 1999-11-16 | Meso Scale Technologies Llp | Method of and apparatus for generating a 3-D image from a 2-D image having a changeable focusing micro-lens array |
US5619373A (en) | 1995-06-07 | 1997-04-08 | Hasbro, Inc. | Optical system for a head mounted display |
JPH099301A (en) | 1995-06-24 | 1997-01-10 | Victor Co Of Japan Ltd | Head mount display device |
US5818359A (en) | 1995-07-10 | 1998-10-06 | Beach; Kirk | Process and apparatus for computerizing translation of motion of subcutaneous body parts |
JP3632271B2 (en) | 1995-12-28 | 2005-03-23 | 富士ゼロックス株式会社 | Glasses display |
US5886822A (en) | 1996-10-08 | 1999-03-23 | The Microoptical Corporation | Image combining system for eyeglasses and face masks |
ATE232621T1 (en) | 1996-12-20 | 2003-02-15 | Hitachi Europ Ltd | METHOD AND SYSTEM FOR RECOGNIZING HAND GESTURES |
US6764012B2 (en) | 1997-02-10 | 2004-07-20 | Symbol Technologies, Inc. | Signaling arrangement for and method of signaling in a wireless local area network |
JPH10319240A (en) * | 1997-05-22 | 1998-12-04 | Fuji Xerox Co Ltd | Head-mounted display |
US20020075232A1 (en) | 1997-08-15 | 2002-06-20 | Wolfgang Daum | Data glove |
US6937221B2 (en) | 1998-08-05 | 2005-08-30 | Microvision, Inc. | Scanned beam display |
US6583772B1 (en) | 1998-08-05 | 2003-06-24 | Microvision, Inc. | Linked scanner imaging system and method |
US6151167A (en) | 1998-08-05 | 2000-11-21 | Microvision, Inc. | Scanned display with dual signal fiber transmission |
US6681031B2 (en) | 1998-08-10 | 2004-01-20 | Cybernet Systems Corporation | Gesture-controlled interfaces for self-service machines and other applications |
US6147807A (en) | 1999-05-04 | 2000-11-14 | Honeywell, Inc. | High brightness see-through head-mounted display |
CA2375519A1 (en) | 1999-06-21 | 2000-12-28 | The Microoptical Corporation | Eyeglass display lens system employing off-axis optical design |
US6525310B2 (en) | 1999-08-05 | 2003-02-25 | Microvision, Inc. | Frequency tunable resonant scanner |
US6515781B2 (en) | 1999-08-05 | 2003-02-04 | Microvision, Inc. | Scanned imaging apparatus with switched feeds |
US6433907B1 (en) | 1999-08-05 | 2002-08-13 | Microvision, Inc. | Scanned display with plurality of scanning assemblies |
US6795221B1 (en) | 1999-08-05 | 2004-09-21 | Microvision, Inc. | Scanned display with switched feeds and distortion correction |
US6924476B2 (en) | 2002-11-25 | 2005-08-02 | Microvision, Inc. | Resonant beam scanner with raster pinch compensation |
US6456438B1 (en) | 1999-08-12 | 2002-09-24 | Honeywell Inc. | Variable immersion vignetting display |
JP2001264683A (en) | 2000-03-17 | 2001-09-26 | Minolta Co Ltd | Information display optical system, optical element or optical system and information display device |
GB2360603A (en) | 2000-03-20 | 2001-09-26 | Cambridge 3D Display Ltd | Planar optical waveguide and float glass process |
AU2001260356B2 (en) | 2000-05-11 | 2005-11-10 | Clothing Plus Oy | Wearable projector and intelligent clothing |
ATE473464T1 (en) | 2000-06-05 | 2010-07-15 | Lumus Ltd | OPTICAL BEAM EXPANDER WITH SUBSTRATE LIGHT WAVE GUIDE |
US20040051392A1 (en) | 2000-09-22 | 2004-03-18 | Ziad Badarneh | Operating device |
JP4436445B2 (en) | 2000-11-17 | 2010-03-24 | キヤノン株式会社 | Inventory management system, inventory management method and program |
US6724012B2 (en) | 2000-12-14 | 2004-04-20 | Semiconductor Energy Laboratory Co., Ltd. | Display matrix with pixels having sensor and light emitting portions |
US7193758B2 (en) | 2001-02-06 | 2007-03-20 | Microvision, Inc. | Scanner and method for sweeping a beam across a target |
US7082393B2 (en) | 2001-03-27 | 2006-07-25 | Rast Associates, Llc | Head-worn, trimodal device to increase transcription accuracy in a voice recognition system and to process unvocalized speech |
US7061450B2 (en) | 2001-04-09 | 2006-06-13 | Microvision, Inc. | Electronically scanned beam display |
JP4772204B2 (en) | 2001-04-13 | 2011-09-14 | オリンパス株式会社 | Observation optical system |
US6529331B2 (en) | 2001-04-20 | 2003-03-04 | Johns Hopkins University | Head mounted display with full field of view and high resolution |
US20040125076A1 (en) | 2001-06-08 | 2004-07-01 | David Green | Method and apparatus for human interface with a computer |
US6666825B2 (en) | 2001-07-05 | 2003-12-23 | General Electric Company | Ultrasound transducer for improving resolution in imaging system |
JP2006504116A (en) | 2001-12-14 | 2006-02-02 | ディジタル・オプティクス・インターナショナル・コーポレイション | Uniform lighting system |
US6719693B2 (en) | 2002-03-29 | 2004-04-13 | Becs Technology, Inc. | Apparatus and system for real-time synthetic focus ultrasonic imaging |
KR100939017B1 (en) | 2002-05-17 | 2010-01-26 | 마이크로비젼, 인코퍼레이티드 | Apparatus and method for sweeping an image beam in one dimension and bidirectionally sweeping an image beam in a second dimension |
US6984208B2 (en) | 2002-08-01 | 2006-01-10 | The Hong Kong Polytechnic University | Method and apparatus for sensing body gesture, posture and movement |
US6747785B2 (en) | 2002-10-24 | 2004-06-08 | Hewlett-Packard Development Company, L.P. | MEMS-actuated color light modulator and methods |
US7071594B1 (en) | 2002-11-04 | 2006-07-04 | Microvision, Inc. | MEMS scanner with dual magnetic and capacitive drive |
US20040138935A1 (en) | 2003-01-09 | 2004-07-15 | Johnson Christopher D. | Visualizing business analysis results |
US7145722B2 (en) | 2003-04-24 | 2006-12-05 | Banpil Photonics, Inc. | Optical filter and method of manufacturing thereof |
CA2530987C (en) | 2003-07-03 | 2012-04-17 | Holotouch, Inc. | Holographic human-machine interfaces |
GB2403814A (en) | 2003-07-10 | 2005-01-12 | Ocuity Ltd | Directional display apparatus with birefringent lens structure |
GB2403815A (en) | 2003-07-10 | 2005-01-12 | Ocuity Ltd | Birefringent lens array structure |
US7106519B2 (en) | 2003-07-31 | 2006-09-12 | Lucent Technologies Inc. | Tunable micro-lens arrays |
US7209538B2 (en) | 2003-08-07 | 2007-04-24 | Xoran Technologies, Inc. | Intraoperative stereo imaging system |
TW200512298A (en) | 2003-09-24 | 2005-04-01 | Oncotherapy Science Inc | Method of diagnosing breast cancer |
US7232071B2 (en) | 2003-11-14 | 2007-06-19 | Microvision, Inc. | Scanned beam imager |
US6999238B2 (en) | 2003-12-01 | 2006-02-14 | Fujitsu Limited | Tunable micro-lens array |
GB0328904D0 (en) | 2003-12-12 | 2004-01-14 | Swan Thomas & Co Ltd | Scarab |
DE102005001417B4 (en) | 2004-01-29 | 2009-06-25 | Heidelberger Druckmaschinen Ag | Projection screen-dependent display / operating device |
US20050280879A1 (en) | 2004-02-09 | 2005-12-22 | Gibson Gregory T | Method and apparatus for scanning a beam of light |
US9652032B2 (en) | 2004-03-02 | 2017-05-16 | Brian T. Mitchell | Simulated training environments based upon fixated objects in specified regions |
US7724210B2 (en) | 2004-05-07 | 2010-05-25 | Microvision, Inc. | Scanned light display system using large numerical aperture light source, method of using same, and method of making scanning mirror assemblies |
US7486255B2 (en) | 2004-07-21 | 2009-02-03 | Microvision, Inc. | Scanned beam system and method using a plurality of display zones |
US7545571B2 (en) | 2004-09-08 | 2009-06-09 | Concurrent Technologies Corporation | Wearable display system |
US20060132383A1 (en) | 2004-09-27 | 2006-06-22 | Idc, Llc | System and method for illuminating interferometric modulator display |
US7619807B2 (en) | 2004-11-08 | 2009-11-17 | Angstrom, Inc. | Micromirror array lens with optical surface profiles |
US7747301B2 (en) | 2005-03-30 | 2010-06-29 | Skyline Biomedical, Inc. | Apparatus and method for non-invasive and minimally-invasive sensing of parameters relating to blood |
US7190508B2 (en) | 2005-06-15 | 2007-03-13 | Miradia Inc. | Method and structure of patterning landing pad structures for spatial light modulators |
US20070083120A1 (en) | 2005-09-22 | 2007-04-12 | Cain Charles A | Pulsed cavitational ultrasound therapy |
US8405618B2 (en) | 2006-03-24 | 2013-03-26 | Northwestern University | Haptic device with indirect haptic feedback |
US20070276658A1 (en) | 2006-05-23 | 2007-11-29 | Barry Grayson Douglass | Apparatus and Method for Detecting Speech Using Acoustic Signals Outside the Audible Frequency Range |
US7542210B2 (en) | 2006-06-29 | 2009-06-02 | Chirieleison Sr Anthony | Eye tracking head mounted display |
EP2503363A1 (en) | 2006-07-10 | 2012-09-26 | Sony Corporation | Lens array |
US9319741B2 (en) | 2006-09-07 | 2016-04-19 | Rateze Remote Mgmt Llc | Finding devices in an entertainment system |
US7804624B2 (en) | 2006-11-30 | 2010-09-28 | Honeywell International Inc. | Image capture device |
US7369321B1 (en) | 2007-01-16 | 2008-05-06 | University Of Central Florida Research Foundation, Inc. | Variable-focus liquid lens |
WO2008090000A1 (en) | 2007-01-25 | 2008-07-31 | Rodenstock Gmbh | Glasses and eyeglass lens for data reflection |
US7874666B2 (en) | 2007-03-26 | 2011-01-25 | University Of Washington Through Its Center For Commercialization | Smart sunglasses, helmet faceshields and goggles based on electrochromic polymers |
US7623560B2 (en) | 2007-09-27 | 2009-11-24 | Ostendo Technologies, Inc. | Quantum photonic imagers and methods of fabrication thereof |
US8031172B2 (en) | 2007-10-12 | 2011-10-04 | Immersion Corporation | Method and apparatus for wearable remote interface device |
US8672752B2 (en) | 2007-11-09 | 2014-03-18 | Wms Gaming, Inc. | Interface for wagering game environments |
KR20100072377A (en) | 2007-11-19 | 2010-06-30 | 노키아 코포레이션 | Input device |
US7791810B2 (en) | 2007-12-21 | 2010-09-07 | Microvision, Inc. | Scanned beam display having high uniformity and diminished coherent artifacts |
US7945338B2 (en) | 2008-03-03 | 2011-05-17 | Rockwell Automation Technologies, Inc. | Automation human machine interface having virtual graphic controls |
US8293354B2 (en) | 2008-04-09 | 2012-10-23 | The Regents Of The University Of Michigan | UV curable silsesquioxane resins for nanoprint lithography |
US8447704B2 (en) | 2008-06-26 | 2013-05-21 | Microsoft Corporation | Recognizing gestures from forearm EMG signals |
US7954953B2 (en) | 2008-07-30 | 2011-06-07 | Microvision, Inc. | Scanned beam overlay projection |
US20100053151A1 (en) | 2008-09-02 | 2010-03-04 | Samsung Electronics Co., Ltd | In-line mediation for manipulating three-dimensional content on a display device |
US8098265B2 (en) | 2008-10-10 | 2012-01-17 | Ostendo Technologies, Inc. | Hierarchical multicolor primaries temporal multiplexing system |
US8289162B2 (en) | 2008-12-22 | 2012-10-16 | Wimm Labs, Inc. | Gesture-based user interface for a wearable portable device |
US9569001B2 (en) | 2009-02-03 | 2017-02-14 | Massachusetts Institute Of Technology | Wearable gestural interface |
US8510244B2 (en) | 2009-03-20 | 2013-08-13 | ISC8 Inc. | Apparatus comprising artificial neuronal assembly |
US8107147B2 (en) | 2009-03-27 | 2012-01-31 | Microvision, Inc. | Two-mirror scanning system |
US20150138086A1 (en) | 2009-04-02 | 2015-05-21 | John S. Underkoffler | Calibrating control device for use with spatial operating system |
US9317128B2 (en) | 2009-04-02 | 2016-04-19 | Oblong Industries, Inc. | Remote devices used in a markerless installation of a spatial operating environment incorporating gestural control |
WO2010123934A1 (en) | 2009-04-20 | 2010-10-28 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Optical see-through free-form head-mounted display |
US10684489B2 (en) | 2009-06-23 | 2020-06-16 | Seereal Technologies S.A. | Light modulation device for a display for representing two- and/or three-dimensional image content |
JP4988016B2 (en) | 2009-08-27 | 2012-08-01 | 韓國電子通信研究院 | Finger motion detection apparatus and method |
US9981193B2 (en) | 2009-10-27 | 2018-05-29 | Harmonix Music Systems, Inc. | Movement based recognition and evaluation |
KR101647722B1 (en) | 2009-11-13 | 2016-08-23 | 엘지전자 주식회사 | Image Display Device and Operating Method for the Same |
WO2011106798A1 (en) | 2010-02-28 | 2011-09-01 | Osterhout Group, Inc. | Local advertising content on an interactive head-mounted eyepiece |
US8477425B2 (en) | 2010-02-28 | 2013-07-02 | Osterhout Group, Inc. | See-through near-eye display glasses including a partially reflective, partially transmitting optical element |
US8467133B2 (en) | 2010-02-28 | 2013-06-18 | Osterhout Group, Inc. | See-through display with an optical assembly including a wedge-shaped illumination system |
US8482859B2 (en) | 2010-02-28 | 2013-07-09 | Osterhout Group, Inc. | See-through near-eye display glasses wherein image light is transmitted to and reflected from an optically flat film |
US9097890B2 (en) | 2010-02-28 | 2015-08-04 | Microsoft Technology Licensing, Llc | Grating in a light transmissive illumination system for see-through near-eye display glasses |
WO2011113066A1 (en) | 2010-03-12 | 2011-09-15 | The University Of North Carolina At Greensboro | Methods and systems using integrated metabolomics and pharmacokinetics for multi-component drug evaluation |
CA2796451A1 (en) | 2010-04-16 | 2011-10-20 | Nicholas J. Mastandrea | Wearable motion sensing computing interface |
JP2011244425A (en) | 2010-04-23 | 2011-12-01 | Canon Inc | Electromechanical transducer and its manufacturing method |
WO2011134169A1 (en) | 2010-04-30 | 2011-11-03 | Beijing Institute Of Technology | Wide angle and high resolution tiled head-mounted display device |
US20110285666A1 (en) | 2010-05-21 | 2011-11-24 | Ivan Poupyrev | Electrovibration for touch surfaces |
CN105301786A (en) * | 2010-06-30 | 2016-02-03 | 松下知识产权经营株式会社 | Optical device |
US9916006B2 (en) | 2010-07-23 | 2018-03-13 | Telepatheye Inc. | Eye-wearable device user interface and method |
US8743145B1 (en) | 2010-08-26 | 2014-06-03 | Amazon Technologies, Inc. | Visual overlay for augmenting reality |
DE102010040962A1 (en) | 2010-09-17 | 2012-03-22 | Carl Zeiss Ag | Display device with an attachable to the head of a user holding device |
US8941559B2 (en) | 2010-09-21 | 2015-01-27 | Microsoft Corporation | Opacity filter for display device |
US20120075173A1 (en) | 2010-09-23 | 2012-03-29 | Nokia Corporation | Apparatus and method for user input |
US20120075196A1 (en) | 2010-09-23 | 2012-03-29 | Nokia Corporation | Apparatus and method for user input |
US9632315B2 (en) | 2010-10-21 | 2017-04-25 | Lockheed Martin Corporation | Head-mounted display apparatus employing one or more fresnel lenses |
US9529191B2 (en) | 2010-11-03 | 2016-12-27 | Trex Enterprises Corporation | Dynamic foveal vision display |
KR101670927B1 (en) | 2010-11-05 | 2016-11-01 | 삼성전자주식회사 | Display apparatus and method |
JP5659768B2 (en) | 2010-12-16 | 2015-01-28 | 凸版印刷株式会社 | Oblique electric field liquid crystal display device |
US8994718B2 (en) | 2010-12-21 | 2015-03-31 | Microsoft Technology Licensing, Llc | Skeletal control of three-dimensional virtual world |
WO2012093662A1 (en) | 2011-01-06 | 2012-07-12 | 株式会社日立メディコ | Ultrasonic probe |
RU2480941C2 (en) | 2011-01-20 | 2013-04-27 | Корпорация "Самсунг Электроникс Ко., Лтд" | Method of adaptive frame prediction for multiview video sequence coding |
US20120236201A1 (en) | 2011-01-27 | 2012-09-20 | In The Telling, Inc. | Digital asset management, authoring, and presentation techniques |
US20120195461A1 (en) | 2011-01-31 | 2012-08-02 | Qualcomm Incorporated | Correlating areas on the physical object to areas on the phone screen |
US20130169536A1 (en) | 2011-02-17 | 2013-07-04 | Orcam Technologies Ltd. | Control of a wearable device |
CN103460256B (en) | 2011-03-29 | 2016-09-14 | 高通股份有限公司 | In Augmented Reality system, virtual image is anchored to real world surface |
US10061387B2 (en) | 2011-03-31 | 2018-08-28 | Nokia Technologies Oy | Method and apparatus for providing user interfaces |
WO2012139241A1 (en) | 2011-04-11 | 2012-10-18 | Intel Corporation | Hand gesture recognition system |
US20120290943A1 (en) | 2011-05-10 | 2012-11-15 | Nokia Corporation | Method and apparatus for distributively managing content between multiple users |
US8912017B2 (en) | 2011-05-10 | 2014-12-16 | Ostendo Technologies, Inc. | Semiconductor wafer bonding incorporating electrical and optical interconnects |
US8508830B1 (en) | 2011-05-13 | 2013-08-13 | Google Inc. | Quantum dot near-to-eye display |
US8619049B2 (en) | 2011-05-17 | 2013-12-31 | Microsoft Corporation | Monitoring interactions between two or more objects within an environment |
US20120299962A1 (en) | 2011-05-27 | 2012-11-29 | Nokia Corporation | Method and apparatus for collaborative augmented reality displays |
KR101423536B1 (en) | 2011-06-14 | 2014-08-01 | 한국전자통신연구원 | System for constructiing mixed reality using print medium and method therefor |
US9218058B2 (en) | 2011-06-16 | 2015-12-22 | Daniel Bress | Wearable digital input device for multipoint free space data collection and analysis |
US20120326948A1 (en) | 2011-06-22 | 2012-12-27 | Microsoft Corporation | Environmental-light filter for see-through head-mounted display device |
PT105814A (en) | 2011-07-14 | 2013-01-14 | Yd Ynvisible S A | METHOD FOR THE PRODUCTION OF ELECTROCHROMIC PARTICLES AND CONTROL OF THEIR NIR AND VIS SPECIAL PROPERTIES |
US8471967B2 (en) | 2011-07-15 | 2013-06-25 | Google Inc. | Eyepiece for near-to-eye display with multi-reflectors |
US8508851B2 (en) | 2011-07-20 | 2013-08-13 | Google Inc. | Compact see-through display system |
US9931230B2 (en) | 2011-08-01 | 2018-04-03 | George Mason University | Artificial body part control system using ultrasonic imaging |
US9274595B2 (en) | 2011-08-26 | 2016-03-01 | Reincloud Corporation | Coherent presentation of multiple reality and interaction models |
CA2750287C (en) | 2011-08-29 | 2012-07-03 | Microsoft Corporation | Gaze detection in a see-through, near-eye, mixed reality display |
US9672049B2 (en) | 2011-09-22 | 2017-06-06 | Qualcomm Incorporated | Dynamic and configurable user interface |
US20130083303A1 (en) | 2011-10-04 | 2013-04-04 | Palo Alto Research Center Incorporated | Multi-Level Imaging Using Single-Pass Imaging System Having Spatial Light Modulator and Anamorphic Projection Optics |
US8773599B2 (en) | 2011-10-24 | 2014-07-08 | Google Inc. | Near-to-eye display with diffraction grating that bends and focuses light |
US8279716B1 (en) | 2011-10-26 | 2012-10-02 | Google Inc. | Smart-watch including flip up display |
US8553910B1 (en) | 2011-11-17 | 2013-10-08 | Jianchun Dong | Wearable computing device with behind-ear bone-conduction speaker |
US8928969B2 (en) | 2011-12-06 | 2015-01-06 | Ostendo Technologies, Inc. | Spatio-optical directional light modulator |
WO2013106731A1 (en) | 2012-01-11 | 2013-07-18 | Howard Hughes Medical Institute | Multi-dimensional imaging using multi-focus microscopy |
US8894484B2 (en) | 2012-01-30 | 2014-11-25 | Microsoft Corporation | Multiplayer game invitation system |
US20130225999A1 (en) | 2012-02-29 | 2013-08-29 | Toshiba Medical Systems Corporation | Gesture commands user interface for ultrasound imaging systems |
US20130286053A1 (en) | 2012-04-25 | 2013-10-31 | Rod G. Fleck | Direct view augmented reality eyeglass-type display |
WO2013177111A1 (en) | 2012-05-21 | 2013-11-28 | Medplate Lifesciences Corporation | Collapsible, shape memory alloy structures and methods for forming same |
US9179126B2 (en) | 2012-06-01 | 2015-11-03 | Ostendo Technologies, Inc. | Spatio-temporal light field cameras |
US9430055B2 (en) | 2012-06-15 | 2016-08-30 | Microsoft Technology Licensing, Llc | Depth of field control for see-thru display |
CN103546181A (en) | 2012-07-17 | 2014-01-29 | 高寿谦 | Wearable wireless intelligent electronic device with detachable and free combination functions |
US8754829B2 (en) | 2012-08-04 | 2014-06-17 | Paul Lapstun | Scanning light field camera and display |
US20140049417A1 (en) | 2012-08-20 | 2014-02-20 | Playtabase, LLC | Wireless motion activated command transfer device, system, and method |
US20140085177A1 (en) | 2012-09-21 | 2014-03-27 | Nokia Corporation | Method and apparatus for responding to input based upon relative finger position |
US10620902B2 (en) | 2012-09-28 | 2020-04-14 | Nokia Technologies Oy | Method and apparatus for providing an indication regarding content presented to another user |
US9921687B2 (en) | 2012-10-02 | 2018-03-20 | Autodesk, Inc. | Always-available input through finger instrumentation |
US10234941B2 (en) | 2012-10-04 | 2019-03-19 | Microsoft Technology Licensing, Llc | Wearable sensor for tracking articulated body-parts |
KR20140052640A (en) | 2012-10-25 | 2014-05-07 | 삼성전자주식회사 | Method for displaying a cursor on a display and system performing the same |
JP6155448B2 (en) | 2012-11-01 | 2017-07-05 | アイカム エルエルシー | Wireless wrist computing and controlling device and method for 3D imaging, mapping, networking and interfacing |
US10185416B2 (en) | 2012-11-20 | 2019-01-22 | Samsung Electronics Co., Ltd. | User gesture input to wearable electronic device involving movement of device |
US9274608B2 (en) | 2012-12-13 | 2016-03-01 | Eyesight Mobile Technologies Ltd. | Systems and methods for triggering actions based on touch-free gesture detection |
US9345609B2 (en) | 2013-01-11 | 2016-05-24 | Elwha Llc | Position sensing active torso support |
JP6010450B2 (en) | 2012-12-20 | 2016-10-19 | 浜松ホトニクス株式会社 | Light observation apparatus and light observation method |
US20140176417A1 (en) | 2012-12-21 | 2014-06-26 | Ian A. Young | Wearable projector for portable display |
US8947783B2 (en) | 2013-01-02 | 2015-02-03 | Google Inc. | Optical combiner for near-eye display |
JP6197295B2 (en) | 2013-01-22 | 2017-09-20 | セイコーエプソン株式会社 | Optical device and image display apparatus |
US10386970B2 (en) | 2013-02-08 | 2019-08-20 | Apple Inc. | Force determination based on capacitive sensing |
WO2014127126A1 (en) | 2013-02-14 | 2014-08-21 | New York University | Handphone |
US9223139B2 (en) | 2013-02-15 | 2015-12-29 | Google Inc. | Cascading optics in optical combiners of head mounted displays |
US20140301662A1 (en) | 2013-03-17 | 2014-10-09 | ISC8 Inc. | Analysis, Labeling and Exploitation of Sensor Data in Real Time |
US20140304646A1 (en) | 2013-04-04 | 2014-10-09 | Klip, Inc. | Sliding side menu gui with menu items displaying indicia of updated content |
US9405124B2 (en) | 2013-04-09 | 2016-08-02 | Massachusetts Institute Of Technology | Methods and apparatus for light field projection |
KR102116551B1 (en) | 2013-05-10 | 2020-05-28 | 한국전자통신연구원 | System for stereoscopic display |
DE202014010839U1 (en) | 2013-05-15 | 2016-11-16 | Google Inc. | Efficient retrieval of map data during animation |
US8725842B1 (en) | 2013-07-11 | 2014-05-13 | Khalid Al-Nasser | Smart watch |
US20140129207A1 (en) | 2013-07-19 | 2014-05-08 | Apex Technology Ventures, LLC | Augmented Reality Language Translation |
CN103424803B (en) | 2013-08-16 | 2015-07-29 | 上海理工大学 | Optical waveguide device system |
US9451162B2 (en) | 2013-08-21 | 2016-09-20 | Jaunt Inc. | Camera array including camera modules |
US9164290B2 (en) | 2013-11-06 | 2015-10-20 | Microsoft Corporation | Grating configurations for a tiled waveguide display |
CN103558918B (en) | 2013-11-15 | 2016-07-27 | 上海威璞电子科技有限公司 | The method realizing Gesture Recognition in intelligent watch |
CN106104408B (en) | 2013-11-29 | 2021-07-27 | 行动股份有限公司 | Wearable computing device |
US9244539B2 (en) | 2014-01-07 | 2016-01-26 | Microsoft Technology Licensing, Llc | Target positioning with gaze tracking |
US9651784B2 (en) | 2014-01-21 | 2017-05-16 | Osterhout Group, Inc. | See-through computer display systems |
US10554962B2 (en) | 2014-02-07 | 2020-02-04 | Samsung Electronics Co., Ltd. | Multi-layer high transparency display for light field generation |
JP2015184560A (en) | 2014-03-25 | 2015-10-22 | ソニー株式会社 | Light guide device, image display device, and display device |
US20150323998A1 (en) | 2014-05-06 | 2015-11-12 | Qualcomm Incorporated | Enhanced user interface for a wearable electronic device |
US9595138B2 (en) * | 2014-05-29 | 2017-03-14 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Augmented reality display device |
CN104468578B (en) | 2014-12-10 | 2017-12-26 | 怀效宁 | The priority traffic system and the means of communication of a kind of wireless telecommunications |
US10234952B2 (en) | 2014-07-18 | 2019-03-19 | Maxim Integrated Products, Inc. | Wearable device for using human body as input mechanism |
US9335602B2 (en) | 2014-07-25 | 2016-05-10 | Tintable Kibing Co., Ltd. | Method for control of electrochromic device |
CN104460992B (en) | 2014-11-20 | 2017-07-21 | 大连理工大学 | The finger motion detection means and method of a kind of use infrared radiation carpal ligaments |
CN104597602A (en) * | 2015-01-24 | 2015-05-06 | 上海理湃光晶技术有限公司 | Efficiently coupled tooth embedded slab guide optical element in compact structure |
US11166698B2 (en) | 2015-01-30 | 2021-11-09 | Canon Medical Systems Corporation | Ultrasonic diagnostic apparatus |
US10295990B2 (en) | 2015-05-18 | 2019-05-21 | Milwaukee Electric Tool Corporation | User interface for tool configuration and data capture |
US10080950B2 (en) | 2015-09-05 | 2018-09-25 | Aspire Sports Inc. | System of communication in a wearable device |
US9898869B2 (en) | 2015-09-09 | 2018-02-20 | Microsoft Technology Licensing, Llc | Tactile interaction in virtual environments |
US11609427B2 (en) | 2015-10-16 | 2023-03-21 | Ostendo Technologies, Inc. | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
US10578882B2 (en) | 2015-12-28 | 2020-03-03 | Ostendo Technologies, Inc. | Non-telecentric emissive micro-pixel array light modulators and methods of fabrication thereof |
US10054503B2 (en) | 2016-03-11 | 2018-08-21 | Microsoft Technology Licensing, Llc | Force sensor |
- 2016
  - 2016-10-14 US US15/294,447 patent/US11609427B2/en active Active
  - 2016-10-17 KR KR1020187013560A patent/KR20180070626A/en not_active Application Discontinuation
  - 2016-10-17 TW TW105133472A patent/TWI767891B/en not_active IP Right Cessation
  - 2016-10-17 WO PCT/US2016/057418 patent/WO2017066802A1/en active Application Filing
  - 2016-10-17 CN CN201680073919.0A patent/CN108369339B/en active Active
  - 2016-10-17 EP EP16856441.7A patent/EP3362838A4/en not_active Withdrawn
  - 2016-10-17 JP JP2018519398A patent/JP7198663B2/en active Active
- 2019
  - 2019-02-01 HK HK19101840.9A patent/HK1259436A1/en unknown
Cited By (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10451876B2 (en) | 2015-08-03 | 2019-10-22 | Facebook Technologies, Llc | Enhanced visual perception through distance-based ocular projection |
US10552676B2 (en) | 2015-08-03 | 2020-02-04 | Facebook Technologies, Llc | Methods and devices for eye tracking based on depth sensing |
US10338451B2 (en) | 2015-08-03 | 2019-07-02 | Facebook Technologies, Llc | Devices and methods for removing zeroth order leakage in beam steering devices |
US20170038836A1 (en) * | 2015-08-03 | 2017-02-09 | Oculus Vr, Llc | Display with an Embedded Eye Tracker |
US10437061B2 (en) | 2015-08-03 | 2019-10-08 | Facebook Technologies, Llc | Near-ocular display based on hologram projection |
US10534173B2 (en) | 2015-08-03 | 2020-01-14 | Facebook Technologies, Llc | Display with a tunable mask for augmented reality |
US10459305B2 (en) | 2015-08-03 | 2019-10-29 | Facebook Technologies, Llc | Time-domain adjustment of phase retardation in a liquid crystal grating for a color display |
US10297180B2 (en) | 2015-08-03 | 2019-05-21 | Facebook Technologies, Llc | Compensation of chromatic dispersion in a tunable beam steering device for improved display |
US10274730B2 (en) * | 2015-08-03 | 2019-04-30 | Facebook Technologies, Llc | Display with an embedded eye tracker |
US10345599B2 (en) | 2015-08-03 | 2019-07-09 | Facebook Technologies, Llc | Tile array for near-ocular display |
US10162182B2 (en) | 2015-08-03 | 2018-12-25 | Facebook Technologies, Llc | Enhanced pixel resolution through non-uniform ocular projection |
US10359629B2 (en) | 2015-08-03 | 2019-07-23 | Facebook Technologies, Llc | Ocular projection based on pupil position |
US20170111723A1 (en) * | 2015-10-20 | 2017-04-20 | Bragi GmbH | Personal Area Network Devices System and Method |
US10342428B2 (en) | 2015-10-20 | 2019-07-09 | Bragi GmbH | Monitoring pulse transmissions using radar |
US10247858B2 (en) | 2015-10-25 | 2019-04-02 | Facebook Technologies, Llc | Liquid crystal half-wave plate lens |
US10705262B2 (en) | 2015-10-25 | 2020-07-07 | Facebook Technologies, Llc | Liquid crystal half-wave plate lens |
US10416454B2 (en) | 2015-10-25 | 2019-09-17 | Facebook Technologies, Llc | Combination prism array for focusing light |
US10670929B2 (en) | 2015-12-21 | 2020-06-02 | Facebook Technologies, Llc | Enhanced spatial resolution using a segmented electrode array |
US10203566B2 (en) | 2015-12-21 | 2019-02-12 | Facebook Technologies, Llc | Enhanced spatial resolution using a segmented electrode array |
US10670928B2 (en) | 2015-12-21 | 2020-06-02 | Facebook Technologies, Llc | Wide angle beam steering for virtual reality and augmented reality |
US11500208B2 (en) | 2016-01-07 | 2022-11-15 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
US10466480B2 (en) * | 2016-01-07 | 2019-11-05 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
US20170212351A1 (en) * | 2016-01-07 | 2017-07-27 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
US10890773B2 (en) | 2016-01-07 | 2021-01-12 | Magic Leap, Inc. | Virtual and augmented reality systems and methods having unequal numbers of component color images distributed across depth planes |
US11247607B1 (en) * | 2016-03-16 | 2022-02-15 | Deepstone LLC | Extended perception system |
US9842433B2 (en) * | 2016-04-15 | 2017-12-12 | Superd Co. Ltd. | Method, apparatus, and smart wearable device for fusing augmented reality and virtual reality |
US10104464B2 (en) * | 2016-08-25 | 2018-10-16 | Bragi GmbH | Wireless earpiece and smart glasses system and method |
US20190196209A1 (en) * | 2016-10-31 | 2019-06-27 | Boe Technology Group Co., Ltd. | Display Panel and Display Apparatus |
US10642061B2 (en) * | 2016-10-31 | 2020-05-05 | Boe Technology Group Co., Ltd. | Display panel and display apparatus |
US20190107723A1 (en) * | 2016-12-20 | 2019-04-11 | Facebook Technologies, Llc | Waveguide display with a small form factor, a large field of view, and a large eyebox |
US10585287B2 (en) * | 2016-12-20 | 2020-03-10 | Facebook Technologies, Llc | Waveguide display with a small form factor, a large field of view, and a large eyebox |
US11493761B2 (en) | 2017-05-17 | 2022-11-08 | Vuzix Corporation | Fixed focus image light guide with zoned diffraction gratings |
JP7190447B2 (en) | 2017-05-17 | 2022-12-15 | ビュージックス コーポレーション | Fixed focus imaging light guide with zoned grating |
JP2020519960A (en) * | 2017-05-17 | 2020-07-02 | ビュージックス コーポレーションVuzix Corporation | Fixed-focus image light guide with zoned grating |
US11454815B2 (en) | 2017-06-01 | 2022-09-27 | NewSight Reality, Inc. | Transparent optical module using pixel patches and associated lenslets |
CN109085711A (en) * | 2017-06-13 | 2018-12-25 | 深圳市光场视觉有限公司 | A kind of vision conversion equipment of adjustable light transmittance |
US10338400B2 (en) | 2017-07-03 | 2019-07-02 | Holovisions LLC | Augmented reality eyewear with VAPE or wear technology |
US10859834B2 (en) | 2017-07-03 | 2020-12-08 | Holovisions | Space-efficient optical structures for wide field-of-view augmented reality (AR) eyewear |
CN110583016A (en) * | 2017-07-13 | 2019-12-17 | 谷歌有限责任公司 | non-planar computing display |
WO2019013865A1 (en) * | 2017-07-13 | 2019-01-17 | Google Llc | Non-planar computational displays |
US10659771B2 (en) * | 2017-07-13 | 2020-05-19 | Google Llc | Non-planar computational displays |
US11165938B2 (en) | 2017-07-27 | 2021-11-02 | Command Sight, Inc. | Animal-wearable first person view system |
WO2019023040A1 (en) * | 2017-07-27 | 2019-01-31 | Command Sight, Inc. | Animal wearable head mountable display system |
US10609902B2 (en) | 2017-07-27 | 2020-04-07 | Command Sight, Inc. | Animal wearable head mountable display system |
US10630873B2 (en) | 2017-07-27 | 2020-04-21 | Command Sight, Inc. | Animal-wearable first person view system |
US11659751B2 (en) | 2017-10-03 | 2023-05-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for electronic displays |
US10930709B2 (en) | 2017-10-03 | 2021-02-23 | Lockheed Martin Corporation | Stacked transparent pixel structures for image sensors |
CN107889074A (en) * | 2017-10-20 | 2018-04-06 | 深圳市眼界科技有限公司 | Dodgem data processing method, apparatus and system for VR |
CN110573996A (en) * | 2017-10-31 | 2019-12-13 | 谷歌有限责任公司 | Multi-view eye tracking for VR/AR systems |
WO2019089094A1 (en) * | 2017-10-31 | 2019-05-09 | Google Llc | Multi-perspective eye-tracking for vr/ar systems |
US10998386B2 (en) | 2017-11-09 | 2021-05-04 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
US10510812B2 (en) | 2017-11-09 | 2019-12-17 | Lockheed Martin Corporation | Display-integrated infrared emitter and sensor structures |
US10838250B2 (en) | 2018-02-07 | 2020-11-17 | Lockheed Martin Corporation | Display assemblies with electronically emulated transparency |
US10129984B1 (en) | 2018-02-07 | 2018-11-13 | Lockheed Martin Corporation | Three-dimensional electronics distribution by geodesic faceting |
US11616941B2 (en) | 2018-02-07 | 2023-03-28 | Lockheed Martin Corporation | Direct camera-to-display system |
US10652529B2 (en) | 2018-02-07 | 2020-05-12 | Lockheed Martin Corporation | In-layer Signal processing |
US10979699B2 (en) | 2018-02-07 | 2021-04-13 | Lockheed Martin Corporation | Plenoptic cellular imaging system |
US10594951B2 (en) | 2018-02-07 | 2020-03-17 | Lockheed Martin Corporation | Distributed multi-aperture camera array |
US11146781B2 (en) | 2018-02-07 | 2021-10-12 | Lockheed Martin Corporation | In-layer signal processing |
US10690910B2 (en) | 2018-02-07 | 2020-06-23 | Lockheed Martin Corporation | Plenoptic cellular vision correction |
US10951883B2 (en) | 2018-02-07 | 2021-03-16 | Lockheed Martin Corporation | Distributed multi-screen array for high density display |
US20190324276A1 (en) * | 2018-04-19 | 2019-10-24 | Magic Leap, Inc. | Systems and methods for operating a display system based on user perceptibility |
US11210832B2 (en) | 2018-04-24 | 2021-12-28 | Hewlett-Packard Development Company, L.P. | Animated gazes on head mounted displays |
DE102018209377A1 (en) | 2018-06-12 | 2019-12-12 | Volkswagen Aktiengesellschaft | A method of presenting AR / VR content on a mobile terminal and mobile terminal presenting AR / VR content |
US11209650B1 (en) | 2018-09-06 | 2021-12-28 | Facebook Technologies, Llc | Waveguide based display with multiple coupling elements for artificial reality |
US10481321B1 (en) | 2018-09-06 | 2019-11-19 | Facebook Technologies, Llc | Canted augmented reality display for improved ergonomics |
US10627565B1 (en) | 2018-09-06 | 2020-04-21 | Facebook Technologies, Llc | Waveguide-based display for artificial reality |
US10852823B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | User-specific eye tracking calibration for near-eye-display (NED) devices |
US10855979B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | Interpreting eye gaze direction as user input to near-eye-display (NED) devices for enabling hands free positioning of virtual items |
US10996746B2 (en) | 2018-10-23 | 2021-05-04 | Microsoft Technology Licensing, Llc | Real-time computational solutions to a three-dimensional eye tracking framework |
US10718942B2 (en) | 2018-10-23 | 2020-07-21 | Microsoft Technology Licensing, Llc | Eye tracking systems and methods for near-eye-display (NED) devices |
US10838490B2 (en) | 2018-10-23 | 2020-11-17 | Microsoft Technology Licensing, Llc | Translating combinations of user gaze direction and predetermined facial gestures into user input instructions for near-eye-display (NED) devices |
WO2020094479A1 (en) * | 2018-11-07 | 2020-05-14 | Robert Bosch Gmbh | Spectacle lens for data glasses, data glasses, and method for operating a spectacle lens or data glasses |
WO2020117459A1 (en) * | 2018-12-03 | 2020-06-11 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
US10866413B2 (en) | 2018-12-03 | 2020-12-15 | Lockheed Martin Corporation | Eccentric incident luminance pupil tracking |
US11209681B2 (en) * | 2018-12-03 | 2021-12-28 | Disney Enterprises, Inc. | Virtual reality and/or augmented reality viewer having variable transparency |
US11448918B2 (en) | 2019-01-30 | 2022-09-20 | Samsung Electronics Co., Ltd. | Grating device, screen including the grating device, method of manufacturing the screen and display apparatus for augmented reality and/or virtual reality including the screen |
US10942320B2 (en) * | 2019-02-11 | 2021-03-09 | Facebook Technologies, Llc | Dispersion compensation for light coupling through slanted facet of optical waveguide |
US20200257065A1 (en) * | 2019-02-11 | 2020-08-13 | Facebook Technologies, Llc | Dispersion compensation for light coupling through slanted facet of optical waveguide |
US11686943B2 (en) | 2019-03-05 | 2023-06-27 | Samsung Display Co., Ltd. | Display device |
US10698201B1 (en) | 2019-04-02 | 2020-06-30 | Lockheed Martin Corporation | Plenoptic cellular axis redirection |
US20220146822A1 (en) * | 2019-08-15 | 2022-05-12 | Ostendo Technologies, Inc. | Wearable Display Systems and Design Methods Thereof |
WO2021061448A1 (en) * | 2019-09-23 | 2021-04-01 | Akalana Management Llc | Optical systems with switchable lenses for mitigating variations in ambient brightness |
US11526013B2 (en) | 2019-12-30 | 2022-12-13 | Acer Incorporated | Wearable display device |
US11852830B2 (en) | 2020-06-18 | 2023-12-26 | Samsung Electronics Co., Ltd. | Augmented reality glass and operating method therefor |
US11553171B2 (en) * | 2020-09-04 | 2023-01-10 | Samsung Display Co., Ltd. | Light field display device and method of processing image of the same |
US20220078398A1 (en) * | 2020-09-04 | 2022-03-10 | Samsung Display Co., Ltd. | Light field display device and method of processing image of the same |
CN112365861A (en) * | 2020-10-26 | 2021-02-12 | 深圳Tcl新技术有限公司 | Display image adjusting method, electronic device and computer readable storage medium |
WO2022177334A1 (en) * | 2021-02-18 | 2022-08-25 | 삼성전자 주식회사 | Wearable electronic device |
US20220342219A1 (en) * | 2021-04-26 | 2022-10-27 | Meta Platforms Technologies, Llc | Apparatus, system, and method for disposing photonic integrated circuits on surfaces |
US20230204958A1 (en) * | 2021-12-28 | 2023-06-29 | David Fliszar | Eyewear electronic tinting lens with integrated waveguide |
US20230213762A1 (en) * | 2021-12-31 | 2023-07-06 | Beijing Ned+Ar Display Technology Co., Ltd. | Ultra-thin lens, virtual image display device using same, and near-eye display |
US11966058B2 (en) * | 2021-12-31 | 2024-04-23 | Beijing Ned+Ar Display Technology Co., Ltd. | Ultra-thin lens, virtual image display device using same, and near-eye display |
EP4343406A1 (en) * | 2022-09-20 | 2024-03-27 | Rockwell Collins, Inc. | Method for creating uniform contrast on a headworn display against high dynamic range scene |
Also Published As
Publication number | Publication date |
---|---|
EP3362838A4 (en) | 2019-12-04 |
WO2017066802A1 (en) | 2017-04-20 |
KR20180070626A (en) | 2018-06-26 |
TWI767891B (en) | 2022-06-21 |
JP7198663B2 (en) | 2023-01-04 |
US11609427B2 (en) | 2023-03-21 |
JP2018533765A (en) | 2018-11-15 |
CN108369339A (en) | 2018-08-03 |
HK1259436A1 (en) | 2019-11-29 |
EP3362838A1 (en) | 2018-08-22 |
CN108369339B (en) | 2022-08-26 |
TW201728961A (en) | 2017-08-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11609427B2 (en) | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays | |
US10867451B2 (en) | Apparatus, systems, and methods for display devices including local dimming | |
US11551602B2 (en) | Non-uniform resolution, large field-of-view headworn display | |
CN109407835B (en) | Virtual and augmented reality systems and methods | |
CN104932102B (en) | Display device and optical device | |
US9720231B2 (en) | Display, imaging system and controller for eyewear display device | |
US8619005B2 (en) | Switchable head-mounted display transition | |
CN103999445B (en) | Head-mounted display | |
US9124865B2 (en) | Display apparatus and method of adjusting 3D image therein | |
US20130107340A1 (en) | Autostereoscopic Steering Light-Guide Three-Dimensional Displays | |
JPWO2017066802A5 (en) | | |
CN106489177A (en) | There is the near-to-eye of self-luminous micro display engine | |
US20200209609A1 (en) | Adaptive resolution for multi-view display system and method thereof | |
US10674141B1 (en) | Apparatuses, systems, and methods for determining interpupillary distances of head-mounted displays | |
US10775617B2 (en) | Eye tracked lens for increased screen resolution | |
US20230088014A1 (en) | Optical module having active array of elements capable of synchronizing with see-through display | |
US11436987B1 (en) | Adaptive backlight activation for low-persistence liquid crystal displays | |
KR102139746B1 (en) | Transparent display apparatus and method thereof | |
US11768376B1 (en) | Head-mounted display system with display and adjustable optical components | |
US20220343606A1 (en) | Optical module with active microlens array capable of synchronizing with see-through display to provide multiple modes of functionality | |
WO2022179312A1 (en) | Optical display system and electronics apparatus | |
US20240155098A1 (en) | Multiview image capture system and method | |
US20230324686A1 (en) | Adaptive control of optical transmission | |
TW202315250A (en) | Vcsel chip for generation of linear structured light patterns and flood illumination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: OSTENDO TECHNOLOGIES, INC., CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:EL-GHOROURY, HUSSEIN S.;CHUANG, CHIH-LI;AGOSTINELLI, BIAGIO;SIGNING DATES FROM 20170103 TO 20170109;REEL/FRAME:041010/0578 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |
| STCC | Information on status: application revival | Free format text: WITHDRAWN ABANDONMENT, AWAITING EXAMINER ACTION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |